Publications by apl. Prof. Dr.-Ing. habil. Matthias Becker

2023

  • To which Extent are Simulation Research Papers related to the Real World? – A Survey on the Use of Validation Methods
    Anne Vonderheide and Matthias Becker
    Proceedings of the 21st annual Industrial Simulation Conference - ISC '23
    Validation is the methodology at the end of the simulation and modelling cycle that relates the insights gained by computer calculations to the real-world scenario that is the subject of study. Validation establishes credibility concerning the simulation results and their implications with respect to the real system. Validation is a very important aspect in real-world simulation studies, since important financial and security-related decisions concerning the planning, design and optimization of the real systems may rely on it. In our study, we conducted a survey of research papers in renowned simulation conferences and evaluated whether validation had been used at all, and if so, which validation methods had been used and to what extent. This reveals the relevance and trustworthiness of the simulation results of the research papers concerning the real-world problem addressed in that research. We found that a majority of research papers neglect the application of validation methodology. Obviously, the validity of the results of simulation studies or their impact on practical problems in the real world seems questionable in a vast number of cases.
  • A Study of Insect Management Models in Agriculture
    Matthias Becker and Kin-Woon Yeow
    Proceedings of the 21st annual Industrial Simulation Conference - ISC '23
    In the study of controlling the insect population within the field of agriculture, many simulation models of herbivorous insects have been proposed. These proposed models contain various factors that determine the population development and growth of the herbivorous insects, such as temperature, light intensity, etc. Due to these factors, intensive studies have been carried out to discover the model that best suits the population development of each specific species of herbivorous insect.
  • Training Agents for Unknown Logistics Problems
    Elisa Schmidt and Matthias Becker
    Proceedings of the Companion Conference on Genetic and Evolutionary Computation - GECCO '23 Companion
    A methodology on how to prepare agents to succeed on a priori unknown logistics problems is presented. The agents can be trained only on a small number of test problems taken from a broad class of generalized logistics problems. The developed agents are then evaluated on unknown instances of the problem class. This work has been developed in the context of last year's AbstractSwarm Multi-Agent Logistics Competition. The most successful algorithms are presented, and additionally, all participating algorithms are discussed with respect to the features that contribute to their success. As a result, we conclude that such a broad variety of a priori unknown logistics problems can be solved efficiently if multiple different well-working approaches are combined, instead of trying to find one optimal algorithm. On the test problems used, this method can undercut trivial as well as non-trivial implementations, for example algorithms based on machine learning.
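    The portfolio idea summarized above can be illustrated with a minimal sketch in Python. The Strategy interface, the cost-returning solve() callables and the selection rule are illustrative assumptions, not the AbstractSwarm competition API or the competitors' actual code.

      # Hypothetical portfolio selection: run several candidate strategies on a new,
      # a priori unknown problem instance and keep the best result.
      from dataclasses import dataclass
      from typing import Callable, List, Tuple

      @dataclass
      class Strategy:
          name: str
          solve: Callable[[object], float]   # returns a cost; lower is better

      def solve_with_portfolio(problem: object, portfolio: List[Strategy]) -> Tuple[str, float]:
          """Try every strategy on the given problem and return the name and cost of the best."""
          results = [(s.name, s.solve(problem)) for s in portfolio]
          return min(results, key=lambda r: r[1])

      # Usage (with hypothetical solvers):
      #   portfolio = [Strategy("greedy", greedy_solve), Strategy("random-restart", rr_solve)]
      #   best_name, best_cost = solve_with_portfolio(new_instance, portfolio)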

2017

  • Indoor Positioning Solely Based on User's Sight
    Matthias Becker
    Information Science and Applications (ICISA) 2017, Lecture Notes in Electrical Engineering (LNEE) Series - ICISA '17
    Determination of the absolute geographical position has become everyday routine using the Global Positioning System (GPS), despite the prior existence of maps. However, no equally universal solution has been developed for determining one's location inside a building, an equally relevant problem for which GPS cannot be used. Existing solutions usually involve additional infrastructure on the side of the location provider, such as beacon installations or particular configurations of wireless access points. These solutions are generally facilitated by additional native mobile applications on the client device, which connect to this infrastructure. We are aware of such solutions, but believe these to be lacking in simplicity. Our approach to indoor positioning eliminates the need for additional hardware on the provider's side and for software installation by the user. We propose to determine the user's position inside a building using only a photo of the corridor visible to the user, uploaded to a local positioning server accessible via a browser, which classifies the photo with a neural network approach. Our results prove the feasibility of our approach. One floor of the university's building, with partially very similar corridors, has been learned by a deep convolutional neural network. A person lost in the building simply accesses the positioning server's website and uploads a photo of their current line of sight. The server responds by generating and displaying a map of the building with the user's current position and direction.
  • Implementing Real-Life Indoor Positioning Systems Using Machine Learning Approaches
    Matthias Becker and Bharat Ahuja
    IEEE 8th International Conference on Information, Intelligence, Systems, Applications - IISA '17
    Position determination in an indoor environment has become a widely discussed problem, due to the growing complexity of building layouts and the lack of natural heuristics for orientation compared to the outdoor case. Additionally, there is no universal standard for indoor positioning comparable to GPS, which cannot be used for this purpose. Locating oneself in a building serves an increasingly vital function, especially in time-critical scenarios such as airports. The problem has been studied thoroughly using expensive hardware, with different technologies achieving a precision within a few meters. Nevertheless, these methods have remained in the academic realm for the most part, largely due to the high cost and labour of such hardware installations and of constructing software to interpret the measurements. The goal of this paper is to use existing wireless LAN access points in a building and user-provided smartphones to create a cost-effective positioning system, omitting the labour and cost of altering building infrastructure while simplifying the construction of classifiers for real-life use cases. An alternative approach using image recognition techniques is presented for a purely web-based solution.
  • Estimating performance of large scale distributed simulation built on homogeneous hardware
    Desheng Fu, Matthias Becker, Marcus O'Connor and Helena Szczerbicka
    2017 IEEE/ACM 21st International Symposium on Distributed Simulation and Real Time Applications - DS-RT '17
    Large scale distributed simulation should be well planned before execution, since applying unnecessary hardware only wastes time and money. On the other hand, enough hardware is needed to achieve acceptable performance. It is therefore advisable to estimate the performance of a large scale distributed simulation before execution. Such an estimation also improves the efficiency of the applied hardware in many cases, due to optimization of the simulation algorithm and of the partition of the model. In this paper, we show our approaches to estimating the performance, especially the duration of execution, of a large scale distributed simulation system built on a large set of homogeneous hardware, using a small set of hardware of the same type. Our basic idea is to simulate a distributed simulation in a sequential way for a short time, taking into account all the costs and benefits of the distribution. The results of our case study show that our approaches are able to provide meaningful estimations.
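    The estimation idea of the DS-RT '17 paper above can be illustrated with a rough Python sketch: replay a short event trace sequentially, charge each logical process for its events and for cross-process messages, and take the most loaded process as the bottleneck. The trace format, the cost constants and the linear extrapolation are illustrative assumptions, not the paper's actual cost model.

      from collections import defaultdict

      def estimate_duration(trace, partition, t_event=1e-5, t_msg=5e-5, scale=1.0):
          """
          trace:     iterable of (src_entity, dst_entity) pairs from a short sequential run
          partition: dict mapping model entity -> logical process id
          t_event:   assumed cost of processing one event on the target hardware [s]
          t_msg:     assumed cost of one inter-process message [s]
          scale:     factor extrapolating the short trace to the full run length
          """
          busy = defaultdict(float)              # accumulated load per logical process
          for src, dst in trace:
              lp_src, lp_dst = partition[src], partition[dst]
              busy[lp_dst] += t_event            # the receiving process executes the event
              if lp_src != lp_dst:
                  busy[lp_src] += t_msg          # the sender pays for the remote message
                  busy[lp_dst] += t_msg          # the receiver pays for unpacking it
          # The most heavily loaded process dominates the wall-clock time.
          return scale * max(busy.values(), default=0.0)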

2016

  • Planning in Dynamic, Distributed and Non-automatized Production Systems
    Matthias Becker, Michael Lütjen and Helena Szczerbicka
    Information Science and Applications (ICISA) 2016
    We present a framework for the realization of a decentralized decision support system for deployment in production scenarios with a low level of automation, such as multi-site construction. Furthermore, we present insights concerning the improvement of the production process when applying online simulation-based optimization methods in such scenarios. Our solution shows how to realize a central control and decision support station with special focus on easy connection to the possibly multiple construction sites and on high usability for non-experts. Our framework is able to connect to a large number of modeling, simulation and analysis tools. For the sake of usability, it turned out that only dialogue-based communication with the end user is applicable in those scenarios, where often only simple devices are present.
  • Improving the performance of distributed discrete event simulation by exchange of conditional look-ahead
    Desheng Fu, Matthias Becker and Helena Szczerbicka
    Concurrency and Computation: Practice and Experience, Wiley 2016
    Distributed discrete event simulation is an important approach for enabling the modeling and analysis of the behavior of large systems. This approach presents a major problem, namely, the possible low performance due to the excessive overhead in synchronizing the distributed logical processes. To counter this, our approach to distributed discrete event simulation involves conservative synchronization and its acceleration using dynamic estimation of process-to-process look-ahead with a feedback mechanism. This mechanism allows for the estimation of a larger look-ahead, which may be invalidated and recalculated during the course of the simulation, if one of the processes obtains more detailed knowledge. In this work, we extend the dynamically estimated look-ahead, on the basis of the local state of the logical processes, by exchanging conditional look-aheads, in conjunction with the broadcast of invalidation announcements. A notable reduction in runtime in various cases is thus achieved, especially when the estimated look-ahead is stochastically too conservative.
  • Optimization of Tire Noise by Solving an Integer Linear Program (ILP)
    Matthias Becker, Nicolas Ginoux, Sebastien Martin and Zsuzsanna Roka
    IEEE International Conference on Systems Man and Cybernetics - SMC '16
    One important aim in the tire industry when finalizing a tire design is the modeling of the noise characteristics as perceived by the passengers of the car. In previous works, the problem was studied using heuristic algorithms to minimize the noise by searching for a sequence under constraints imposed by the tire industry. We present a new technique to compute the noise. We also propose an integer linear program based on that technique in order to solve this problem and find an optimal sequence. Our study shows that the integer linear programming approach yields significantly improved tire designs; however, it has to be improved further to meet the calculation-time restrictions for real-world problem sizes.
  • Basic Algorithms for Bee Hive Monitoring and Laser-based Mite Control
    Larissa Chazette, Matthias Becker and Helena Szczerbicka
    IEEE Symposium Series on Computational Intelligence - IEEE SSCI '16
    The work in progress described in this paper aims to implement a beehive monitoring system that monitors essential parameters of a bee hive (such as temperature, sound and weight) and additionally includes an image recognition algorithm to observe the degree of infestation with Varroa mites. Mites should be detected at the entrance, and statistics about the degree of infestation should be made available via a web interface. As the ultimate approach to fighting mites without chemicals, the coordinates of the mites are to be detected and a laser will be used to kill them. This work describes approaches relevant to all steps of the aforementioned procedure; however, it is still work in progress and the components still have to be integrated into one system that is deployable in practice.
  • Analysing the Cost-Efficiency of the Multi-agent Flood Algorithm in Search and Rescue Scenarios
    Florian Blatt, Matthias Becker and Helena Szczerbicka
    German Conference on Multiagent System Technologies
    A multi-agent algorithm works by using at least two agents to create a synergistic effect, resulting in an emergence of new possibilities which are not explicitly programmed into the individual agents. To achieve this synergistic effect, the algorithm has to provide the possibility to communicate and consequently allow cooperation between the agents. Considering the use of multi-agent algorithms in search and rescue scenarios, the targeted effect of the emergence is either a more effective search and rescue process or, at least, an optimized rescue process. This paper examines the number of agents needed for the Multi-Agent Flood algorithm to yield the most beneficial ratio between the number of agents used and the time it takes to complete the search process. Our studies show that adding more robots may not be cost-efficient for the search and rescue process. This in turn allows for better planning and coordination of robotic search teams, as the number of needed agents can be anticipated and the transport logistics of the robots can be optimized.
  • On the quality of graphs generated by swarm algorithms
    Matthias Becker
    2016 IEEE Congress on Evolutionary Computation - CEC '16
    Swarm algorithms are often used for the generation of graphs. The generated graphs are mostly planar, inexpensive and fault-tolerant. In this work we evaluate the graphs produced by a swarm algorithm with respect to the properties and quality of the resulting graphs and to the runtime requirements. The algorithm under consideration is essentially a simulation of the foraging of the slime mold Physarum polycephalum. Special emphasis is placed on the properties of the algorithm, since many other works merely report the existence of a graph solution without reporting the quality of the graph or the runtime requirements of the algorithm. We compare the quality of the resulting graphs and the runtime of the algorithm to classical algorithms from graph theory. Our results show that the slime mold algorithm has some interesting features; however, it is not the best means of constructing graphs over large sets of nodes, in terms of either the efficiency of the algorithm or the quality of the outcome.
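    A minimal Python sketch of the kind of comparison performed in the CEC '16 paper above, assuming the graphs are available as weighted networkx graphs; the metric set (total edge cost, average path length, edge connectivity as a simple fault-tolerance indicator) and the minimum-spanning-tree baseline are assumptions chosen for illustration, not the paper's exact evaluation.

      import networkx as nx

      def graph_quality(G: nx.Graph) -> dict:
          """Quality indicators for a connected, edge-weighted graph."""
          return {
              "total_cost": G.size(weight="weight"),        # sum of edge weights
              "avg_path": nx.average_shortest_path_length(G, weight="weight"),
              "edge_conn": nx.edge_connectivity(G),         # edges whose removal disconnects G
          }

      def compare_to_mst(G: nx.Graph) -> dict:
          """Compare a generated graph with the minimum spanning tree over the same edges."""
          mst = nx.minimum_spanning_tree(G, weight="weight")
          return {"generated": graph_quality(G), "mst": graph_quality(mst)}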

2013

  • A Multi-agent Flooding Algorithm for Search and Rescue Operations in Unknown Terrain
    Matthias Becker, Florian Blatt and Helena Szczerbicka
    Multiagent System Technologies
    In this paper we introduce a new multi-agent algorithm for use in search and rescue scenarios for the exploration of unknown terrain. This method combines the exploration concept of the flood algorithm with the path-optimizing features of the ant algorithm. The first part leads to a fast exploration of the unknown terrain, while the second part constructs short paths from points of interest back to the base. Together, this enables rescue operations to start in parallel to the ongoing search. We demonstrate the feasibility of our approach by agent-based simulations. The simulations show that our approach is comparable in speed and quality with existing algorithms, delivering the additional benefit of short paths to points of interest and adhering to the inherent limitations of this kind of scenario.
  • Online simulation based decision support system for resource failure management in multi-site production environments
    Sebastian Bohlmann, Matthias Becker, Sinan Balci, Helena Szczerbicka and Eric Hund
    2013 IEEE 18th Conference on Emerging Technologies & Factory Automation (ETFA)
    Planning in a multi-site, non-mass production environment is a special challenge because of several sources of uncertainty. Unlike in mass production facilities, in our setting the current state is not easily and exactly known when re-planning becomes necessary. The planning procedure has to account for that fact, as well as for further uncertainties concerning the effects of a plan when evaluating it. Thus, in this work we apply online simulation as a means of re-planning multi-site production in the case of resource failure. This work is a first step in which two alternatives are considered when a resource fails: either wait for repair of the resource, or transport another instance of this resource from another site, if more than one is available. Our study shows that planning using online simulation is superior to a static strategy such as 'always wait for repair' or 'always import resource' in case of resource failure.
  • On the potential of semi-conservative look-ahead estimation in approximative distributed discrete event simulation
    Desheng Fu, Matthias Becker and Helena Szczerbicka
    Proceedings of the 2013 Summer Computer Simulation Conference
    One major problem of distributed discrete event simulation is its poor performance due to the huge overhead of maintaining causal order, so that the execution time cannot be reduced significantly compared to sequential simulation. This holds especially when the processes are tightly coupled and the look-ahead is very short. On the other hand, the results of many simulations are obtained from a number of independent outputs, which are of stochastic nature, and a small deviation in a limited number of outputs is acceptable. Accepting such deviations in a controlled way could enable a trade-off between simulation accuracy and execution time. The goal of our investigation is to develop a methodology to handle this trade-off. In this paper, we propose a new way of distributed simulation with semi-conservative look-ahead estimation, where we accept causality errors to a certain and limited extent. In our approach, we consider a semi-conservative estimation allowing limited over-estimation. If the look-ahead is over-estimated, unresolved causality errors are repaired by a very efficient recovery procedure at the expense of simulation errors. Results from a case study demonstrate that our approach is able to maximize the look-ahead with respect to predefined error bounds and can reduce the execution time of many simulations. We do, however, also point out the limitations of the mechanism and the direction of our further investigation.
  • Applicability of bio-inspired and graph-theoretic algorithms for the design of complex fault-tolerant graphs
    Matthias Becker, Florian Schmidt and Helena Szczerbicka
    2013 IEEE International Conference on Systems, Man, and Cybernetics
    Fault-tolerant networks are needed for many applications, such as telecommunication networks, electricity networks, traffic, routing, and others. Several methods for constructing fault-tolerant networks out of a given set of nodes exist, among them classical graph-theoretic ones, and recently several bio-inspired algorithms have also been proposed for this application. In this paper we study the performance of these different algorithms for that problem, where performance means that both the complexity of the algorithm for a given problem size and the quality of the generated networks are taken into account. We conclude that classical algorithms belonging to a certain complexity class are efficient for small to medium-size problems, while at some point, for larger problems, bio-inspired solutions become the more efficient way to obtain a solution.

2012

  • Comparison of Bio-Inspired and Graph-Theoretic Algorithms for Design of Fault-Tolerant Networks
    Matthias Becker, Waraphan Sarasureeporn and Helena Szczerbicka
    ICAS 2012, The Eighth International Conference on Autonomic and Autonomous Systems
    Recently, several approaches have been presented that exploit the ability of Physarum polycephalum to connect several food sources via a network of pipes in order to maintain an efficient food distribution inside the organism. These approaches use mechanisms found in nature to solve a technical problem, namely the construction of fault-tolerant and efficient connection networks. These works comprise experiments with the real slime mold Physarum polycephalum as well as computer simulations based on a tubular model and an agent-based approach. In this work, we study the suitability of those bio-inspired approaches and compare their performance to a graph-theoretic algorithm for the construction of fault-tolerant connection networks, the (k, t)-spanner algorithm. The graph-theoretic algorithm is able to construct graphs with a certain degree of fault tolerance as well as meet a given maximal path length between two arbitrary nodes. However, the definition of fault tolerance in previous bio-inspired works differs from that used in graph theory. Thus, in our contribution we analyze the bio-inspired approaches as well as the graph-theoretic approach with respect to their efficiency in designing optimal fault-tolerant graphs. We demonstrate the usability of the graph-theoretic approach despite its reliance on a different definition of fault tolerance. We conclude that classical efficient computational algorithms from graph theory can be adapted and applied in the same field as the bio-inspired approaches for the problem of constructing efficient fault-tolerant networks. They often provide an easier-to-use and more direct solution than bio-inspired approaches, which need more parameter tuning before yielding satisfactory results.
  • Agent-based Approaches for Exploration and Pathfinding in Unknown Environments
    Matthias Becker, Florian Blatt and Helena Szczerbicka
    17th IEEE International Conference on Emerging Technologies and Factory Automation
    This work evaluates and improves agent-based approaches for the exploration of unknown terrain and for finding (shortest) paths to a goal with an unknown location. This scenario is typically found in emergency incidents and in technically assisted search and rescue cases, e.g. when a building or a block of a city has been damaged by an earthquake and, as a result, former maps are no longer valid. In our approach we simulate the exploration of the unknown environment by agents that have different capabilities typically found in the technical assistance systems used in search and rescue operations, such as cameras, sensors, and limited communication abilities. The outcome of our simulation experiments indicates which composition of a population of agents is best suited for efficient exploration of the unknown terrain, in terms of speed and path length. The results give hints as to which agent populations should be tried in a real-world setting, in a search and rescue test scenario with real robots and sensors.
  • Orthogonal cut algorithm for value-based event localization in sensor networks
    Desheng Fu, Matthias Becker, Sven Schaust and Helena Szczerbicka
    2012 IEEE 9th International Conference on Mobile Ad-Hoc and Sensor Systems (MASS 2012)
    We investigate the capability of ad-hoc sensor networks equipped with simple sensor devices to enable more accurate spatial event localization in order to support smart camera networks. We present a novel algorithm named Orthogonal Cut which is suitable for many different event detection scenarios, especially when large interferences occur. It estimates the spatial position of a detected event by dividing the surveillance space of a sensor network into smaller areas until a threshold criterion is met.
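    A very rough Python sketch of the recursive space-splitting idea: bisect the surveillance area along alternating axes and keep the half whose sensors report the stronger aggregate reading, until the region is small enough. The keep-the-stronger-half rule and the aggregation by summation are illustrative assumptions; the split criterion of the published Orthogonal Cut algorithm is more elaborate.

      def localize(sensors, readings, region, min_size=1.0):
          """
          sensors:  list of (x, y) sensor positions
          readings: measured intensities, aligned with `sensors`
          region:   (xmin, ymin, xmax, ymax) surveillance area
          returns:  centre of the final small region as the event position estimate
          """
          xmin, ymin, xmax, ymax = region
          axis = 0                                   # 0: split along x, 1: split along y
          while max(xmax - xmin, ymax - ymin) > min_size:
              if axis == 0:
                  mid = (xmin + xmax) / 2
                  lo = sum(r for (x, y), r in zip(sensors, readings)
                           if xmin <= x < mid and ymin <= y <= ymax)
                  hi = sum(r for (x, y), r in zip(sensors, readings)
                           if mid <= x <= xmax and ymin <= y <= ymax)
                  if lo >= hi: xmax = mid            # keep the half with the stronger reading
                  else:        xmin = mid
              else:
                  mid = (ymin + ymax) / 2
                  lo = sum(r for (x, y), r in zip(sensors, readings)
                           if ymin <= y < mid and xmin <= x <= xmax)
                  hi = sum(r for (x, y), r in zip(sensors, readings)
                           if mid <= y <= ymax and xmin <= x <= xmax)
                  if lo >= hi: ymax = mid
                  else:        ymin = mid
              axis = 1 - axis
          return ((xmin + xmax) / 2, (ymin + ymax) / 2)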

2010

  • A simulation study of mechanisms of group selection of the slime mold Dictyostelium discoideum
    Matthias Becker
    2010 IEEE 14th International Conference on Intelligent Engineering Systems
    Slime molds are fascinating organisms: they can either live as an organism consisting of a single cell (protozoa) or they can form a multi-cellular organism (pseudoplasmodium). From the biological point of view, slime molds are therefore studied in order to understand the evolutionary step from a single-cell organism to a multi-cellular organism. Studies have shown that the behavior of cooperating single-cell organisms exhibits synergistic emergent intelligence, for example finding shortest paths. Recently, simulation and experiments with a real slime mold (Physarum polycephalum) have been used for traveling-salesman-like problems. In this work we present a simulation model for the slime mold Dictyostelium discoideum. In contrast to other studies, here the whole life cycle is modeled and simulated. This model is used to study the mechanism of cooperation of single cells: we compare the mechanism of altruistic group selection against individual and egoistic selection. It turns out that simple signaling mechanisms are sufficient for the complex behavior of Dictyostelium discoideum that allows altruistic behavior at the cell level, helping the whole population to survive.
  • A Data Management Framework Providing Online-Connectivity in Symbiotic Simulation
    Sebastian Bohlmann, Volkhard Klinger, Helena Szczerbicka and Matthias Becker
    24th European Conference on Modelling and Simulation (ECMS), Simulation Meets Global Challenges, Kuala Lumpur, Malaysia
    Symbiotic simulation in industrial applications requires efficient connectivity between industrial processes and embedded legacy simulators. One main challenge is to handle heterogeneous system parameters like bandwidth, latency, redundancy, security and data representation. Moreover, data management needs to be improved in terms of unified access, providing an interface to online, historical and corresponding simulation data. This paper proposes a framework for symbiotic simulation addressing the problem of connectivity. We introduce the Process Data Streaming Protocol (PDSP) for managing distributed process data flows. Additionally, we present PDSP-based modules covering different modes of operation for data processing. The framework interacts with distributed control systems via OLE for Process Control (OPC) as well as with embedded systems using different hierarchic operation modes. Historical data can be provided transparently through an integrated stream database. The framework is primarily optimized for use in Java-based simulation environments, but is not limited to these. Finally, we demonstrate the usability of the system while interacting with a simulation environment for a hybrid process and present some experimental results.
  • Simulation Model For The Whole Life Cycle Of The Slime Mold Dictyostelium Discoideum.
    Matthias Becker
    Proceedings of the European conference on modeling and simulation (ECMS)
    Slime molds are fascinating organisms: they can either live as an organism consisting of a single cell or they can form a multi-cellular organism. From the biological point of view, slime molds are therefore studied in order to understand the evolutionary step from a single-cell organism to a multi-cellular organism. Studies have shown that the behavior of cooperating single-cell organisms exhibits synergistic emergent intelligence, for example finding shortest paths. Recently, simulation and experiments with a real slime mold (Physarum polycephalum) have been used for traveling-salesman-like problems. In this work we present a simulation model for the slime mold Dictyostelium discoideum. In contrast to other studies, here the whole life cycle is modeled and simulated. Very detailed behavioral patterns and parameters are modeled, and as a result a simulation model is obtained that shows behavior very close to that of the living slime mold. This result is consolidated by extensive verification experiments. As a consequence, this model can be used to further study the mechanisms of cooperation of single cells, of synergy and of emergence, and it additionally offers the possibility to develop more slime-mold-inspired algorithms.
  • An Optimization Algorithm Similar to the Search of Food of the Slime Mold Dictyostelium Discoideum
    Matthias Becker and Malte Wegener
    IRAST International Congress on Computer Applications and Computational Science
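    Since the abstract of the last entry is not reproduced above, the following is only a generic, heavily simplified Python sketch of how a Dictyostelium-style food search can be read as an optimizer: agents sample a fitness landscape ("food concentration"), move towards the best value sensed in their neighbourhood, and wander randomly when nothing better is found. All rules and parameters are assumptions, not the published algorithm.

      import random

      def dicty_search(f, bounds, n_agents=30, steps=200, sense=0.5, wander=0.2):
          """Maximize f(x, y) over bounds = ((xmin, xmax), (ymin, ymax))."""
          (xmin, xmax), (ymin, ymax) = bounds
          agents = [(random.uniform(xmin, xmax), random.uniform(ymin, ymax))
                    for _ in range(n_agents)]
          best = max(agents, key=lambda p: f(*p))
          for _ in range(steps):
              moved = []
              for (x, y) in agents:
                  # sample a few points within the sensing radius and move to the best one
                  candidates = [(x, y)] + [(x + random.uniform(-sense, sense),
                                            y + random.uniform(-sense, sense)) for _ in range(4)]
                  bx, by = max(candidates, key=lambda p: f(*p))
                  if (bx, by) == (x, y):             # nothing better sensed: wander randomly
                      bx, by = x + random.uniform(-wander, wander), y + random.uniform(-wander, wander)
                  bx = min(max(bx, xmin), xmax)      # stay inside the search area
                  by = min(max(by, ymin), ymax)
                  moved.append((bx, by))
              agents = moved
              best = max([best] + agents, key=lambda p: f(*p))
          return best, f(*best)

      # Usage: dicty_search(lambda x, y: -(x**2 + y**2), ((-5, 5), (-5, 5)))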

2009

  • Performance Efficiency Measuring and Prediction of Wafer Fabrication Operation with a Combined Clustering and Neural Network Approach
    Matthias Becker, Helena Szczerbicka and Mei-Chen Lo
    Asia Pacific Industrial Engineering and Management Systems Conference
    Most performance assessment of semiconductor manufacturers is based on self-appraisal or subjective judgments. The need to measure fabrication (fab) operation performance along its various dimensions has led to the development of a large number of quantitative performance indicators, but an overall scheme to measure the performance of fab operation involving multiple inputs and multiple effects (outputs) has not yet been well established. In this study, we approach performance assessment and prediction by combining clustering approaches with Artificial Neural Networks (ANN). We use historical data from a major Taiwanese semiconductor player which comprise input/investment data (such as headcount, salary, cost for machines, running the fab, ...) as well as the output of each fab (such as margin, wafer output rate, step moves, number of patents). The data span several years, during which some of the fabs have been ramped up. In the first phase of our approach, we studied several clustering algorithms (K-Means, X-means, Kernel K-Means, SIB, and EM) on the data. We found several clusterings that were meaningful according to human experts. One of the clustering approaches clearly separated an older fab from newer fabs, and was also able to distinguish fabs in the ramping-up phase from fabs in the stable operation phase. Other approaches formed clusters according to the grade of performance (bad, medium-bad, medium-good, good) of the data sets. Second, we use the classification to let a neural network learn the status of a fab, so that for a new fab the status can be judged by the neural net. In a third step, we let a neural network learn the relationship between the multiple inputs and outputs. As a result, we found a neural net structure that is able to predict the effect of changes in the inputs of a fab on the different output factors. By this we enable the fab management to obtain a prediction of which effect a planned measure (e.g. an increase or decrease of headcount) has on the output of the fab, that is, on the performance figures.
  • Tread profile optimization for tires with multiple pitch tracks
    Matthias Becker, Sebastian Jaschke and Helena Szczerbicka
    Proceedings of the IEEE 13th international conference on Intelligent Engineering Systems
    Reduction of noise is a growing subject of interest in the automotive industry, especially in tire manufacturing. After construction of the basic tire design, that is, the design of the material and of the basic building blocks called pitches, the last step in the noise engineering of a tire is the determination of its pitch sequence. In this step the different types of pitches are put together with regard to several constraints. Since there is a combinatorially large number of valid pitch sequences, the goal is to find a valid pitch sequence with optimal noise characteristics. Due to the complexity of the problem, the globally optimal pitch sequence cannot be found by exhaustive search, and intelligent algorithms, such as heuristic optimization algorithms, have to be used in order to find at least a locally optimal pitch sequence. Several successful approaches to this problem can be found in the literature for tires consisting of just one pitch sequence. In this work, tires consisting of multiple pitch sequences (several tracks) are considered. We show how we can use algorithms for single-track optimization and how we can best combine them for finding noise-optimal tire designs for multiple-pitch-track tires.
  • On Classification Approaches for Misbehavior Detection in Wireless Sensor Networks
    Matthias Becker, Martin Drozda, Sven Schaust, Sebastian Bohlmann and Helena Szczerbicka
    Journal of Computers
    Adding security mechanisms to computer and communication systems without degrading their performance is a difficult task. This holds especially for wireless sensor networks, which due to their design are especially vulnerable to intrusion or attack. It is therefore important to find security mechanisms which deal with the limited resources of such systems in terms of energy consumption, computational capabilities and memory requirements. In this document we discuss and evaluate several learning algorithms according to their suitability for intrusion and attack detection. The learning algorithms subject to evaluation include bio-inspired approaches such as Artificial Immune Systems and Neural Networks, and classical ones such as Decision Trees, the Bayes classifier, Support Vector Machines, k-Nearest Neighbors and others. We conclude that, in our setup, the more simplistic approaches such as Decision Trees or the Bayes classifier offer reasonable performance. The performance was, however, found to depend significantly on the feature representation.
  • Quality control of a light metal die casting process using artificial neural networks
    Matthias Becker
    2009 IEEE International Conference on Computational Cybernetics (ICCC)
    In this work we present an approach that uses a neural net for online control of the cooling process in the light metal die casting industry. Normally the die casting process is controlled manually or semi-manually, and quality control is done well after the cooling process. In our approach we increase the product quality during the production process by monitoring the cooling process with an infrared camera and heating or cooling different parts of the mold. The control is done using a neural net which has been trained with data from previous casting processes, where the quality has been judged by experts. We conclude that this approach is a feasible way to monitor and increase product quality online in die casting.
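    A minimal Python sketch of the control idea, assuming the thermal-camera frames have already been reduced to a feature vector (e.g. mean temperatures of mold zones) and that past castings were labelled ok/defect by experts. scikit-learn's MLPClassifier stands in for the neural net used in the paper, and the cool-the-hottest-zones rule is a made-up placeholder for the real actuation logic.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      def train_quality_model(zone_temps: np.ndarray, labels: np.ndarray) -> MLPClassifier:
          """zone_temps: (n_castings, n_zones) mean temperatures; labels: 0 = ok, 1 = defect."""
          model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
          model.fit(zone_temps, labels)
          return model

      def cooling_adjustment(model: MLPClassifier, current_temps: np.ndarray, threshold=0.5):
          """Return the zones to cool more aggressively if a defect is predicted."""
          p_defect = model.predict_proba(current_temps.reshape(1, -1))[0, 1]
          if p_defect < threshold:
              return []                                    # predicted ok, keep current settings
          hottest = np.argsort(current_temps)[::-1][:2]    # naive rule: cool the two hottest zones
          return hottest.tolist()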

2007

  • Discrete Event Systems - Petri Net-Based Modeling and Simulation in Theory and Practice
    Matthias Becker
    Eurosim
    The theory of modeling formalisms for Discrete Event Systems has a long history and is well developed; many algorithms for modeling and for efficient analysis of the modeled systems exist. However, in many practical applications or commercial software the theory is not used. The reasons are manifold. The question is whether the theoretical concepts are not suited for practical applications, or whether the problem lies in the proper transfer to practice. Other problems lie in the sometimes missing flexibility of theoretical models and, to some extent, in the lack of good software that would enable the practical use of such models. In this work we review Petri-net-based methodologies with regard to their applicability in practice, and try to understand why many aspects of the theory of modeling and simulation do not find their way into practice. We identify crucial factors such as support for complex models and hierarchic modeling capabilities. These factors not only concern the modeling methodology, but also need to be implemented in a software tool. The availability of software supporting a modeling concept is another important factor. The software should also have an adequate and appealing graphical representation because, in the end, the practitioners have to be convinced to 'buy' the theoretical concept, and that will only be the case if decision makers can easily recognize 'their' system. Furthermore, we survey a number of papers about applications of Petri nets to find out to what extent these applications are practical ones, i.e. whether the applications are of academic nature, proof of concept, toy size, or inside a productive environment.
  • Performance of routing protocols for real wireless sensor networks
    Matthias Becker, Sven Schaust and Eugen Wittmann
    Proceedings of the 10th International Symposium on Performance Evaluation of Computer and Telecommunication Systems - SPECTS '07
    The main task of a Wireless Sensor Network (WSN) is to collect data and either send it to a base station immediately or store it locally until the data is requested by a base station. WSNs form a wireless network without specific infrastructure; thus, efficient routing protocols are necessary to let a data packet find its way from a specific sensor node through the network to the base station. Since WSNs are a quite new technology, in a first step existing routing protocols from other types of wireless networks have been employed in WSNs. However, these protocols are not well suited for WSNs, since the characteristics of the technology and the applications of other wireless networks may be quite different from those of WSNs. As a consequence, the adopted routing protocols often perform badly in the context of WSNs. In this work we study the usability of several routing protocols in a real-world environmental monitoring task and show how the performance of wireless routing protocols can be improved significantly if carefully adapted for use in WSNs. Finally, the performance of the different available routing protocols is measured and compared through actual deployment of the WSN using Cricket motes, which have been designed by U.C. Berkeley.
  • Generating Interactive 3-D Models for Discrete-Event Modeling Formalisms
    Matthias Becker
    International Conference on Cyberworlds - CW '07
    In this paper an automatic transformation of arbitrary manufacturing models (modeled as queuing nets or stochastic Petri nets) to an interactive 3D visualization and animation (realized in a game engine) is presented. The motivation behind this is to make the very useful but rather boring mathematical models formulated as queuing nets or Petri nets easily accessible for users and decision makers who are not interested in the details of the mathematical modeling formalism. Queuing networks and Petri nets have a long tradition and are a well-accepted means for the modeling, simulation and analysis of discrete event systems. Although these formalisms have a graphical notation that is intuitive for experts to use, they lack a good presentation layer, which is needed for acceptance in industry, for commercial purposes, or for academic non-experts.
  • Neural Networks and Optimization Algorithms Applied for Construction of Low Noise Tread Profiles
    Matthias Becker, Helena Szczerbicka and Michael Thomas
    Cybernetics and Systems: An International Journal
    In this article we evaluate and compare diverse methodologies for designing low-noise tread profiles. Finding a low-noise tread profile under given constraints can be described as a search in a space which is typically of the order of a 50- to 70-dimensional vector space. A complete search for the optimal tread profile is not possible even with today's computers. Thus, in this work we compare the feasibility of three classes of algorithms for tread profile construction. First, we discuss approaches to speeding up the generation and analysis of tread profiles. Second, we use two algorithms for the iterative construction of large tread profiles out of several smaller tread profiles known to be of good quality; one of these algorithms is based on Neural Networks. Third, we evaluate heuristic optimization algorithms such as Genetic Algorithms and Simulated Annealing. Finally, we compare the suitability and efficiency of our approaches.
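    A toy simulated-annealing sketch in Python for the pitch-sequencing part of the problem described above. The noise objective used here (the largest harmonic of the pitch-length sequence's spectrum) is only a stand-in for the real acoustic model, the swap move ignores the industrial constraints, and all parameters are assumptions.

      import cmath, math, random

      def noise_proxy(seq):
          """Peak magnitude of the DFT of the pitch-length sequence (excluding the DC term)."""
          n = len(seq)
          return max(abs(sum(seq[k] * cmath.exp(-2j * math.pi * f * k / n) for k in range(n)))
                     for f in range(1, n // 2 + 1))

      def anneal_pitch_sequence(seq, steps=5000, t0=1.0, cooling=0.999):
          best = cur = list(seq)
          best_e = cur_e = noise_proxy(cur)
          t = t0
          for _ in range(steps):
              i, j = random.sample(range(len(cur)), 2)     # neighbour move: swap two pitches
              cand = list(cur)
              cand[i], cand[j] = cand[j], cand[i]
              e = noise_proxy(cand)
              if e < cur_e or random.random() < math.exp((cur_e - e) / t):
                  cur, cur_e = cand, e
                  if e < best_e:
                      best, best_e = cand, e
              t *= cooling
          return best, best_e

      # Usage: anneal_pitch_sequence([30, 32, 35] * 20)   # 60 pitches of three pitch types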

2003

  • Modeling and simulation of a complete semiconductor manufacturing facility using Petri nets
    Matthias Becker
    Proceedings of the IEEE Conference on Emerging Technologies and Factory Automation - ETFA '03
    Most studies employing Petri nets in semiconductor manufacturing model only one specific area (e.g. etching) in detail and model the rest of the manufacturing process abstractly, e.g. by its input/output behavior. In our study, we show the feasibility of using Petri nets for modeling the complete production process. We use the first set of test data provided by the MASM-LAB, Arizona State University; it is the process of a two-product system making non-volatile memory chips. For modeling, we use our own tool PSim, which is based on a combined queuing and Petri net formalism. The integration of queues makes the modeling of parts waiting in front of a machine quite concise and intuitive. PSim offers a hierarchical and modular modeling approach, which is especially suitable for large and complex systems. We use the modular approach by defining the structure of a machine once as a Petri net and then instantiating as many machines as needed. Then we model the operators, resources and the movement of parts between the machines as specified in the production plan. As a result, we can state that Petri nets are feasible for modeling a complete semiconductor manufacturing process.
  • Planning the Reconstruction of a Shiplift by Simulation of a Stochastic Petri Net Model
    Matthias Becker and Thomas Bessey
    European Simulation Symposium
    In this case study, two alternatives for reconstruction of an existing shiplift are evaluated. At the moment, the shiplift consists of two long chambers. One chamber is to be rebuilt. Instead of rebuilding it in its original length, a shorter and cheaper chamber could also be built.
  • A Study of Control via On-Line Simulation Using Stochastic Petri Nets
    Matthias Becker, Thomas Bessey and Helena Szczerbicka
    European Simulation Symposium (ESS)
    Complex systems such as flexible manufacturing systems and traffic systems typically evolve with alternating periods of transient and nearly steady-state behavior; such systems often show suboptimal performance. Thus, it is desirable to optimize the system's performance on-line by adjusting the system's parameters properly before a performance drop occurs. To this end, the system's future evolution is repeatedly assessed in advance by means of on-line simulation. However, several problems accompany this approach, in particular the demand for real-time decisions, that have not yet been sufficiently solved. Aiming at studying the dynamics of on-line control as well as its impact on the system's operation, we built a stochastic Petri net model that simulates the on-line control of a simple open queueing network as it would be performed by means of on-line simulation. The system under control is easy to study since it has known properties and can be considered part of a manufacturing system: jobs arriving at the system have to be dispatched to one of two machines, each providing a queue for jobs waiting to be processed. The processing times of the machines are deterministic or stochastic, while the jobs' arrival times are stochastic. With on-line simulation, the system's future performance is assessed by virtually dispatching a new job to either of the machines, based on the system's current state; the results are compared and thus lead to the real decision concerning to which machine the new job should be dispatched in order to minimize the work in progress.
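    A minimal Python sketch of the dispatch-by-look-ahead idea described above: before a new job is routed, both options ("send to machine 1" / "send to machine 2") are evaluated on the current state and the option with the smaller predicted backlog wins. The deterministic backlog formula is an illustrative simplification of the on-line simulation used in the paper.

      def predicted_backlog(queues, service_times, machine, job_time):
          """Work the machine would still have to finish if the new job joined its queue."""
          return len(queues[machine]) * service_times[machine] + job_time

      def dispatch(queues, service_times, job_time):
          """queues: one list of waiting jobs per machine; returns the index of the chosen machine."""
          options = [predicted_backlog(queues, service_times, m, job_time)
                     for m in range(len(queues))]
          chosen = min(range(len(options)), key=options.__getitem__)
          queues[chosen].append(job_time)          # the real dispatch follows the virtual evaluation
          return chosen

      # Usage: dispatch([[1.0, 1.0], [1.0]], service_times=[1.0, 1.2], job_time=1.0)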

1999

  • PNiQ: Integration of queuing networks in generalised stochastic Petri nets
    Matthias Becker and Helena Szczerbicka
    IEE Proceedings-Software
    Generalised stochastic Petri nets (GSPN) and queuing networks are combined at the modelling level by defining Petri Nets including Queuing Networks (PNiQ). The definition is especially designed to allow approximate analysis by aggregation of the queuing nets and their replacement with GSPN elements. Usually the aggregation of combined GSPN and queuing network models is carried out manually, which limits the use of this technique to experts and furthermore may easily lead to modelling errors and larger approximation errors than are inherent in the method. These are avoided by the definition of PNiQ, which shows how to incorporate queuing networks into GSPN and provides interfaces between them. This makes combined modelling easier and less error-prone. Steady-state analysis of the model can be carried out automatically: queuing network parts are analysed with efficient queuing network algorithms for large nets and replaced by GSPN subnets that model the delay of tokens in the queuing network. The resulting GSPN can then be handled with state-of-the-art tools.
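    As an illustration of the aggregation step, assume a queueing subnet happens to be a single M/M/1 station: its mean sojourn time can be computed analytically and then parameterizes the delay (e.g. the rate of an exponential transition) that replaces the subnet in the GSPN. Real PNiQ models use efficient algorithms for whole queueing networks; the single-station case below is only a sketch.

      def mm1_sojourn_time(arrival_rate: float, service_rate: float) -> float:
          """Mean time a token spends in an M/M/1 station: W = 1 / (mu - lambda)."""
          if arrival_rate >= service_rate:
              raise ValueError("station is unstable: arrival rate must stay below service rate")
          return 1.0 / (service_rate - arrival_rate)

      # The delay transition replacing the subnet would then fire with rate 1 / W, e.g.:
      # firing_rate = 1.0 / mm1_sojourn_time(0.5, 2.0)    # lambda = 0.5, mu = 2.0  ->  rate 1.5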
