Department of Computer Science

Research focus

  • Quantitative models: description and analysis
  • Discrete optimization under uncertainty
  • Simulation and optimization methods to support industrial adaptation processes
  • Structured analysis of large data sets and high-dimensional data
  • Computer networks and distributed systems

The traditional focus of research at Chair IV is on methods and techniques for the performance evaluation of computing and communication systems as well as logistic networks. More recently, research on optimization, both under uncertain information and for the time-critical real-time control of factory areas, has been added.

Event-oriented, stochastic models are used for the performance evaluation of systems. The specification of such a model comprises the description of its (discrete) state space as well as the (possibly stochastic) rules of its state transitions. Common techniques for analyzing such models (and thus for evaluating the system) fall into categories characterized as "algebraic", "numerical", and "simulative". Numerous past and ongoing research activities aim at providing broad support for the specification and analysis of appropriate models, for example the consideration of correlations in the specification of arrival and service processes, or the computation of bounds for determining guaranteed assurances in service level agreements. In many cases these research activities lead to concrete tool support, such as the tool ProFiDo (Processes Fitting Toolkit Dortmund), which supports a variety of procedures for fitting and modeling arrival processes, or the SLA tool, which supports the efficient computation of bounds on typical quantitative measures such as response times and service capacities in hierarchical systems.
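The "numerical" category can be illustrated with a minimal sketch, not taken from the tools above: an event-oriented model of a single queue with finite capacity, whose discrete state space is the number of jobs and whose state transitions are arrivals and service completions. Its steady-state distribution follows from solving the balance equations of the generator matrix. All parameter values here are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: an M/M/1 queue truncated at capacity 4, so the
# discrete state space is {0, 1, 2, 3, 4} jobs in the system.
lam, mu, cap = 0.8, 1.0, 4  # assumed arrival and service rates
n = cap + 1

# Generator matrix Q: arrivals move the state up, services move it down.
Q = np.zeros((n, n))
for i in range(n):
    if i < cap:
        Q[i, i + 1] = lam      # arrival (state transition rule)
    if i > 0:
        Q[i, i - 1] = mu       # service completion
    Q[i, i] = -Q[i].sum()      # diagonal makes each row sum to zero

# Numerical analysis: solve pi @ Q = 0 with sum(pi) = 1 by replacing
# one balance equation with the normalization condition.
A = Q.T.copy()
A[-1, :] = 1.0
b = np.zeros(n)
b[-1] = 1.0
pi = np.linalg.solve(A, b)

mean_jobs = (np.arange(n) * pi).sum()  # a typical quantitative measure
print(pi, mean_jobs)
```

For this birth-death structure the result can be cross-checked against the closed-form geometric solution; for larger structured models, only the numerical route remains feasible.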

During the development and operation of technical systems, numerous configuration and design decisions have to be made in order to provide the required services in the most resource-efficient and cost-effective way. These decisions rest on the solution of discrete or mixed discrete-continuous optimization problems, which are often hard to solve because, due to the "combinatorial explosion", the number of alternative solutions grows exponentially with the number of decisions between discrete alternatives. Many practical problems are therefore greatly simplified to make them amenable to algorithmic solution. Moreover, real decisions usually have to be made on the basis of incomplete knowledge. The resulting uncertainty is usually not taken into account in today's standard optimization approaches, although in individual cases it can lead to significant deviations between the computed solution and the true optimum. In our research we consider, among others, numerically solvable models such as Markov Decision Processes (MDPs) whose parameters, in contrast to the classical specification, are not uniquely fixed but are described by discrete or continuous probability distributions.
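A toy sketch can make the contrast concrete. The MDP below (two states, two actions, invented rewards and probabilities, not from the chair's publications) is first solved classically with a fixed "repair success" probability; then the same parameter is treated as uncertain and drawn from a distribution, which makes the optimal value itself a spread rather than a single number.

```python
import numpy as np

gamma = 0.9  # assumed discount factor

def solve(p, iters=500):
    """Value iteration for a tiny MDP.  States: 0 = machine working,
    1 = machine broken.  Actions in state 1: 0 = cheap repair with
    success probability p, 1 = replacement (always succeeds)."""
    # P[a, s, t]: probability of moving from s to t under action a.
    P = np.array([[[0.9, 0.1], [p, 1 - p]],      # action 0
                  [[0.9, 0.1], [1.0, 0.0]]])     # action 1
    # R[s, a]: immediate rewards (repair is cheaper than replacement).
    R = np.array([[1.0, 1.0],
                  [-0.2, -1.0]])
    V = np.zeros(2)
    for _ in range(iters):
        # Bellman optimality update: Q(s,a) = R(s,a) + gamma * E[V].
        Q = R + gamma * np.einsum('ast,t->sa', P, V)
        V = Q.max(axis=1)
    return V, Q.argmax(axis=1)

# Classical specification: p is uniquely fixed.
V_fixed, policy = solve(0.7)

# Uncertain specification: p follows a (here uniform) distribution;
# sampling exposes how much the optimal value can deviate.
rng = np.random.default_rng(0)
values = [solve(rng.uniform(0.4, 0.9))[0][1] for _ in range(20)]
print(V_fixed, policy, min(values), max(values))
```

The spread between `min(values)` and `max(values)` is exactly the deviation that a standard approach, committed to a single point estimate of `p`, would not see.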

In an increasingly dynamic business environment, the demands on the adaptability of factories are growing. Implementing the necessary adjustments in the shortest possible time is crucial for a company's competitiveness. Nowadays the adaptation process is often supported by model-based investigations, which have to consider real-time data to produce valid evaluations. Simulation and forecasting techniques can be used to derive statements about future system behavior, which then feed into corresponding optimization methods to determine a factory configuration that is well adapted to the changed environment. The practicality and robustness of the resulting optimization solutions can in turn be evaluated with simulation models.
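The interplay of simulation and configuration choice can be sketched in a few lines, under invented assumptions (a workstation modeled as parallel exponential machines, a fixed waiting-time target): simulation estimates the behavior of each candidate configuration, and the smallest configuration meeting the target is selected.

```python
import random

def mean_wait(c, lam=3.0, mu=1.0, n_jobs=20000, seed=1):
    """Simulate a workstation with c parallel machines (an M/M/c queue,
    FCFS) and estimate the mean waiting time per job."""
    rng = random.Random(seed)
    t = 0.0
    free_at = [0.0] * c          # time at which each machine becomes idle
    total_wait = 0.0
    for _ in range(n_jobs):
        t += rng.expovariate(lam)              # next job arrival
        k = min(range(c), key=free_at.__getitem__)
        start = max(t, free_at[k])             # wait if all machines busy
        total_wait += start - t
        free_at[k] = start + rng.expovariate(mu)
    return total_wait / n_jobs

# Configuration decision: smallest machine count meeting the target.
target = 0.2  # assumed service-level target for mean waiting time
best = next(c for c in range(1, 20) if mean_wait(c) < target)
print(best, mean_wait(best))
```

In a real adaptation process the arrival rate `lam` would come from real-time data and forecasts rather than being a constant, and the chosen configuration would be re-validated by further simulation runs.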

Today, immense amounts of data can be collected and generated in many application areas. These data help to better understand, evaluate, and control systems. To extract the necessary information from the data, it must be stored, analyzed, and modeled, which requires new data structures and algorithms. Our research deals with the analysis of large structured stochastic models and the evaluation of high-dimensional simulation data. The methods rely on special data structures, such as hierarchical Kronecker representations, tensor trains, or the Tucker decomposition, to store high-dimensional structures compactly without losing too much information. We develop numerical algorithms that perform efficient analyses directly on these compact data structures.
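The core idea behind such compact representations can be shown with a minimal Kronecker sketch (dimensions and matrices invented for illustration): a matrix over the combined state space of two model components is stored only through its small factors, and operations such as matrix-vector products never form the full matrix.

```python
import numpy as np

# Illustrative sketch: a structured model with two components whose
# overall matrix is K = kron(A, B).  K is never formed explicitly; a
# matrix-vector product with K only touches the small factors.
rng = np.random.default_rng(0)
A = rng.random((20, 20))   # component 1 (e.g., 20 local states)
B = rng.random((30, 30))   # component 2 (e.g., 30 local states)
x = rng.random(20 * 30)    # vector over the 600 combined states

def kron_matvec(A, B, x):
    # Identity: kron(A, B) @ x == (A @ X @ B.T).reshape(-1),
    # where X is the row-major reshape of x to (cols(A), cols(B)).
    X = x.reshape(A.shape[1], B.shape[1])
    return (A @ X @ B.T).reshape(-1)

y = kron_matvec(A, B, x)
# Storage: 20*20 + 30*30 = 1300 numbers instead of 600*600 = 360000.
print(y[:3], A.size + B.size, (A.shape[0] * B.shape[0]) ** 2)
```

Tensor trains and the Tucker decomposition push the same principle further: the compression grows with the number of components, so analyses stay feasible where the explicit high-dimensional object would not even fit in memory.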