Discrete event simulation is widely used to study the behavior of stochastic manufacturing systems, because such systems are usually too complex to admit a closed-form analytical model that accurately predicts their performance. This becomes particularly critical when the optimization of these systems is of concern: simulation optimization techniques are employed to identify the manufacturing system configuration that maximizes the expected system performance when that performance can only be estimated by running a simulator. In this article, we investigate simulation-based optimization when a finite number of candidate solutions is available and the goal is to identify the best one. In particular, we propose, for the first time, the integration of Optimal Computing Budget Allocation (OCBA), which relies on independent measurements from each simulation experiment, and Time Dilation (TD), a single-run simulation optimization algorithm. As a result, the optimization problem is solved within a single experiment of the system by changing the “speed” of the simulation at each configuration in order to control the computational effort. The challenge is how to select this speed iteratively. We address this problem by proposing TD-OCBA, which integrates TD and OCBA while relying on standardized time series variance estimators. Numerical experiments were conducted to study the performance of the algorithm when the response is generated from a time series, which makes it possible to test the robustness of TD-OCBA. TD-OCBA was compared with the original TD method by simulating a job shop system reported in the literature. Finally, an application involving semiconductor remote diagnostics is used to compare TD-OCBA with the equal allocation method.
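To make the budget-allocation idea concrete, the following is a minimal sketch of the classic OCBA allocation ratios (not the TD-OCBA algorithm itself, which additionally controls simulation speed): given current sample means and standard deviations per configuration, more budget is assigned to designs that are noisy and close to the incumbent best. The function name and interface are illustrative assumptions, not from the article.

```python
import numpy as np

def ocba_allocation(means, stds, budget):
    """Split a total simulation budget across designs using the standard
    OCBA ratios, assuming a maximization problem and that `means`/`stds`
    are current sample estimates for each design (all designs distinct)."""
    means = np.asarray(means, dtype=float)
    stds = np.asarray(stds, dtype=float)
    b = int(np.argmax(means))        # index of the current best design
    delta = means[b] - means         # optimality gaps w.r.t. the best
    ratios = np.ones_like(means)
    for i in range(len(means)):
        if i != b:
            # non-best designs: N_i proportional to (sigma_i / delta_i)^2
            ratios[i] = (stds[i] / delta[i]) ** 2
    mask = np.arange(len(means)) != b
    # best design: N_b = sigma_b * sqrt(sum over i != b of (N_i / sigma_i)^2)
    ratios[b] = stds[b] * np.sqrt(np.sum((ratios[mask] / stds[mask]) ** 2))
    return budget * ratios / ratios.sum()
```

For example, with three designs where one trails the best by a small margin, that close competitor receives a larger share of the budget than a clearly inferior design with the same variance.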
Keywords
- Discrete event systems
- Stochastic simulation
ASJC Scopus subject areas
- Industrial and Manufacturing Engineering