Abstract

Discrete event simulation has been widely applied to study the behavior of stochastic manufacturing systems, because such systems are usually too complex to admit a closed-form analytical model that accurately predicts their performance. This becomes particularly critical when the optimization of these systems is of concern. In fact, simulation optimization techniques are employed to identify the manufacturing system configuration that maximizes the expected system performance when this performance can only be estimated by running a simulator. In this article, we look into simulation-based optimization when a finite number of solutions is available and the best must be identified. In particular, we propose, for the first time, the integration of Optimal Computing Budget Allocation (OCBA), which is based on independent measurements from each simulation experiment, and Time Dilation (TD), which is a single-run simulation optimization algorithm. As a result, the optimization problem is solved within a single experiment of the system by changing the “speed” of the simulation at each configuration in order to control the computational effort. The challenge is how to iteratively select such a speed. We solve this problem by proposing TD-OCBA, which integrates TD and OCBA while relying on standardized time series variance estimators. Numerical experiments were conducted to study the performance of the algorithm when the response is generated from a time series, which makes it possible to test the robustness of TD-OCBA. A comparison between TD-OCBA and the original TD method was performed by simulating a job shop system reported in the literature. Finally, an application involving semiconductor remote diagnostics is used to compare the TD-OCBA method with the equal allocation method.
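For background, the classic OCBA allocation rule (Chen et al.) that TD-OCBA builds on can be sketched as follows. This is an illustrative implementation of the standard rule, not of the TD-OCBA algorithm itself: given sample means and variances for k designs, it splits an additional simulation budget so that the probability of correctly selecting the best design is asymptotically maximized, via the ratios N_i/N_j = (σ_i/δ_i)²/(σ_j/δ_j)² for non-best designs and N_b = σ_b √(Σ_{i≠b} N_i²/σ_i²) for the best. The function name and argument layout are our own.

```python
import math

def ocba_allocation(means, variances, best_idx, total_budget):
    """Split `total_budget` simulation replications across k designs
    following the classic OCBA allocation rule."""
    k = len(means)
    # delta_i: gap between design i's sample mean and the current best's
    deltas = [means[best_idx] - m for m in means]
    # Pick any non-best design as the reference for the pairwise ratios
    ref = next(i for i in range(k) if i != best_idx)
    ratios = [0.0] * k
    for i in range(k):
        if i == best_idx:
            continue
        # N_i / N_ref = (sigma_i / delta_i)^2 / (sigma_ref / delta_ref)^2
        ratios[i] = ((variances[i] / deltas[i] ** 2)
                     / (variances[ref] / deltas[ref] ** 2))
    # N_b = sigma_b * sqrt(sum_{i != b} N_i^2 / sigma_i^2)
    ratios[best_idx] = math.sqrt(variances[best_idx]) * math.sqrt(
        sum(ratios[i] ** 2 / variances[i] for i in range(k) if i != best_idx))
    total_ratio = sum(ratios)
    return [total_budget * r / total_ratio for r in ratios]
```

Note how the rule concentrates budget on close competitors of the current best (small δ_i) and on noisy designs (large σ_i²); TD-OCBA's contribution is to realize this kind of allocation within a single run by adjusting each configuration's simulation speed, using standardized time series variance estimators in place of independent replications.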

Original language: English (US)
Pages (from-to): 219-231
Number of pages: 13
Journal: IISE Transactions
Volume: 51
Issue number: 3
DOI: 10.1080/24725854.2018.1488305
State: Published - Mar 4 2019


Keywords

  • Discrete event systems
  • optimization
  • stochastic simulation

ASJC Scopus subject areas

  • Industrial and Manufacturing Engineering

Cite this

TD-OCBA: Optimal computing budget allocation and time dilation for simulation optimization of manufacturing systems. / Zhu, Yinchao; Pedrielli, Giulia; Hay Lee, Loo.

In: IISE Transactions, Vol. 51, No. 3, 04.03.2019, p. 219-231.

Research output: Contribution to journal › Article

@article{13b49a429cda4a6183d3bebdb8581ff1,
title = "TD-OCBA: Optimal computing budget allocation and time dilation for simulation optimization of manufacturing systems",
abstract = "Discrete event simulation has been widely applied to study the behavior of stochastic manufacturing systems, because such systems are usually too complex to admit a closed-form analytical model that accurately predicts their performance. This becomes particularly critical when the optimization of these systems is of concern. In fact, simulation optimization techniques are employed to identify the manufacturing system configuration that maximizes the expected system performance when this performance can only be estimated by running a simulator. In this article, we look into simulation-based optimization when a finite number of solutions is available and the best must be identified. In particular, we propose, for the first time, the integration of Optimal Computing Budget Allocation (OCBA), which is based on independent measurements from each simulation experiment, and Time Dilation (TD), which is a single-run simulation optimization algorithm. As a result, the optimization problem is solved within a single experiment of the system by changing the “speed” of the simulation at each configuration in order to control the computational effort. The challenge is how to iteratively select such a speed. We solve this problem by proposing TD-OCBA, which integrates TD and OCBA while relying on standardized time series variance estimators. Numerical experiments were conducted to study the performance of the algorithm when the response is generated from a time series, which makes it possible to test the robustness of TD-OCBA. A comparison between TD-OCBA and the original TD method was performed by simulating a job shop system reported in the literature. Finally, an application involving semiconductor remote diagnostics is used to compare the TD-OCBA method with the equal allocation method.",
keywords = "Discrete event systems, optimization, stochastic simulation",
author = "Yinchao Zhu and Giulia Pedrielli and {Hay Lee}, Loo",
year = "2019",
month = "3",
day = "4",
doi = "10.1080/24725854.2018.1488305",
language = "English (US)",
volume = "51",
pages = "219--231",
journal = "IISE Transactions",
issn = "2472-5854",
publisher = "Taylor and Francis Ltd.",
number = "3",

}

TY - JOUR

T1 - TD-OCBA

T2 - Optimal computing budget allocation and time dilation for simulation optimization of manufacturing systems

AU - Zhu, Yinchao

AU - Pedrielli, Giulia

AU - Hay Lee, Loo

PY - 2019/3/4

Y1 - 2019/3/4

N2 - Discrete event simulation has been widely applied to study the behavior of stochastic manufacturing systems, because such systems are usually too complex to admit a closed-form analytical model that accurately predicts their performance. This becomes particularly critical when the optimization of these systems is of concern. In fact, simulation optimization techniques are employed to identify the manufacturing system configuration that maximizes the expected system performance when this performance can only be estimated by running a simulator. In this article, we look into simulation-based optimization when a finite number of solutions is available and the best must be identified. In particular, we propose, for the first time, the integration of Optimal Computing Budget Allocation (OCBA), which is based on independent measurements from each simulation experiment, and Time Dilation (TD), which is a single-run simulation optimization algorithm. As a result, the optimization problem is solved within a single experiment of the system by changing the “speed” of the simulation at each configuration in order to control the computational effort. The challenge is how to iteratively select such a speed. We solve this problem by proposing TD-OCBA, which integrates TD and OCBA while relying on standardized time series variance estimators. Numerical experiments were conducted to study the performance of the algorithm when the response is generated from a time series, which makes it possible to test the robustness of TD-OCBA. A comparison between TD-OCBA and the original TD method was performed by simulating a job shop system reported in the literature. Finally, an application involving semiconductor remote diagnostics is used to compare the TD-OCBA method with the equal allocation method.

AB - Discrete event simulation has been widely applied to study the behavior of stochastic manufacturing systems, because such systems are usually too complex to admit a closed-form analytical model that accurately predicts their performance. This becomes particularly critical when the optimization of these systems is of concern. In fact, simulation optimization techniques are employed to identify the manufacturing system configuration that maximizes the expected system performance when this performance can only be estimated by running a simulator. In this article, we look into simulation-based optimization when a finite number of solutions is available and the best must be identified. In particular, we propose, for the first time, the integration of Optimal Computing Budget Allocation (OCBA), which is based on independent measurements from each simulation experiment, and Time Dilation (TD), which is a single-run simulation optimization algorithm. As a result, the optimization problem is solved within a single experiment of the system by changing the “speed” of the simulation at each configuration in order to control the computational effort. The challenge is how to iteratively select such a speed. We solve this problem by proposing TD-OCBA, which integrates TD and OCBA while relying on standardized time series variance estimators. Numerical experiments were conducted to study the performance of the algorithm when the response is generated from a time series, which makes it possible to test the robustness of TD-OCBA. A comparison between TD-OCBA and the original TD method was performed by simulating a job shop system reported in the literature. Finally, an application involving semiconductor remote diagnostics is used to compare the TD-OCBA method with the equal allocation method.

KW - Discrete event systems

KW - optimization

KW - stochastic simulation

UR - http://www.scopus.com/inward/record.url?scp=85062108729&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85062108729&partnerID=8YFLogxK

U2 - 10.1080/24725854.2018.1488305

DO - 10.1080/24725854.2018.1488305

M3 - Article

VL - 51

SP - 219

EP - 231

JO - IISE Transactions

JF - IISE Transactions

SN - 2472-5854

IS - 3

ER -