A Markov decision process to dynamically match hospital inpatient staffing to demand

James R. Broyles, Jeffery K. Cochran, Douglas Montgomery

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

Appropriate inpatient staffing levels minimize hospital cost and increase patient safety. Hospital inpatient units dynamically adjust premium staffing (above base staffing) levels by attempting to match their daily demand. Historically, inpatient managers subjectively adjust daily staffing from observing the morning inpatient inventory. Inpatient units strive to match staff with demand in a complex patient throughput environment where service rates and non-stationary profiles are not explicitly known. Related queue control and throughput modeling literature do not directly match staffing with demand, require explicit service process knowledge, and are not formulated for an inpatient unit. This paper presents a Markov decision process (MDP) for dynamic inpatient staffing. The MDP explicitly attempts to match staffing with demand, has a statistical discrete time Markov chain foundation that estimates the service process, predicts transient inventory, and is formulated for an inpatient unit. Lastly, the MDP application to a telemetry unit reveals a computational myopic, an approximate stationary, and a finite horizon optimal policy that is validated through hospital expert experience. The application reveals difficult-to-staff inventory levels and shows that the removal of discharge seasonality can drastically decrease required size of the premium staffing pool and the probability of full occupancy thus improving the inpatient unit's patient flow.
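The abstract describes a finite-horizon MDP whose state is the morning inpatient inventory, whose action is the premium staffing level, and whose transitions come from a discrete-time Markov chain estimate of the service process. The sketch below is a minimal, illustrative instance of that setup solved by backward induction; the transition probabilities, cost parameters, and beds-per-staffer ratio are all invented for illustration and do not come from the paper.

```python
# Toy finite-horizon MDP for daily premium staffing (illustrative only).
# State: morning inpatient inventory (0..MAX_INV occupied beds).
# Action: premium staff added above base (0..MAX_STAFF).
# Cost: premium staffing cost plus a penalty for staff-demand mismatch.
# Transitions: a simple birth-death chain standing in for the paper's
# statistically estimated DTMC service process (assumed, not from the paper).
import numpy as np

MAX_INV, MAX_STAFF, HORIZON = 10, 3, 7
STAFF_COST, MISMATCH_COST = 1.0, 2.0
DEMAND_PER_STAFF = 3  # beds one premium staffer covers (assumed)

def transition(inv):
    """P(next inventory | current inventory): one admit, one discharge, or no change."""
    p = np.zeros(MAX_INV + 1)
    p_admit, p_discharge = 0.3, 0.3
    p[min(inv + 1, MAX_INV)] += p_admit
    p[max(inv - 1, 0)] += p_discharge
    p[inv] += 1.0 - p_admit - p_discharge
    return p

def backward_induction():
    """Finite-horizon dynamic program: minimize expected staffing + mismatch cost."""
    V = np.zeros(MAX_INV + 1)  # terminal value
    policy = np.zeros((HORIZON, MAX_INV + 1), dtype=int)
    for t in reversed(range(HORIZON)):
        V_new = np.empty_like(V)
        for inv in range(MAX_INV + 1):
            best, best_a = float("inf"), 0
            for a in range(MAX_STAFF + 1):
                mismatch = abs(inv - a * DEMAND_PER_STAFF)
                stage_cost = STAFF_COST * a + MISMATCH_COST * mismatch
                total = stage_cost + transition(inv) @ V
                if total < best:
                    best, best_a = total, a
            V_new[inv], policy[t, inv] = best, best_a
        V = V_new
    return policy

policy = backward_induction()
# Higher morning inventory calls for (weakly) more premium staff.
print(policy[0])
```

With these assumed costs the optimal first-day policy is monotone in inventory, which matches the intuitive "staff up when the morning census is high" behavior the abstract attributes to inpatient managers.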

Original language: English (US)
Pages (from-to): 116-130
Number of pages: 15
Journal: IIE Transactions on Healthcare Systems Engineering
Volume: 1
Issue number: 2
DOIs: 10.1080/19488300.2011.609523
State: Published - 2011


Keywords

  • dynamic staffing
  • hospitals
  • Markov decision processes
  • staffing policy
  • stochastic processes

ASJC Scopus subject areas

  • Public Health, Environmental and Occupational Health
  • Safety, Risk, Reliability and Quality
  • Safety Research

Cite this

A Markov decision process to dynamically match hospital inpatient staffing to demand. / Broyles, James R.; Cochran, Jeffery K.; Montgomery, Douglas.

In: IIE Transactions on Healthcare Systems Engineering, Vol. 1, No. 2, 2011, p. 116-130.

Research output: Contribution to journal › Article

@article{1b8b454c5a544fd6bbf5e1df04c974a7,
title = "A Markov decision process to dynamically match hospital inpatient staffing to demand",
abstract = "Appropriate inpatient staffing levels minimize hospital cost and increase patient safety. Hospital inpatient units dynamically adjust premium staffing (above base staffing) levels by attempting to match their daily demand. Historically, inpatient managers subjectively adjust daily staffing from observing the morning inpatient inventory. Inpatient units strive to match staff with demand in a complex patient throughput environment where service rates and non-stationary profiles are not explicitly known. Related queue control and throughput modeling literature do not directly match staffing with demand, require explicit service process knowledge, and are not formulated for an inpatient unit. This paper presents a Markov decision process (MDP) for dynamic inpatient staffing. The MDP explicitly attempts to match staffing with demand, has a statistical discrete time Markov chain foundation that estimates the service process, predicts transient inventory, and is formulated for an inpatient unit. Lastly, the MDP application to a telemetry unit reveals a computational myopic, an approximate stationary, and a finite horizon optimal policy that is validated through hospital expert experience. The application reveals difficult-to-staff inventory levels and shows that the removal of discharge seasonality can drastically decrease required size of the premium staffing pool and the probability of full occupancy thus improving the inpatient unit's patient flow.",
keywords = "dynamic staffing, hospitals, Markov decision processes, staffing policy, stochastic processes",
author = "Broyles, {James R.} and Cochran, {Jeffery K.} and Douglas Montgomery",
year = "2011",
doi = "10.1080/19488300.2011.609523",
language = "English (US)",
volume = "1",
pages = "116--130",
journal = "IISE Transactions on Healthcare Systems Engineering",
issn = "2472-5579",
publisher = "Taylor and Francis Ltd.",
number = "2",
}

TY - JOUR

T1 - A Markov decision process to dynamically match hospital inpatient staffing to demand

AU - Broyles, James R.

AU - Cochran, Jeffery K.

AU - Montgomery, Douglas

PY - 2011

Y1 - 2011

N2 - Appropriate inpatient staffing levels minimize hospital cost and increase patient safety. Hospital inpatient units dynamically adjust premium staffing (above base staffing) levels by attempting to match their daily demand. Historically, inpatient managers subjectively adjust daily staffing from observing the morning inpatient inventory. Inpatient units strive to match staff with demand in a complex patient throughput environment where service rates and non-stationary profiles are not explicitly known. Related queue control and throughput modeling literature do not directly match staffing with demand, require explicit service process knowledge, and are not formulated for an inpatient unit. This paper presents a Markov decision process (MDP) for dynamic inpatient staffing. The MDP explicitly attempts to match staffing with demand, has a statistical discrete time Markov chain foundation that estimates the service process, predicts transient inventory, and is formulated for an inpatient unit. Lastly, the MDP application to a telemetry unit reveals a computational myopic, an approximate stationary, and a finite horizon optimal policy that is validated through hospital expert experience. The application reveals difficult-to-staff inventory levels and shows that the removal of discharge seasonality can drastically decrease required size of the premium staffing pool and the probability of full occupancy thus improving the inpatient unit's patient flow.

AB - Appropriate inpatient staffing levels minimize hospital cost and increase patient safety. Hospital inpatient units dynamically adjust premium staffing (above base staffing) levels by attempting to match their daily demand. Historically, inpatient managers subjectively adjust daily staffing from observing the morning inpatient inventory. Inpatient units strive to match staff with demand in a complex patient throughput environment where service rates and non-stationary profiles are not explicitly known. Related queue control and throughput modeling literature do not directly match staffing with demand, require explicit service process knowledge, and are not formulated for an inpatient unit. This paper presents a Markov decision process (MDP) for dynamic inpatient staffing. The MDP explicitly attempts to match staffing with demand, has a statistical discrete time Markov chain foundation that estimates the service process, predicts transient inventory, and is formulated for an inpatient unit. Lastly, the MDP application to a telemetry unit reveals a computational myopic, an approximate stationary, and a finite horizon optimal policy that is validated through hospital expert experience. The application reveals difficult-to-staff inventory levels and shows that the removal of discharge seasonality can drastically decrease required size of the premium staffing pool and the probability of full occupancy thus improving the inpatient unit's patient flow.

KW - dynamic staffing

KW - hospitals

KW - Markov decision processes

KW - staffing policy

KW - stochastic processes

UR - http://www.scopus.com/inward/record.url?scp=84992445199&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84992445199&partnerID=8YFLogxK

U2 - 10.1080/19488300.2011.609523

DO - 10.1080/19488300.2011.609523

M3 - Article

VL - 1

SP - 116

EP - 130

JO - IISE Transactions on Healthcare Systems Engineering

JF - IISE Transactions on Healthcare Systems Engineering

SN - 2472-5579

IS - 2

ER -