Abstract
Appropriate inpatient staffing levels minimize hospital cost and increase patient safety. Hospital inpatient units dynamically adjust premium staffing levels (staffing above the base level) in an attempt to match daily demand. Historically, inpatient managers have adjusted daily staffing subjectively, based on observation of the morning inpatient inventory. Inpatient units strive to match staff with demand in a complex patient throughput environment in which service rates and non-stationary profiles are not explicitly known. The related queue control and throughput modeling literature does not directly match staffing with demand, requires explicit knowledge of the service process, and is not formulated for an inpatient unit. This paper presents a Markov decision process (MDP) for dynamic inpatient staffing. The MDP explicitly attempts to match staffing with demand, has a statistical discrete-time Markov chain foundation that estimates the service process, predicts transient inventory, and is formulated for an inpatient unit. Lastly, applying the MDP to a telemetry unit yields a computational myopic policy, an approximate stationary policy, and a finite-horizon optimal policy, validated through hospital expert experience. The application reveals difficult-to-staff inventory levels and shows that removing discharge seasonality can drastically decrease the required size of the premium staffing pool and the probability of full occupancy, thus improving the inpatient unit's patient flow.
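As a rough illustration of the finite-horizon formulation summarized above, the sketch below solves a small staffing MDP by backward induction, with the inpatient census (inventory) as the state and the number of premium staff added above the base level as the action. The census transition matrix, unit size, staffing ratio, and cost parameters are all assumed placeholders for this sketch, not the paper's statistically estimated telemetry-unit model.

```python
import numpy as np

# Illustrative finite-horizon MDP for matching premium staffing to the inpatient census.
# All parameters below (unit size, horizon, rates, costs) are assumptions for the sketch.

N_BEDS = 20                 # maximum census; states are 0..N_BEDS
HORIZON = 7                 # planning horizon in staffing periods
ACTIONS = range(0, 4)       # premium staff added above the base level
NURSE_RATIO = 5             # assumed patients safely covered per nurse
BASE_NURSES = 3             # base staffing level
PREMIUM_COST = 1.0          # cost per premium nurse per period
SHORTAGE_COST = 4.0         # cost per patient above staffed capacity

def census_transition_matrix(p_admit=0.3, p_discharge=0.25):
    """Assumed discrete-time Markov chain for the census: at most one admission
    and one discharge per period (a placeholder for an estimated service process)."""
    P = np.zeros((N_BEDS + 1, N_BEDS + 1))
    for s in range(N_BEDS + 1):
        up = p_admit if s < N_BEDS else 0.0
        down = p_discharge if s > 0 else 0.0
        P[s, min(s + 1, N_BEDS)] += up * (1 - down)
        P[s, max(s - 1, 0)] += down * (1 - up)
        P[s, s] += 1 - up * (1 - down) - down * (1 - up)
    return P

def stage_cost(census, premium):
    """Immediate cost: premium pay plus a penalty for patients above staffed capacity."""
    capacity = (BASE_NURSES + premium) * NURSE_RATIO
    shortage = max(census - capacity, 0)
    return PREMIUM_COST * premium + SHORTAGE_COST * shortage

def solve_finite_horizon():
    """Backward induction: value[t, s] is the minimum expected cost-to-go from
    census s with t periods elapsed; policy[t, s] is the minimizing action."""
    P = census_transition_matrix()
    value = np.zeros((HORIZON + 1, N_BEDS + 1))
    policy = np.zeros((HORIZON, N_BEDS + 1), dtype=int)
    for t in range(HORIZON - 1, -1, -1):
        for s in range(N_BEDS + 1):
            costs = [stage_cost(s, a) + P[s] @ value[t + 1] for a in ACTIONS]
            policy[t, s] = int(np.argmin(costs))
            value[t, s] = min(costs)
    return policy, value

if __name__ == "__main__":
    policy, value = solve_finite_horizon()
    print("Premium staff to add in the first period, by observed census:")
    print(policy[0])
```

In this toy setting, a myopic policy corresponds to minimizing only the immediate stage cost, which the same routine recovers with HORIZON = 1; the paper's policies are instead built on its statistically estimated, non-stationary inpatient model.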
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 116-130 |
| Number of pages | 15 |
| Journal | IIE Transactions on Healthcare Systems Engineering |
| Volume | 1 |
| Issue number | 2 |
| DOIs | |
| State | Published - 2011 |
Keywords
- dynamic staffing
- hospitals
- Markov decision processes
- staffing policy
- stochastic processes
ASJC Scopus subject areas
- Public Health, Environmental and Occupational Health
- Safety, Risk, Reliability and Quality
- Safety Research