TY - GEN
T1 - A Markov decision process approach to multi-timescale scheduling and pricing in smart grids with integrated wind generation
AU - He, Miao
AU - Murugesan, Sugumar
AU - Zhang, Junshan
PY - 2011/12/1
Y1 - 2011/12/1
N2 - In this study, we tackle the challenge of integrating volatile wind generation into bulk power systems by leveraging multi-timescale scheduling and pricing with two classes of energy users - traditional energy users and opportunistic energy users (e.g., electric vehicles or smart appliances). In day-ahead scheduling, with the distributional information of wind generation and energy demands, decisions on the optimal procurement of conventional energy supply and the day-ahead retail price are made; in real-time scheduling, with the realization of wind generation and the load of traditional energy users, real-time prices are announced to manage the demand of opportunistic energy users so as to achieve system-wide reliability. Focusing on the case where the opportunistic energy users are persistent, i.e., they stay in the system until a real-time retail price is acceptable, we formulate the scheduling problem as a multi-timescale Markov decision process with special characteristics. We then show that it can be recast, explicitly, as a classic Markov decision process with continuous state and action spaces, the solution to which can be found via standard techniques.
AB - In this study, we tackle the challenge of integrating volatile wind generation into bulk power systems by leveraging multi-timescale scheduling and pricing with two classes of energy users - traditional energy users and opportunistic energy users (e.g., electric vehicles or smart appliances). In day-ahead scheduling, with the distributional information of wind generation and energy demands, decisions on the optimal procurement of conventional energy supply and the day-ahead retail price are made; in real-time scheduling, with the realization of wind generation and the load of traditional energy users, real-time prices are announced to manage the demand of opportunistic energy users so as to achieve system-wide reliability. Focusing on the case where the opportunistic energy users are persistent, i.e., they stay in the system until a real-time retail price is acceptable, we formulate the scheduling problem as a multi-timescale Markov decision process with special characteristics. We then show that it can be recast, explicitly, as a classic Markov decision process with continuous state and action spaces, the solution to which can be found via standard techniques.
UR - http://www.scopus.com/inward/record.url?scp=84863129631&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84863129631&partnerID=8YFLogxK
U2 - 10.1109/CAMSAP.2011.6135903
DO - 10.1109/CAMSAP.2011.6135903
M3 - Conference contribution
AN - SCOPUS:84863129631
SN - 9781457721052
T3 - 2011 4th IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2011
SP - 125
EP - 128
BT - 2011 4th IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2011
T2 - 2011 4th IEEE International Workshop on Computational Advances in Multi-Sensor Adaptive Processing, CAMSAP 2011
Y2 - 13 December 2011 through 16 December 2011
ER -