This paper describes an investigation of machine learning for supervisory control of active and passive thermal storage capacity in buildings. Previous studies show that the utilization of active or passive thermal storage, or both, can yield significant peak cooling load reduction and associated electrical demand and operational cost savings. In this study, model-free learning control is investigated for the operation of electrically driven chilled water systems in heavy-mass commercial buildings. The reinforcement learning controller learns to operate the building and cooling plant based on the reinforcement feedback (monetary cost of each action, in this study) it receives for past control actions. The learning agent interacts with its environment by commanding the global zone temperature setpoints and thermal energy storage charging/discharging rate. The controller extracts information about the environment based solely on the reinforcement signal; the controller does not contain a predictive or system model. Over time and by exploring the environment, the reinforcement learning controller establishes a statistical summary of plant operation, which is continuously updated as operation continues. The present analysis shows that learning control is a feasible methodology for finding a near-optimal control strategy that exploits the active and passive building thermal storage capacity, and also shows that the learning performance is affected by the dimensionality of the action and state spaces, the learning rate, and several other factors. It is found that learning control strategies for tasks with large state and action spaces requires a long training time.
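The model-free approach described in the abstract can be illustrated with a deliberately simplified sketch: a tabular learning agent that, for each hour of the day, selects a zone temperature setpoint and a thermal energy storage (TES) charging/discharging rate, observes only the monetary cost of that action, and updates a running statistical summary (a Q-table) of plant operation. The action set, utility prices, and toy cost model below are illustrative assumptions, not the paper's actual plant model or controller.

```python
import random

# Illustrative action space (assumed values, not from the paper):
SETPOINTS = [20.0, 22.0, 24.0]   # global zone temperature setpoints (deg C)
TES_RATES = [-1.0, 0.0, 1.0]     # TES discharge (-) / idle (0) / charge (+)
ACTIONS = [(sp, r) for sp in SETPOINTS for r in TES_RATES]

def cost(hour, setpoint, tes_rate):
    """Toy monetary cost signal: on-peak electricity is expensive, so the
    agent should learn that discharging storage and relaxing the setpoint
    during the peak window is cheap. Purely an assumed stand-in plant."""
    price = 0.20 if 12 <= hour < 18 else 0.05    # $/kWh, assumed peak window
    cooling_load = max(0.0, 26.0 - setpoint)     # crude load vs. setpoint
    grid_energy = cooling_load + tes_rate        # discharge offsets grid draw
    return price * max(0.0, grid_energy)

def learn(episodes=2000, alpha=0.1, epsilon=0.2, seed=0):
    """Model-free, epsilon-greedy tabular learning: no predictive model,
    only reinforcement feedback (cost) updates the statistical summary."""
    rng = random.Random(seed)
    q = {(h, a): 0.0 for h in range(24) for a in range(len(ACTIONS))}
    for _ in range(episodes):
        for h in range(24):
            if rng.random() < epsilon:                    # explore
                a = rng.randrange(len(ACTIONS))
            else:                                         # exploit
                a = min(range(len(ACTIONS)), key=lambda i: q[(h, i)])
            c = cost(h, *ACTIONS[a])
            q[(h, a)] += alpha * (c - q[(h, a)])          # running cost estimate
    return q

q = learn()
# Cheapest learned action during the assumed on-peak window (hour 14):
best_peak = min(range(len(ACTIONS)), key=lambda i: q[(14, i)])
```

Even this bandit-style simplification exhibits the dimensionality issue the abstract notes: the table grows as the product of the state and action discretizations, so finer setpoint or charge-rate grids directly lengthen the exploration time needed before the learned policy is trustworthy.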
Research Papers
Evaluation of Reinforcement Learning for Optimal Control of Building Active and Passive Thermal Storage Inventory
Simeng Liu
Bes-Tech Inc., 3910 South Interstate Highway 35, Suite 225, Austin, TX 78704
e-mail: sliu@bes-tech.net

Gregor P. Henze
Department of Architectural Engineering, University of Nebraska-Lincoln, Omaha, NE 68182
e-mail: ghenze@unl.edu
J. Sol. Energy Eng. May 2007, 129(2): 215-225 (11 pages)
Published Online: October 31, 2006
Article history
Received: May 22, 2005
Revised: October 31, 2006
Citation
Liu, S., and Henze, G. P. (October 31, 2006). "Evaluation of Reinforcement Learning for Optimal Control of Building Active and Passive Thermal Storage Inventory." ASME. J. Sol. Energy Eng. May 2007; 129(2): 215–225. https://doi.org/10.1115/1.2710491
Related Articles
Parametric Analysis of Active and Passive Building Thermal Storage Utilization
J. Sol. Energy Eng (February 2005)
Sensitivity Analysis of Optimal Building Thermal Mass Control
J. Sol. Energy Eng (November 2007)
Near-Optimal Receding Horizon Control of Thermal Energy Storage
J. Energy Resour. Technol (June 2022)
Energy and Cost Minimal Control of Active and Passive Building Thermal Storage Inventory
J. Sol. Energy Eng (August 2005)
Related Chapters
QP Based Encoder Feedback Control
Robot Manipulator Redundancy Resolution
Modeling and Optimal Control for Batch Cooling Crystallization
International Conference on Advanced Computer Theory and Engineering, 4th (ICACTE 2011)
Feedback-Aided Minimum Joint Motion
Robot Manipulator Redundancy Resolution