One way to improve the energy efficiency of buildings is through energy models that simulate aspects such as power use, cooling, and heat loss through windows. Until now, however, building accurate models for diverse building designs has been very difficult. Over at NICS, Scott Gibson writes that a supercomputer-assisted calibration methodology from Oak Ridge National Laboratory is being used to enhance the accuracy of these energy models.
“The cost required to collect data and tune a model to such accuracy involves so much manual effort that it is rarely employed—outside of research—for energy-service company projects smaller than $1 million,” New says. “An automated methodology for model calibration that realistically adjusts input parameters would eliminate risk from energy savings estimates and open up new business opportunities and energy-savings performance contracts in the light commercial and residential sectors. A cost-effective methodology that can meet Guideline 14 requirements is estimated to lead to a cumulative U.S. energy savings of 27.4 TBtus per year, or $1.6 billion annually.”
Over at ORNL, Joshua New and Jibo Sanyal are leading a project called Autotune that uses advanced analytical and optimization methodologies to tackle this problem. Leveraging terabytes of HPC-generated simulation data, Autotune uses data mining with multiple machine-learning algorithms to quickly calibrate a building energy model to measured (utility or sensor) data. Read the Full Story.
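To make the calibration idea concrete, here is a minimal, self-contained sketch. It is not Autotune's actual method: the real project mines terabytes of EnergyPlus simulation data with multiple machine-learning algorithms, while this toy uses a made-up two-parameter monthly energy model and a plain random search. All function names, parameters, and numbers below are illustrative assumptions.

```python
import random

def simulate_monthly_energy(infiltration, insulation_r, months=12):
    """Toy building energy model (hypothetical stand-in for a real
    simulation engine): monthly kWh driven by two uncertain inputs,
    an infiltration rate and an insulation R-value."""
    base = 900.0
    return [base + 400.0 * infiltration - 15.0 * insulation_r
            + 50.0 * abs(m - 6) * infiltration
            for m in range(months)]

def calibration_error(simulated, measured):
    """Sum of squared differences between simulated and measured use,
    the quantity a calibration procedure tries to minimize."""
    return sum((s - m) ** 2 for s, m in zip(simulated, measured))

random.seed(42)

# Pretend these are twelve months of measured utility-bill data;
# here they come from known "true" parameters so we can check recovery.
true_params = (0.6, 20.0)
measured = simulate_monthly_energy(*true_params)

# Calibrate by random search over plausible parameter ranges
# (Autotune replaces this brute-force step with far more efficient
# data-mining and machine-learning techniques).
best_params, best_err = None, float("inf")
for _ in range(5000):
    candidate = (random.uniform(0.0, 1.0),   # infiltration
                 random.uniform(5.0, 40.0))  # insulation R-value
    err = calibration_error(simulate_monthly_energy(*candidate), measured)
    if err < best_err:
        best_params, best_err = candidate, err

print("calibrated parameters:", best_params, "error:", best_err)
```

The point of the sketch is the shape of the problem: a simulation whose inputs are uncertain, measured data to match, and an automated loop that adjusts the inputs until simulated and measured consumption agree, replacing the expensive manual tuning described above.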