
Targeting occupant feedback using digital twins: Adaptive spatial-temporal thermal preference sampling to optimize personal comfort models

Note: We do not have the ability to review this paper.

PubDate: Feb 2022

Teams: National University of Singapore

Writers: Mahmoud Abdelrahman, Clayton Miller

PDF: Targeting occupant feedback using digital twins: Adaptive spatial-temporal thermal preference sampling to optimize personal comfort models

Abstract

Collecting intensive longitudinal thermal preference data from building occupants is emerging as an innovative means of characterizing the performance of buildings and the people who use them. These techniques have occupants give subjective feedback via smartphones or smartwatches frequently over the course of days or weeks. The intention is to collect data with high spatial and temporal diversity to best characterize a building and the occupants' preferences. In reality, however, leaving occupants to respond ad hoc or at fixed intervals creates unneeded survey fatigue and redundant data. This paper outlines a scenario-based (virtual experiment) method for optimizing data sampling using a smartwatch to achieve comparable accuracy in a personal thermal preference model with less data. The method uses BIM-extracted spatial data and Graph Neural Network (GNN)-based modeling to find regions of similar comfort preference and to identify the best scenarios for triggering the occupant to give feedback. It is compared to two baseline scenarios based on the spatial context of specific spaces and 4 x 4 m grid squares in the building, using a theoretical implementation on two field-collected data sets. The results show that the proposed Build2Vec method achieves 18-23% higher overall sampling quality than the spaces-based and square-grid-based sampling methods. The Build2Vec method also performs similarly to the baselines when redundant occupant feedback points are removed, but with better scalability potential.
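The core idea of the sampling optimization can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes that occupant locations have already been mapped to zones of similar comfort preference (stand-ins for Build2Vec/GNN-derived clusters), and it uses a hypothetical under-sampling policy to decide when to prompt for feedback instead of prompting at every step.

```python
from collections import Counter

def should_trigger(zone: str, counts: Counter, target_per_zone: int) -> bool:
    """Hypothetical policy: prompt for feedback only while the occupant's
    current comfort-similarity zone is still under-sampled."""
    return counts[zone] < target_per_zone

# Simulated occupant trajectory over zone labels. In the paper's setting,
# these labels would come from clustering spatial embeddings of the building.
trajectory = ["A", "A", "B", "A", "C", "B", "A", "A", "C", "B"]

counts = Counter()   # feedback points collected per zone
prompts = []         # zones where a feedback prompt fired
for zone in trajectory:
    if should_trigger(zone, counts, target_per_zone=2):
        counts[zone] += 1
        prompts.append(zone)

print(prompts)  # 6 prompts instead of 10 fixed-interval prompts
```

With a fixed-interval baseline, all 10 trajectory points would trigger a survey; the adaptive policy stops prompting once a zone has enough samples, which is the redundancy reduction the abstract describes.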
