The GRESB Data Quality Standard is paving the way to digital facilities management for the CRE industry. After launching more than 10 years ago with the goal of providing quality Environmental, Social and Governance (ESG) data about real estate investments to the capital markets, GRESB is now focused on leveraging technology to elevate the quality of non-financial real estate data.
As the newly minted VP of Operations at Switch Automation, I had the pleasure of visiting New York City to join GRESB’s Data Quality Technical Working Group. Switch is a Premier GRESB Partner, which means that our Engineering and Leadership teams are working with GRESB’s technical committee to establish a data quality standard for the industry.
Evolving an industry
GRESB is drawing on data quality best practices across industries to apply a new standard to how GRESB reports ESG data. This new standard seeks to reduce reporting and validation burdens.
The GRESB Data Quality Technical Working Group, a cohort of 30+ industry experts and GRESB partners with varied subject matter expertise, is an integral part of the Standard's evolution process. With oversight from the GRESB Benchmark Committees and Advisory Boards, this team will define data quality rules and inform the content, structure and evolution of the Data Quality Standard. No easy task, given the many challenges real estate teams face when operating large portfolios.
Consistency is key
The Standard will need to consider several factors. Every building is unique, and equipment is often proprietary, varying by site. Multiple contractors spanning many disciplines add to a building's intricate operations. A variety of architectures, including cloud, on-premises and vendor-hosted, often silo data into sometimes unreachable places. Diverse communication protocols, like BACnet, Lon and Modbus, mean different languages for data interpreters.
Inconsistent naming and tagging standards add further complexity to the pool of raw data. If all that weren’t enough to tackle, there’s the onslaught of cybersecurity threats and plethora of IoT sensors and platforms designed to gather building intelligence. Our industry is rapidly evolving and the traditional facilities workforce isn’t equipped with the IT expertise needed to ensure data quality and actualize its valuable insights in this digital world. How can we advance while optimizing our assets for the highest performance?
Activating digital facilities management
Digital facilities management is the way of the future, but the journey requires thoughtful investment of time, money and human capital. To better leverage our precious human resources, we’ll need to shift away from spreadsheets and data wrangling, instead leaning in to technological solutions. By automating tedious manual data capture and using existing standards like Brick and Haystack to translate our mountains of information, we can make more timely, cost-effective decisions about how to manage our buildings.
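To make the "translate our mountains of information" idea concrete, here is a minimal sketch of automated point tagging. The point-name patterns and tag sets are illustrative assumptions, not an official Brick or Haystack mapping; real portfolios would maintain far richer dictionaries per vendor and site.

```python
import re

# Hypothetical mapping of raw BMS point-name patterns to Haystack-style tag
# sets. These rules are examples only; every site's naming conventions differ.
TAG_RULES = [
    (re.compile(r"ZN[-_ ]?T(EMP)?", re.I), {"zone", "air", "temp", "sensor"}),
    (re.compile(r"SA[-_ ]?T(EMP)?", re.I), {"discharge", "air", "temp", "sensor"}),
    (re.compile(r"KWH?", re.I), {"elec", "energy", "meter"}),
]

def tag_point(raw_name: str) -> set:
    """Return a Haystack-style tag set for a raw point name, or an empty set
    when no rule matches (flagging the point for human review)."""
    for pattern, tags in TAG_RULES:
        if pattern.search(raw_name):
            return tags
    return set()
```

Automating even this first pass means technicians review only the unmatched points instead of hand-tagging every one.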
Does this mean humans are out? Quite the contrary. By embracing technology to turn challenges into opportunities, we’re reimagining our industry while creating desirable careers for today’s emerging workforce.
As we begin the journey from manual data aggregation, analysis and reporting toward tech-enabled data platforms and processes, what should we look for? Listening to some of the experts in the room this week, an ideal solution should:
- Scale to accommodate portfolio changes and technology advancements
- Provide a variety of data ingestion methods, e.g. SFTP, CSV, API
- Support industry-recognized standards like Brick Schema, Haystack and ASHRAE
- Auto-validate incoming data
- Normalize data to drive actionable insights
- Go beyond energy efficiency to drive timely, informed business decisions
- Enable collaboration between end users, site technicians and 3rd party vendors
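As a sketch of the "auto-validate incoming data" item above, the check below scans a CSV export of meter readings for malformed timestamps and implausible values. The column names and the kWh threshold are assumptions for illustration, not a GRESB rule set.

```python
import csv
import io
from datetime import datetime

# Illustrative expectations for an incoming meter-data CSV.
EXPECTED_COLUMNS = {"timestamp", "meter_id", "kwh"}
MAX_PLAUSIBLE_KWH = 10_000  # flag obvious outliers for review

def validate_rows(csv_text: str) -> list:
    """Return human-readable data-quality issues found in a CSV export."""
    issues = []
    reader = csv.DictReader(io.StringIO(csv_text))
    if set(reader.fieldnames or []) != EXPECTED_COLUMNS:
        return ["unexpected columns: {}".format(reader.fieldnames)]
    for line_no, row in enumerate(reader, start=2):
        try:
            datetime.fromisoformat(row["timestamp"])
        except ValueError:
            issues.append("row {}: bad timestamp {!r}".format(line_no, row["timestamp"]))
        try:
            kwh = float(row["kwh"])
            if kwh < 0 or kwh > MAX_PLAUSIBLE_KWH:
                issues.append("row {}: implausible kWh value {}".format(line_no, kwh))
        except ValueError:
            issues.append("row {}: non-numeric kWh {!r}".format(line_no, row["kwh"]))
    return issues
```

Running rules like these at ingestion time, rather than during annual reporting, is what turns data quality from a scramble into a routine.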
The Standard for success
Every CRE owner and operator is challenged by the Broken Buildings Hierarchy of Needs. In the quest to achieve data quality, sustainability