GRESB Performance Data Standard

The GRESB Performance Data Standard is being developed to define guidelines for performance data reporting to the GRESB Benchmark.

Why we are developing the GRESB Performance Data Standard

Ever since our launch almost a decade ago, GRESB has been committed to providing quality ESG data on real asset investments to the capital markets. The real estate ESG benchmark has grown for 10 successive years and now covers more than 1,000 real estate portfolios, worth $4.1 trillion in GAV. The GRESB Infrastructure Assessment is also growing rapidly with 500 funds and assets participating in 2019.

The reason behind this growth has been our commitment to evolve the benchmark in line with the needs of our stakeholders – ensuring it continues to serve the industry that relies on it. With expectations around quality and validation of ESG data increasing, we’re seeing an opportunity to build on our 3-layer validation process and leverage new advances in technology to take the next step in performance data accessibility and reliability in the benchmark.

The process starts with the development of the GRESB Performance Data Standard to define guidelines for performance data reporting. The Standard will draw on best practices and principles in data quality used by experts across different industries, applying them to ESG data reported through GRESB. By providing clear guidance to the market, the Standard will reduce reporting and validation costs, while at the same time improving the reliability and effectiveness of data collection systems used by the industry.

We’re introducing this Standard because we are committed to providing new levels of insight on ESG risks and opportunities and increased investor confidence in data quality. The efforts will improve transparency and engagement for our investor members and provide participants with more valuable market intelligence about the impact of their sustainability programs. We think this is a worthy investment in the long-term evolution of the GRESB Assessment, offering new analytics and better-quality data across the GRESB ecosystem.

How the Standard will be developed

The GRESB Performance Data Standard will be developed following GRESB’s usual process with oversight by the GRESB Benchmark Committees and Advisory Boards. The Performance Data Standard Technical Working Group (TWG) will define guidelines and inform the content, structure and evolution of the Standard. Once a draft Standard has been developed it will be reviewed by GRESB’s Benchmark Committee and then submitted to the Advisory Board for approval.

The TWG will exist alongside GRESB’s other Industry Working Groups that are helping us to iterate and improve our Standards and Assessments. Taken together, GRESB’s Industry Working Groups are an integral part of the Assessment evolution process, ensuring that we move forward together as an industry to address topics relevant to achieving sustainable real assets.

See more information on GRESB Governance Bodies.

Composition of the Performance Data Standard Technical Working Group (TWG)

Members of the TWG include GRESB Partners (one person per organization), GRESB Participants, and industry experts who have relevant subject matter expertise as well as the time and interest to participate fully. GRESB is convening the group to ensure that members have:

  • High levels of technical expertise in data quality assurance, software engineering, and/or data science.
  • Time to attend the working group meetings (see proposed timeline below).
  • Competence and ability to share information, provide technical input and play an active role in Group discussions.
  • Capacity to review documents distributed in advance of Group meetings and prepare inputs.

Members will be required to meet their own costs of attending meetings and contributing to the work of the TWG. They will not be paid consultancy or sitting fees. GRESB will chair the group and publish meeting minutes together with a list of TWG members (individual and organization).

GRESB Global Partner Measurabl is the Founding Partner of the TWG, in recognition of its leadership in calculating and assuring ESG data quality.

  • TWG members (Name – Organization):
    Ari Frankel – Alexandria Real Estate Equities, Inc.
    Christopher Botten – Better Buildings Partnership
    Hazel M. Sutton – BOMA
    Dan Skidmore – BRE
    Henry Gilks – Carbon Credentials
    Nicola Esposito – CBRE
    Joshua Kace – CodeGreen Solutions
    Hanae Maeda – CSR Design
    Ravi Bajaja – UL Environment & Sustainability
    Conan O'Connor – Energy Profiles Limited
    Michael Zatz – ENERGY STAR
    David Solsky – Envizi
    Nick Hogg – Evora
    Colin Ma – Fabriq
    Jorge Chapa – GBCA
    Erin Vicelja – Goby
    Brian Trainor – Heitman
    Kees van Alphen – hello energy
    Harry Etra – HXE Partners
    James Lee – iContinuum
    Luuk Westerhof – Innax
    Becca Rushin – Jamestown LP
    Eric Duchon – LaSalle Investment Management
    Maryham Farag – Lendlease
    Tema Yara Goodspeed – LORD Green Strategies
    Brianna Jackson – Measurabl
    Fulya Kocak – Nareit
    Ethan Gilbert – Prologis
    Tony Pringle – Quinn & Partners Inc.
    Carl Paulse – Refined Data Solutions Inc.
    Cha Chungha – Re-imagining Cities Foundation
    Hannah Tillmann – RE Tech Advisors
    Christopher Maddern – Schneider Electric
    Jenny TY Law – Swire Properties
    Patti Mason – Switch Automation
    Thomas Saunders – Thinkstep
    Marta Schantz – ULI
    Chris Pyke – USGBC and ARC
    Christopher Hill – Verco Global
    Matt Aberant – WSP
    Randy Moss – Yardi
  • TWG meetings to date (minutes and slides available for download):
    December 11, 2018
    January 22, 2019
    March 7, 2019
    May 7, 2019
    October 10, 2019
    June 23, 2020

GRESB Data Quality Survey Results

  • As part of its ongoing commitment to data quality, GRESB initiated a Technical Working Group (TWG) in late 2018 with the goal of developing a GRESB Performance Data Standard. As a first step towards the development of this standard, GRESB designed a Data Quality Survey together with the TWG. This optional, unscored survey was open to GRESB Data Partners and Participants in the Real Estate Assessment that use the GRESB Asset Portal during the GRESB reporting period, April to July 2019.

    A primary aim of the survey is to work towards the development of the GRESB Performance Data Standard. An important secondary goal is research-oriented: to understand the current landscape of ESG data systems and investigate how to improve the overall quality and trustworthiness of ESG data in the future. This research will be used to inform the development of future versions of the survey and the standard.

    The scope of the 2019 Data Quality Survey is focused on identifying what the data systems of Data Partners and Participants do to address the quality of the data that is transmitted to GRESB. Specifically, the 2019 version of this survey aims to understand what data quality checks are performed on environmental performance data (energy, water, waste, and GHG emissions) and associated building characteristics. The survey is organized around four aspects of data quality: accuracy, completeness, timeliness, and lineage.

  • 13 Data Partners and 3 Real Estate Participants completed the 2019 Data Quality Survey; the 13 Data Partners supported about one-third of all Real Estate Assessment submissions in 2019.

  • Responses to this survey cover a wide variety of data systems, ranging from systems used for managing a handful of assets up to thousands or tens of thousands.

    Despite this variety, all systems are concerned with data quality, with the majority tracking multiple aspects of quality.

    As shown in the chart on the right, the data systems surveyed are used for multiple purposes. All are used for reporting to GRESB, for building management, and for sustainability management.


    The chart on the right shows the data types tracked. Likely because all of the data systems are used to report to GRESB, all of them track the core ESG performance data: energy, water, waste, and GHG emissions.

    A majority also track ancillary data such as building characteristics, building certifications, and energy ratings.


    The chart on the right shows how data is input into data systems, broken down by entry type for each data type; the values shown are averages per data type. Generally, most ESG data is still entered into data systems manually rather than automatically. Other data entry types typically include integrations with third-party providers (which can be automatic via API, or manual). GHG emissions are a special case: many data systems calculate GHG emissions directly from the existing energy data, rather than retrieving or receiving them from an external source, whether automatically or manually.

    This chart also reflects well-known issues in ESG data collection: waste data is difficult to collect automatically, while many energy ratings systems have robust systems in place for automatic data management.
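    To make the entry-type breakdown concrete, here is a minimal Python sketch of how a data system might tally manual versus automatic entries per data type. The field names, entry-type labels, and sample records are hypothetical illustrations, not part of the survey or any GRESB specification.

    ```python
    from collections import Counter, defaultdict

    # Hypothetical data points: each record notes its data type and how it was entered.
    readings = [
        {"data_type": "energy", "entry_type": "automatic"},
        {"data_type": "energy", "entry_type": "manual"},
        {"data_type": "water", "entry_type": "manual"},
        {"data_type": "waste", "entry_type": "manual"},
        {"data_type": "ghg", "entry_type": "calculated"},  # derived from energy data, not entered
    ]

    # Tally entry types per data type.
    breakdown = defaultdict(Counter)
    for reading in readings:
        breakdown[reading["data_type"]][reading["entry_type"]] += 1

    # Report the share of each entry type, per data type.
    for data_type, counts in sorted(breakdown.items()):
        total = sum(counts.values())
        shares = ", ".join(f"{entry}: {count / total:.0%}" for entry, count in counts.items())
        print(f"{data_type}: {shares}")
    ```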
  • The survey is organized around four aspects of data quality: accuracy, completeness, timeliness and lineage.

    Accuracy assesses whether the data values stored in the system are the correct values. To be correct, data values must be the right values and must be represented in a consistent and unambiguous form.

    Completeness assesses the amount of data coverage. To qualify as complete, a data set should have values for all expected data points, or some justification for why the value of an expected data point is missing.

    Timeliness assesses the frequency and time differences between when the data is collected or provided, and when the data is entered into the data system and therefore available for use or analysis. For data to be considered timely, it should be input into the system as frequently as it is collected, and with as short a delay between collection or provision and entry into the system as possible.

    Lineage assesses the traceability or auditability of the data. To be traceable, the path from the point when and where the data was accessed back to when and where it was collected, issued, or measured should be clear or able to be constructed from the information in the data system.

    As the chart to the right shows, all data systems take at least one aspect of data quality into account. The majority assess data quality along all four aspects.
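    As a rough illustration of how these four aspects can be checked in practice, the Python sketch below scores a small set of hypothetical monthly readings on each one. The thresholds, field names, and sample values are assumptions made for the example, not definitions from the survey or the forthcoming Standard.

    ```python
    from datetime import date

    # Hypothetical monthly energy readings: "collected" is the reading date,
    # "entered" is when the value reached the data system, "source" supports lineage.
    readings = [
        {"period": "2019-01", "kwh": 1200, "collected": date(2019, 2, 1), "entered": date(2019, 2, 3), "source": "invoice #1041"},
        {"period": "2019-02", "kwh": 15000, "collected": date(2019, 3, 1), "entered": date(2019, 3, 20), "source": "invoice #1102"},
        {"period": "2019-03", "kwh": 1150, "collected": date(2019, 4, 1), "entered": date(2019, 4, 2), "source": None},
    ]
    expected_periods = ["2019-01", "2019-02", "2019-03", "2019-04"]

    # Accuracy: flag values outside a plausible range (illustrative threshold).
    anomalous = [r["period"] for r in readings if not 500 <= r["kwh"] <= 5000]

    # Completeness: expected periods with no reading at all.
    present = {r["period"] for r in readings}
    missing = [p for p in expected_periods if p not in present]

    # Timeliness: delay in days between collection and entry into the system.
    delays = {r["period"]: (r["entered"] - r["collected"]).days for r in readings}

    # Lineage: readings that cannot be traced back to a source document.
    untraceable = [r["period"] for r in readings if not r["source"]]

    print("anomalous:", anomalous)             # ['2019-02']
    print("missing:", missing)                 # ['2019-04']
    print("entry delay (days):", delays)       # {'2019-01': 2, '2019-02': 19, '2019-03': 1}
    print("no source recorded:", untraceable)  # ['2019-03']
    ```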

  • The chart on the right shows accuracy metrics tracked. The majority of data systems assess the accuracy of their energy, water and waste data through multiple metrics.


    This chart looks at how anomalous data points are identified. A majority of data systems have methods in place to identify anomalous data automatically. Once identified, data systems have a mechanism in place to follow up on these anomalous data points with the data source. Depending on the type of anomaly, the data may be corrected or estimated, or a justification or explanation may be required.


    GHG emissions

    In questions on utility data, the survey did not include GHG emissions as an option, as GHG emissions are typically calculated from energy data rather than measured directly from meter readings or invoices. Somewhat unsurprisingly, 100% of data systems calculate GHG emissions directly from the energy data in the systems. The majority of systems do these calculations internally; some use external tools. Some systems take an additional step to ensure quality and cross-check their internal calculations against external tools.
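    For illustration, here is a minimal sketch of that kind of calculation: deriving GHG emissions from energy consumption using emission factors, with a cross-check against an externally calculated figure. The factor values, fuel names, and tolerance are placeholders, not factors prescribed by any particular system or by GRESB.

    ```python
    # Hypothetical emission factors in kgCO2e per kWh; real systems would use published,
    # location- and year-specific factors (e.g. national grid factors).
    EMISSION_FACTORS = {"grid_electricity": 0.40, "natural_gas": 0.18}

    def ghg_from_energy(consumption_kwh: dict) -> float:
        """Derive total GHG emissions (kgCO2e) from energy consumption by fuel type."""
        return sum(kwh * EMISSION_FACTORS[fuel] for fuel, kwh in consumption_kwh.items())

    # Example: one asset's annual energy consumption by fuel.
    asset_energy = {"grid_electricity": 120_000, "natural_gas": 40_000}
    internal = ghg_from_energy(asset_energy)  # 55,200 kgCO2e with the factors above

    # Cross-check of the kind some respondents describe: compare the internal result
    # against a figure from an external tool and flag large discrepancies.
    external = 55_000  # hypothetical externally calculated figure
    if abs(internal - external) / external > 0.05:
        print(f"Discrepancy: internal {internal:,.0f} vs external {external:,.0f} kgCO2e")
    else:
        print(f"Internal and external figures agree within 5% ({internal:,.0f} vs {external:,.0f} kgCO2e)")
    ```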

  • The chart shows how missing data points are identified. The majority of data systems automatically track missing data. As with anomalous data, the majority of data systems then follow up with the data sources for missing data. For some systems, users may be prevented from further steps in the system unless the missing data is provided.


    The chart shows completeness metrics tracked. As with accuracy, a majority of data systems track the completeness of their data via multiple metrics.
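    A brief sketch of the follow-up workflow described above: detect expected periods with no reading and group the gaps by data source for outreach. The meter names, sources, and period format are illustrative assumptions.

    ```python
    from collections import defaultdict

    # Hypothetical expected monthly periods, meters with their data sources,
    # and the (meter, period) combinations for which readings were actually received.
    expected_periods = ["2019-01", "2019-02", "2019-03"]
    meter_sources = {"meter-A": "utility portal", "meter-B": "site manager"}
    received = {("meter-A", "2019-01"), ("meter-A", "2019-02"), ("meter-B", "2019-01")}

    # Build a follow-up list: for each data source, the meter-periods still missing.
    follow_up = defaultdict(list)
    for meter, source in meter_sources.items():
        for period in expected_periods:
            if (meter, period) not in received:
                follow_up[source].append((meter, period))

    for source, gaps in follow_up.items():
        print(f"Follow up with {source}: {gaps}")
    # Follow up with utility portal: [('meter-A', '2019-03')]
    # Follow up with site manager: [('meter-B', '2019-02'), ('meter-B', '2019-03')]
    ```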

  • As the chart on the right shows, while a majority of data systems track the timeliness of utility data, this aspect of data quality was the least commonly tracked of the four covered in the survey. For those systems that do monitor the timeliness of their data, the mechanisms are often a combination of automatic flagging or reminders and manual review or outreach.

    For systems that mainly collect or receive data automatically, timeliness may not be as important as some other aspects of data quality, as these systems are already designed to push or pull data at regularly scheduled intervals. For data entered manually, timeliness becomes a more important aspect of data quality to track.
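    As a small illustration of that point, the sketch below flags only manually entered readings whose entry delay exceeds a threshold, on the assumption that automatically ingested readings arrive on schedule. The threshold and fields are hypothetical.

    ```python
    from datetime import date

    # Hypothetical readings with collection date, entry date, and entry type.
    readings = [
        {"meter": "meter-A", "collected": date(2019, 2, 1), "entered": date(2019, 2, 2), "entry_type": "automatic"},
        {"meter": "meter-B", "collected": date(2019, 2, 1), "entered": date(2019, 3, 15), "entry_type": "manual"},
    ]

    MAX_DELAY_DAYS = 14  # illustrative tolerance for manually entered data

    late = [
        r for r in readings
        if r["entry_type"] == "manual" and (r["entered"] - r["collected"]).days > MAX_DELAY_DAYS
    ]
    for r in late:
        print(f"{r['meter']}: entered {(r['entered'] - r['collected']).days} days after collection")
    ```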

  • The chart on the right shows lineage metrics tracked. A majority of data systems keep track of the origins of data points. While only about a third of data systems explicitly track how much of their data carries metadata, those that do track multiple types of metadata, including timestamps, meter and/or record types, and units of measurement.

    A significant fraction (8/16) of data systems perform some cross-validation of their data. This consists primarily of comparisons of automatic or manual meter readings against monthly or quarterly invoices.

    Additionally, for the majority of data systems, internal and external audits are performed on the data in the system. Typically, external audits occur on an annual basis, with internal audits occurring more frequently, depending on the type of data under audit.
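    A minimal sketch of the cross-validation described above: summing meter readings over an invoice period and comparing the total with the invoiced amount within a tolerance. The readings, invoice total, and 2% tolerance are illustrative values, not thresholds from the survey.

    ```python
    # Hypothetical daily meter readings (kWh) for one billing period, and the invoice total.
    meter_readings_kwh = [38.0, 41.5, 40.2, 39.8] * 7 + [40.0, 40.0]  # 30 daily values
    invoice_total_kwh = 1210.0

    TOLERANCE = 0.02  # 2% illustrative tolerance

    metered_total = sum(meter_readings_kwh)
    relative_gap = abs(metered_total - invoice_total_kwh) / invoice_total_kwh

    if relative_gap > TOLERANCE:
        print(f"Mismatch: metered {metered_total:.1f} kWh vs invoiced {invoice_total_kwh:.1f} kWh "
              f"({relative_gap:.1%} difference) - flag for review")
    else:
        print(f"Meter total within {TOLERANCE:.0%} of invoice ({relative_gap:.1%} difference)")
    ```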

  • The survey results make it clear that all respondents take ESG data quality seriously. All responses describe measures in place to ensure that the ESG data in the systems is accurate, complete, timely, and traceable. The majority of data systems track multiple metrics of data quality, such as the number of missing meter readings vs. expected, or the expected value of a reading based on past readings. The majority of systems also automatically identify and flag missing or anomalous data, and then follow up with the data sources for clarification or rectification. Half of the data systems cross-validate their ESG data, by comparing, for example, automatic or manual meter readings against monthly or quarterly invoices.

    Overall, the responses to this year’s survey indicate that GRESB Data Partners and Participants not only acknowledge the importance of data quality in ESG data, but also have robust, sophisticated methods in place to ensure that their ESG data reported to GRESB is of high quality.

Feel free to contact us if you have any questions.