The success of our decision-making can be impacted by the quality of the data used to inform it. High-quality data gives us the confidence to make decisions efficiently and effectively. This could be something as simple as deciding what time you need to wake up in order to make your 9 am meeting. We use data as the basis for that decision: how long it takes to walk to the train station, and how likely it is that the train will be delayed or canceled.
The same goes for the commercial decisions we make. Fund managers, for example, are under increasing pressure to manage their investments more sustainably year on year, and to do this successfully they require high-quality Environmental, Social and Governance (ESG) data to form the basis of any future management decisions.
‘Garbage in, garbage out’ is the common phrase used to describe how the quality of insights provided is intrinsically linked to the quality of input data. Essentially, if the quality of the data being utilized is poor, then the insights being provided (which the business decisions are ultimately based on) are also likely to be poor.
So, with this in mind, how then can the industry determine the quality of its ESG data?
After looking through several differing definitions of what high-quality data is, it became clear that most point to one thing: ultimately, the data must be suitable for the level of the decision being made. Therefore, to be considered high quality, data must be robust enough that the user can confidently base their decisions upon it.
This makes high-quality data very difficult to quantify, as the need for data quality rises with the importance of the decision. Striking the right balance between cost-effectiveness and robustness is key when collecting performance data, but it is a difficult act to achieve and one that BRE Global has been challenged with in BREEAM on many occasions.
Although this may be the case, there are some simple factors which individuals or companies can start to take into consideration when collecting their ESG performance data. These can provide a better awareness of current data collection processes, as well as some potential quick wins that can improve confidence in the data being collected. These were the factors discussed during the recent Technical Working Group meetings for the GRESB ESG Data Quality standard:
- Is the data collected through a sensor or meter, or has it been calculated based on a series of assumptions?
- Where meters or sensors are used, how accurate is the technology used?
- Where calculations are required in the process of creating data, has a recognized standardized approach been used?
- Are there any fail-safes in the process that help to spot anomalies in the data set, which can then be investigated further?
- Are there any missing data points?
- Does the data cover the entire scope you are recording against? For example, is there a utility meter that is not working or not included in the data set, so that the whole picture is not being portrayed?
- How often is the data collected? For example, is this frequently enough that it is easy to spot changes in performance through time, e.g., identification of high energy use during unoccupied hours?
- Is the data accessible within an appropriate period of time? For example, if it takes a month to get access to the water consumption readings it may be weeks until a leak is identified.
- Is it clear where the data has come from? Is there an audit trail, providing confidence on the source of the data?
- Has the data collected been audited? If so, has the audit been undertaken by an impartial third party?
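Several of the checks above (missing data points, fail-safes that flag anomalies) lend themselves to simple automation. The sketch below is purely illustrative and not a GRESB or BREEAM tool; the data shape and function name are assumptions, and the flagging rule is a deliberately basic example of a fail-safe that prompts investigation rather than delivering a verdict.

```python
# Illustrative sketch only: basic quality checks on a series of monthly
# meter readings. A reading of None represents a missing data point.
from statistics import median

def check_readings(readings):
    """Return a list of human-readable quality issues found in
    a list of (month, kwh) tuples."""
    issues = []

    # Completeness: are there any missing data points?
    missing = [month for month, kwh in readings if kwh is None]
    if missing:
        issues.append("missing readings for: " + ", ".join(missing))

    # Fail-safe for anomalies: flag readings far above the median.
    # (Median is robust to the very outliers we are trying to spot.)
    values = [kwh for _, kwh in readings if kwh is not None]
    if values:
        med = median(values)
        for month, kwh in readings:
            if kwh is not None and med > 0 and kwh > 2 * med:
                issues.append(f"anomalous reading in {month}: {kwh} kWh")

    return issues

data = [("Jan", 410), ("Feb", 395), ("Mar", None), ("Apr", 402),
        ("May", 1260), ("Jun", 398)]
for issue in check_readings(data):
    print("FLAG:", issue)
```

Run against the sample data, this flags the missing March reading and the unusually high May figure; both would then be investigated rather than silently fed into reporting.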
By simply starting to consider these factors in the data collection process, the industry can begin to become aware of the quality of the data it holds. We are all familiar with the challenges of collecting ESG data consistently within the built environment, made even more difficult by the wide range of variables we need to deal with (including differing construction types; asset use types; servicing solutions; accuracy levels of data collection technology, such as sensors; management arrangements; lease types . . . [take a breath] and the list goes on). Therefore, anything we can do as an industry to introduce clarity, transparency and consistency in the way ESG data is collected should be promoted, as this opens the potential for improved comparisons and benchmarking.
As mentioned above, GRESB is doing great work in bringing the industry together to discuss these challenges through its Technical Working Group. This has led to the release of the GRESB Data Quality Survey, which aims to establish the current landscape of ESG data quality and will, in time, inform the creation of a Data Quality standard.
With the ability to collect more data than ever before, it is extremely important for the industry to come together to deliver consistently high-quality ESG data. In doing so, we need to be careful that, as an industry, we are all moving in the same direction. Therefore, I would strongly encourage all organizations to continue to discuss these challenges together rather than proceed in isolation.
Within our BREEAM certification products, BRE Global has always strived to align to robust existing industry standards, providing clarity and consistency to the marketplace, while reducing the need for duplication of work. This includes the standardization of ESG reporting requirements, and we want to continue to work with the industry to drive this forward. BRE Global would be delighted to be involved with future initiatives, and to discuss this further please contact us at: email@example.com
This article is written by Daniel Skidmore, BREEAM Digital Lead