
Make sure your data models are transparent and not a black box; LPs look to asset cashflows and occupancy.

NAREIM Data Strategy Meeting,

May 18, 2021


There is an increasing focus among investors and consultants on asset-level data, to help them understand their exposures as well as to better predict their own capital calls and conduct manager underwriting.


During NAREIM's Data Strategy meeting this week, investors and consultants spoke with NAREIM members about their own reporting processes, how they use the data – and why they’d like to dig deeper into asset-level information, including projected cashflows, occupancy and unlevered returns.


One key piece of feedback was that asset-level data helps investors not only refine their own pacing models for future capital calls and distributions, but also manage risk and report to their internal risk teams.


The Data Strategy Meeting, held on May 18, also heard from managers about new tools they've adopted in the wake of Covid, and how they're increasing user adoption. The biggest takeaway, though, was about trust in the data and models:


Data has to be about trust and transparency, the meeting heard: NAREIM members need to ensure their models – and the data underpinning the modeling – are trusted by the business user and don’t become a black box. If there’s no trust in the type of data being used and in the models themselves, the tool won’t be used. Read more in the case study below.


Download the presentations and meeting attendee list here.


Key highlights also included:


  • Importance of data sandboxes. During a case study of a tenant exposure tool, one member described how they’d used sandboxes – protected, shared environments where models can be built and experiments run without harm to wider enterprise databases – as critical not only to proving the case for a tool or data set, but also to getting sign-off and resources to develop the resulting dashboard.

  • Tenant health. Can data be used to better predict tenant health? Creditworthiness scores are often outdated, and the key takeaway from the Data Strategy meeting was that asset managers themselves are the best tool for understanding and evaluating tenant health, through regular engagement. But there is value in marrying that high-touch process with data – including mobility data from phones, credit card data and payroll data (such as from ADP) – particularly for property types such as multifamily. See the scoring sketch after this list.

  • Don’t forget the people as part of any data strategy or project. You cannot get where you need to be without addressing change management – and how your people will be affected by, and react to, new ways of working.

  • Create data stewards in your analyst pool. One key challenge is keeping data clean and correcting errors when they occur. As well as education around the impact of unclean data – i.e., if you don’t update your data for one or two months after an alert, this is the team member you’re affecting – look to data stewards: people on the front lines who know the information and the business process. Even if they cannot update the data themselves, they know who to contact to get it done. Data stewards are also the people on the ground analyzing when something new is needed, what to update and what needs checking.

  • Leveraging PowerBI to bring dashboard functionality to all team members. The firm used off-the-shelf products for specific group functions – such as Yardi, VTS and Juniper Square – which fed into a data warehouse. PowerBI was layered on top to give more employees access to the data, which during Covid pivoted to a focus on rent collections. Asset management teams in particular could immediately see lease rolls, daily rents and NAICS code compositions. The next focus is on how to standardize the tenant interviews conducted at acquisition – which contain a wealth of valuable asset and property management information – and bring them into current systems, along with vendor management. “It’s a constant work in progress,” the firm said.

  • Create testable rules for your data. You know what you expect from your data, but do you have testable rules that let you compare those expectations against actual performance and delivery? Create rules that test for timeliness, completeness, accuracy and consistency. A minimal rules sketch follows this list.
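
On the tenant health bullet above, here is a minimal sketch of how a high-touch asset-manager assessment might be blended with external data signals such as mobility, card spend and payroll trends. The field names, weights and sample values are illustrative assumptions, not anything presented at the meeting.

```python
# Hypothetical sketch: blend an asset manager's qualitative tenant
# assessment with external data signals. All field names, weights and
# sample values are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TenantSignals:
    manager_score: float     # 0-1, from regular asset-manager engagement
    mobility_trend: float    # 0-1, e.g. foot traffic vs. pre-Covid baseline
    card_spend_trend: float  # 0-1, card spend vs. trailing average
    payroll_trend: float     # 0-1, e.g. headcount trend from payroll data

def tenant_health(s: TenantSignals) -> float:
    """Weighted blend; the high-touch manager view anchors the score."""
    weights = {"manager": 0.5, "mobility": 0.2, "card": 0.15, "payroll": 0.15}
    return (weights["manager"] * s.manager_score
            + weights["mobility"] * s.mobility_trend
            + weights["card"] * s.card_spend_trend
            + weights["payroll"] * s.payroll_trend)

print(round(tenant_health(TenantSignals(0.8, 0.6, 0.7, 0.9)), 3))  # 0.76
```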
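
And on testable data rules, a minimal sketch of timeliness, completeness, accuracy and consistency checks over a rent-roll table, assuming pandas; the column names and thresholds are hypothetical, not a NAREIM standard.

```python
# Hypothetical sketch of testable data-quality rules for a rent-roll table.
# Column names and thresholds are assumptions for illustration.
import pandas as pd

def run_quality_rules(df: pd.DataFrame, as_of: pd.Timestamp) -> dict:
    return {
        # Timeliness: every record updated within the last 31 days
        "timeliness": bool((as_of - df["last_updated"]).dt.days.le(31).all()),
        # Completeness: no missing tenant IDs or rents
        "completeness": bool(df[["tenant_id", "monthly_rent"]].notna().all().all()),
        # Accuracy: rents fall in a plausible range
        "accuracy": bool(df["monthly_rent"].between(0, 1_000_000).all()),
        # Consistency: occupied units must carry a rent greater than zero
        "consistency": bool((~df["occupied"] | df["monthly_rent"].gt(0)).all()),
    }

df = pd.DataFrame({
    "tenant_id": ["T1", "T2"],
    "monthly_rent": [2400.0, 0.0],
    "occupied": [True, False],
    "last_updated": pd.to_datetime(["2021-05-10", "2021-05-12"]),
})
print(run_quality_rules(df, pd.Timestamp("2021-05-18")))
```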


Case study deep dive: How to use data more effectively in underwriting, the deal pipeline and the investment committee process:


During a case study review, one member presented a tool that used new and existing data to better predict the three-year rent growth of micro locations.


Definition:

Micro locations were smaller markets or sub-markets within large metropolitan areas, based not on MSA designations but on 1.5 sq mile hexagons that were aggregated in the tens of thousands to create micro locations.
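
To make the hexagon idea concrete, here is a minimal sketch of binning asset coordinates into fixed-size hexagonal cells, which could then be aggregated into micro locations. It uses standard axial hex-grid math on projected x/y coordinates; the cell size, helper names and sample points are assumptions, not the member's actual implementation (production systems often reach for a library such as Uber's H3 instead).

```python
# Hypothetical sketch: bin projected x/y coordinates (in meters) into a
# fixed-size hexagonal grid, then group records by hex cell. Hex math
# follows the standard axial-coordinate scheme for pointy-top hexagons.
import math
from collections import defaultdict

HEX_SIZE = 1223.0  # assumed circumradius in meters; gives ~1.5 sq mi per hex

def xy_to_hex(x: float, y: float, size: float = HEX_SIZE) -> tuple:
    # Fractional axial coordinates for a pointy-top hex grid
    q = (math.sqrt(3) / 3 * x - y / 3) / size
    r = (2 / 3 * y) / size
    # Cube rounding to snap to the nearest hex cell
    s = -q - r
    rq, rr, rs = round(q), round(r), round(s)
    dq, dr, ds = abs(rq - q), abs(rr - r), abs(rs - s)
    if dq > dr and dq > ds:
        rq = -rr - rs
    elif dr > ds:
        rr = -rq - rs
    return (rq, rr)

# Aggregate assets into hex cells; cells can later be merged into micro locations
assets = [(1200.0, 300.0, "asset_a"), (1250.0, 340.0, "asset_b"), (9000.0, 8000.0, "asset_c")]
cells = defaultdict(list)
for x, y, name in assets:
    cells[xy_to_hex(x, y)].append(name)
print(dict(cells))  # the two nearby assets share a hex cell
```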


Goal:

The goal was to move the data models from linear to non-linear relationships, creating a much more refined tool – one able to more accurately determine performance within smaller locations/markets and to use more inputs that reflect changing trends in a specific area. Data used included IRS, Zillow and Census information covering population, employment and income trends.
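
To illustrate the linear-to-non-linear shift, here is a minimal sketch comparing a linear regression against a gradient-boosted model on synthetic micro-location features, assuming scikit-learn; the features, data and model choice are invented for illustration and are not the member's actual model or inputs.

```python
# Hypothetical sketch: compare a linear model with a non-linear one for
# predicting 3-year rent growth from micro-location features. The synthetic
# data and feature set are assumptions for illustration only.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
# Assumed features: population growth, employment growth, income growth
X = rng.normal(size=(n, 3))
# Synthetic target with an interaction term a linear model cannot capture
y = (0.02 * X[:, 0] + 0.01 * X[:, 1] + 0.03 * X[:, 0] * X[:, 2]
     + rng.normal(scale=0.005, size=n))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (LinearRegression(), GradientBoostingRegressor(random_state=0)):
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(type(model).__name__, round(r2_score(y_te, pred), 3))
# The non-linear model should score materially higher on this data
```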


Takeaway:

However, it’s all about trust and transparency: ensuring the models – and the data underpinning the modeling – are trusted by the business user and don’t become a black box. If there’s no trust in the type of data being used and in the models themselves, the tool won’t be used.


The tool is in its early stages and can help transaction teams understand the best and worst performing micro locations – and gain more insight into the areas and assets in the middle ground. It’s also highlighting where other new data sources, such as Yelp reviews, could be helpful.
