Bart: Are you feeling worried about the looming deadline for FASB and IFRS lease accounting compliance? Well if you are, you’re not alone. Welcome to Tango’s Road to Lease Accounting Compliance podcast series, where we’ll be discussing the steps your organization needs to take to ensure it’s ready. We’ll be covering everything from setting strategy to understanding policy, handling data requirements, picking the best technology, and finally, institutionalizing the right processes and controls so that they represent a permanent shift in the way you account for leases.
In today’s podcast episode, FASB and IFRS Data Traps, we will be discussing common data traps companies encounter as they journey down the road to lease accounting compliance. We’ll also pass on some hard-earned lessons and strategies around data management and how you can make sure your lease data is accurate, complete, and clean.
I’m joined today by Aman Gill, Vice President of Management Consulting here at Tango. Aman has more than fifteen years of consulting experience, the majority of which has been spent in the Integrated Workplace Management Systems, or IWMS, space. Given his responsibility here at Tango for end-to-end client delivery, Aman has run some of the largest leasing implementations in the world, including two Fortune 100 deployments, one covering more than 200 countries and the other spanning more than 60 countries. So suffice it to say, Aman has a wealth of experience he can share regarding data traps and how to avoid them.
Welcome Aman.
Aman: Thanks, Bart. It’s a pleasure to be here with you today.
Bart: Hey well thanks. Okay, before we jump in, let’s look at some of the revealing metrics about data and the new lease accounting standards. EY conducted a survey of more than 125 companies, spanning 20 industry segments, to gauge readiness for the new FASB and IASB lease accounting standards. Not surprisingly, a full 66% of respondents in that survey stated that, quote-unquote, ‘creating an inventory of all the necessary data points was going to be moderately to significantly difficult.’ Also, PwC and CBRE conducted a 2017 lease accounting survey, which polled more than 600 respondents across 13 industry segments, and in that survey, data collection was rated as the most difficult activity, with 76% of respondents saying it would be somewhat to very difficult to accomplish.
So data, and the complexities and risks that come with it, may likely be the number one challenge facing companies as it relates to lease accounting compliance. We don’t have enough time to cover all of the likely FASB and IFRS data traps, but I would like to focus in on some of the most common issues before we turn our attention to proactive data strategies and approaches that will help reduce the complexity and risk in your compliance timeline.
Okay, so Aman, let’s jump into this and tackle the first data trap, which is really the fact that data is likely stored in multiple decentralized databases or physical files.
Aman: One of the best practices, from a lease compliance standpoint, is to have all of these contracts, be it real estate leases, equipment leases, or embedded service contracts, in a single lease administration and accounting system. What I’ve seen over the last couple of years as we’ve been working through this is that that’s not really the case for most companies. From a real estate perspective, most companies are in a pretty good place, because they’ve been using a purpose-built system to manage the leases, the real estate leases that is, and they’ve got focused departments that work on the real estate side. But the equipment leases are a much bigger challenge, because usually they’re not centralized within a department, and therefore companies are using different methods of cataloging and managing these leases, be it Access databases or Excel files, et cetera.
So this really becomes a big issue from an accounting perspective. What we’ve been seeing is that most companies out there are trying to consolidate into single departments or local offices to manage all this. So the best practice here would really be to consolidate into one system for all of these types of leases, be it real estate, non-real estate, or embedded leases, get the specific data sets, and invest the time necessary to map all the data elements into that centralized database, so that from an accounting perspective, you’re in a better place.
What we’ve seen is that you’ll likely need to abstract many of these leases, so either engage a partner or internally go through your list of all of these contracts and physically abstract them into the new tool. We’ve seen this with a customer doing a global implementation, where we took one tool as a kind of central repository into which all of the multiple-point-system data was brought. It was then transformed and put into the target system.
Another example in this realm is where some companies are using quite antiquated lease administration systems, so they don’t have the data they need from an accounting perspective. That also becomes a big challenge as we try to merge into a system that is built for lease accounting. So those are a couple of things to look at; what you’re going to find is that it’s often an ideal case for physical lease abstraction.
Bart: Yes, I would definitely see that as one of the major challenges across our implementations, for sure.
Okay, let’s move on to the next and closely related data trap, this one being the identification of embedded leases and the capturing of all the necessary data, whether it’s from embedded leases, equipment, or other real estate leases, to drive FASB and IFRS lease accounting calculations. The new regulations bring all of these lease obligations onto the balance sheet, and in order to comply, companies have to sift through all the contracts at the enterprise level and understand which of these contracts are leases and which aren’t. I know that kind of throws a wrinkle in things and is hard for folks. So why don’t we talk a little bit about that, Aman.
Aman: Yeah, you’re right Bart. This is a problem we’re running into with all clients, as they need to assess all the contracts they have in their portfolio today. Our recommended best practice in this case, which has worked well for many clients, is to develop a standardized contract assessment protocol, or what we call a decision tree. Doing so will drive consistency in identifying leases and help create an inventory of leases you’ll need to gather the data around. Once you’ve done that, the next step is to set up a standard data structure for gathering or abstracting all the lease data, be it the real estate leases or the asset leases.
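To make that decision tree idea a bit more concrete, here is a minimal sketch in Python. The specific questions and field names are illustrative assumptions rather than Tango’s actual protocol, but they follow the core test under ASC 842 and IFRS 16: does the contract convey the right to control the use of an identified asset?

```python
from dataclasses import dataclass

@dataclass
class Contract:
    """Answers captured by a (hypothetical) contract assessment questionnaire."""
    contract_id: str
    identifies_specific_asset: bool   # Is there an identified asset, explicit or implicit?
    supplier_can_substitute: bool     # Can the supplier substitute the asset for its own benefit?
    customer_gets_benefits: bool      # Does the customer get substantially all economic benefits?
    customer_directs_use: bool        # Does the customer direct how and for what purpose it is used?

def classify(contract: Contract) -> str:
    """Walk the decision tree and return 'lease' or 'not a lease'."""
    if not contract.identifies_specific_asset:
        return "not a lease"
    if contract.supplier_can_substitute:
        return "not a lease"
    if contract.customer_gets_benefits and contract.customer_directs_use:
        return "lease"
    return "not a lease"

# Example: a services contract that dedicates specific equipment to the customer.
print(classify(Contract("CTR-0042", True, False, True, True)))  # -> "lease"
```

In practice, every contract that comes out of a tree like this as a lease goes onto the inventory that then feeds the abstraction and data gathering work.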
The good news is that equipment leases and embedded leases require less data than real estate leases, so although the volume may be greater, the amount of work required per lease is less than on the real estate side. The bad news, though, is that there may be more of them, and you need to identify them all through a manual process, which we’ve seen can be a challenge just from a labor perspective; some companies aren’t staffed up to handle it. We have also seen customers with operations in many countries that have less rigid protocols for document control, and there this becomes a big challenge.
We’ve got an example from one customer where we were in the final stage of testing and they found a fairly significant lease that made up a big share of the monthly base rent, but it wasn’t in their existing Excel system for tracking. It was identified really late and led to reconciliation differences in payments. So it’s good to get started early, like I said, and follow that protocol and standard as you go through this.
Bart: Right, and that contract assessment protocol, or decision tree as you called it, is likely going to have to live beyond the initial compliance period as companies need to operationalize compliance. A lot of companies we’re talking to are looking at taking that decision tree and that assessment of each contract and inserting it into the procurement process, so that you can catch things early on, determine if a contract is a lease or not, and therefore get it into the right process to be accounted for and managed accordingly. So that makes sense.
Alright, well let’s tackle a few more of these and, for argument’s sake, let’s fast forward to a point where you’ve got all the data ready and staged, and now you need to extract that data from the source systems, likely transform it, and then load it into the new lease administration and accounting system. In this case the volume is going to be pretty substantial, because we’re talking about hundreds, if not thousands, of leases, with each lease having upwards of 100 data points, so a lot of data. This obviously opens up a potential data trap, one that we’ve seen in particular, around load size and performance considerations. So, what are your thoughts on that?
Aman: You know Bart, that’s a great point, load size and performance. When I think back to a lot of the customers we’re working with, we’ve got some customers loading twenty, thirty, forty thousand contracts, so substantial numbers; and when you take into consideration some of the historical lease data, such as past payments, reconciliations, and so forth, it can really increase the amount of data exponentially. So in our experience, when you’re loading this much data, what you need to do is have a detailed plan and strategy going in, and I’ll talk about that a little bit later as well. But the goal is really to ensure that you have accurate and fast data loads with a low failure rate, and by low failure rate, what I mean is that we don’t want to load garbage data in; we want to make sure what we’re loading is good for testing and that we’re not going to have wasted cycles with data that really makes no sense for the users. So the best practice here is really to define the dataset that you absolutely need for go-live and, potentially after go-live, load incremental ancillary data if possible. And by ancillary, I mean things like comments and documents, things you don’t really need Day 1. For example, you need payments Day 1, right? But some of this other stuff you don’t.
Another best practice is to validate the extracted and transformed data prior to loading it into the target system, so that you can avoid wasted load cycles with bad data. Typically, we target a 90+ percent data quality index using our data management methodology. What that means is that we want to make sure that the extracted and transformed data is at a quality where you feel comfortable loading it into the target system, so that you’re not going to waste time in testing. Of course, we can’t discount the need for a reliable and scalable load environment, and by that I mean that the target system and the infrastructure it’s sitting on are very important. We had an example with a customer where a load was taking almost three times as long as it should, and by scaling up the CPU, the RAM, and the I/O on the infrastructure, we were able to reduce that time significantly and achieve the timelines we were looking for.
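To give a rough picture of what a pre-load validation and data quality index can look like, here is a minimal Python sketch. The individual checks and field names are assumptions for illustration only; Tango’s actual methodology and tooling go well beyond this, and the 90 percent threshold is just the example figure from the discussion.

```python
def validate_record(rec: dict) -> list[str]:
    """Return a list of rule violations for one transformed lease record."""
    errors = []
    if not rec.get("lease_id"):
        errors.append("missing lease_id")
    if rec.get("commencement_date") and rec.get("expiration_date"):
        if rec["commencement_date"] > rec["expiration_date"]:
            errors.append("commencement after expiration")
    if rec.get("monthly_payment", 0) < 0:
        errors.append("negative payment")
    return errors

def quality_index(records: list[dict]) -> float:
    """Share of records that pass every rule -- a simple data quality index."""
    clean = sum(1 for r in records if not validate_record(r))
    return clean / len(records) if records else 0.0

records = [
    {"lease_id": "L-001", "commencement_date": "2019-01-01",
     "expiration_date": "2023-12-31", "monthly_payment": 12000},
    {"lease_id": "", "commencement_date": "2020-06-01",
     "expiration_date": "2020-01-01", "monthly_payment": 8000},
]
print(quality_index(records))  # 0.5 -- below a 0.90 threshold, so hold the load and fix the source data
```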
Bart: Well that leads logically to the next data trap, which is playing data catch-up during the cutover period, when you’re moving from that legacy system into the new system. I know this is a high-risk time within the project. What can we do here to reduce the challenges and risks?
Aman: Well yeah, you’re right Bart. That period of time is rife with complexity and risk, and it’s the thing that really keeps people awake at night as we get near the end of a project. We call this time period ‘delta management’, which is basically the process of managing the data that changes during the cutover period, from the time you take your last extract to the time you go live with that load. You really need to think about this and plan it out to reduce and eliminate the risk of overwriting correct data with bad, stale data, which would compromise the lease accounting calculations and ultimately delay the compliance timelines, not to mention create a huge amount of manual work.
So to combat this, we utilize a standard set of tools and reload strategies to ensure a smooth cutover: we do delta tracking to make sure that we’re capturing the changes, we make sure all the people that are using this system and this data are involved in planning and coordination across departments, and we ensure that the legacy system is locked down so that no one’s making changes in there except for a select group of people. And you can go into different types of delta management processes, i.e. you can do another load right at the very end for those changes, or in some cases we’ve had customers say, ‘no, we have a very small volume, we’re going to make those changes right in the new system.’ So it really depends on the complexity of the data a client has and the volume of change.
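As a minimal sketch of what delta tracking between the last full extract and the cutover extract could look like, assume each lease record carries a stable key; that key, the field names, and the structure here are assumptions for illustration rather than any specific system’s format.

```python
def find_deltas(baseline: dict, cutover: dict) -> dict:
    """Compare two extracts keyed by lease_id and report adds, changes, and deletes."""
    added   = [k for k in cutover if k not in baseline]
    removed = [k for k in baseline if k not in cutover]
    changed = [k for k in cutover if k in baseline and cutover[k] != baseline[k]]
    return {"added": added, "removed": removed, "changed": changed}

# Baseline = last full extract; cutover = extract taken just before go-live.
baseline = {"L-001": {"monthly_payment": 12000}, "L-002": {"monthly_payment": 5000}}
cutover  = {"L-001": {"monthly_payment": 12500}, "L-003": {"monthly_payment": 7000}}

print(find_deltas(baseline, cutover))
# {'added': ['L-003'], 'removed': ['L-002'], 'changed': ['L-001']}
```

Depending on the volume, a delta set like this either drives one final incremental load or, as Aman notes, gets keyed in manually in the new system.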
Bart: That’s a perfect segue to the next portion of the podcast, where I’d like to unpack data management and have us talk about the criticality of developing an agreed-upon approach and strategy in this area. And as I said at the outset, we’ve developed some very hard-learned strategies, I know from talking to you, that we now use on all our engagements.
So why don’t you explain to our listeners what we mean by data management in this context?
Aman: Sure Bart. So data management in this context is really a process that covers four main areas. The first is source data extraction and mapping. The second is transformation. The third is data loading, and lastly, data validation; and they kind of build on each other.
So I’ll start with data extraction and mapping. As the name suggests, this is really the heavy lifting of taking the data out of the multiple legacy source systems, and with the new lease accounting standard, this could be dozens of systems, depending on the customer’s unique situation. And then taking that data and figuring out what you have, what you need in your new system, mapping fields one-to-one, and deciding on what extraction rule is needed. Do you need to change the format of the field, do you need to add something you don’t currently have, et cetera.
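A simple way to picture that field mapping is a table of source field, target field, and an optional transformation rule. The sketch below is only illustrative: the source and target field names are hypothetical, not a real schema from any particular lease system.

```python
from datetime import datetime

# One mapping entry per target field: (source field, optional transform rule).
FIELD_MAP = {
    "lease_id":        ("LEASE_NO", None),
    "monthly_payment": ("BASE_RENT", lambda v: round(float(v), 2)),
    "commencement":    ("START_DT", lambda v: datetime.strptime(v, "%m/%d/%Y").date()),
}

def map_record(source_row: dict) -> dict:
    """Apply the field mapping and extraction rules to one source row."""
    target = {}
    for target_field, (source_field, transform) in FIELD_MAP.items():
        value = source_row.get(source_field)
        target[target_field] = transform(value) if (transform and value is not None) else value
    return target

print(map_record({"LEASE_NO": "L-001", "BASE_RENT": "12000.004", "START_DT": "01/15/2019"}))
# {'lease_id': 'L-001', 'monthly_payment': 12000.0, 'commencement': datetime.date(2019, 1, 15)}
```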
A big part of this process is also the data quality analysis and cleanup, and we’ve developed several tools in this arena to help with the cleanup and analysis. A mistake in this stage could really submarine the entire compliance project. We’ve had examples with customers where, over the course of years, there may be leases in the same city but with the city spelled differently. So you run into small things like this which, on the face of it, seem like a small issue, but can really be a big problem as you go into a new system and you’re trying to map the data over. So that’s one of the things there.
The next part of data management is transformation. Often data needs to be normalized across the multiple source systems or transformed to match the design of the new target system. So here again, at Tango we’ve developed different tools to help stage that data, run it through transformation protocols, and perform all the necessary checks and balances to make sure that the data is being transformed into the quality and format that we need in the new system.
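Picking up the earlier example of the same city spelled several different ways across years of data, one normalization pass during transformation might look like the hypothetical sketch below; in practice the canonical map would be built from profiling the actual source data rather than hard-coded like this.

```python
# Hypothetical canonical map, built up during data quality profiling.
CITY_CANONICAL = {
    "new york": "New York",
    "new york city": "New York",
    "nyc": "New York",
    "st. louis": "St. Louis",
    "saint louis": "St. Louis",
}

def normalize_city(raw: str) -> str:
    """Map spelling variants to one canonical value; pass unknowns through in title case."""
    key = raw.strip().lower()
    return CITY_CANONICAL.get(key, raw.strip().title())

for value in ["NYC", "New York City", "saint louis"]:
    print(value, "->", normalize_city(value))
# NYC -> New York
# New York City -> New York
# saint louis -> St. Louis
```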
And then once that data is transformed, the next step is loading it into the new system. As I mentioned before, the volume of the data can really impact system performance, so it’s important to make sure that the system is being analyzed as you’re loading data, that you’re staggering data loads to minimize the impact on the system, and that you’re tweaking infrastructure; all these little things go into making sure that the system is handling the data. And again, we’ve developed some tools around this where we can load data 24/7, because we’ve got automated tools that monitor these things for us.
And then finally, the last stage is data validation, which is the process of ensuring that the data quality is there and is at a high level. And really there are four characteristics to quality data. One, is it accurate? Two, is it complete? Three, is it clean? And four, is it measurable? Can we actually assess the data and see, yes, this data is what we should have, and check off that we’ve reconciled it, so that if there was a million dollars of payments in the source system, there’s a million dollars of payments in the new system? If you don’t have all four of these characteristics, you won’t have quality data in your system.
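That million-dollar example translates almost directly into a reconciliation check: total the payments by lease in both systems and flag any mismatch. Here is a minimal sketch with hypothetical field names, not the actual validation tooling described above.

```python
from collections import defaultdict

def total_by_lease(payments):
    """Sum payment amounts per lease_id."""
    totals = defaultdict(float)
    for p in payments:
        totals[p["lease_id"]] += p["amount"]
    return totals

def reconcile(source_payments, target_payments, tolerance=0.01):
    """Return leases whose payment totals differ between source and target systems."""
    source, target = total_by_lease(source_payments), total_by_lease(target_payments)
    return {
        lease_id: (source.get(lease_id, 0.0), target.get(lease_id, 0.0))
        for lease_id in source.keys() | target.keys()
        if abs(source.get(lease_id, 0.0) - target.get(lease_id, 0.0)) > tolerance
    }

src = [{"lease_id": "L-001", "amount": 500_000.0}, {"lease_id": "L-001", "amount": 500_000.0}]
tgt = [{"lease_id": "L-001", "amount": 1_000_000.0}]
print(reconcile(src, tgt))  # {} -- the million dollars of payments reconciles
```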
Bart: Well it clearly sounds like you’ve got a well-oiled machine when it comes to data management, which I know is a must-have. So as we wrap up the podcast, I was wondering if you can leave our listeners with some best practices in addition to what we’ve talked about today.
Aman: I sure can, Bart, and before I do that, I just wanted to say that if there are any listeners or customers out there who’d like to have a more in-depth discussion with me or someone from my team, they can reach out. Like I said before, we’ve been in this space for a long time. We’ve built some proprietary tools and we’ve got a full portfolio of strategies that we can bring to the table.
But getting back to some best practices to wrap up with, I think the first one I would say is making sure that responsibilities are clearly articulated and agreed upon between the client and the vendors. Your partner will be responsible for certain activities, your IT department will be responsible for other tasks, and the business will have their own skin in the game as well. So your statement of work should clearly document all the responsibilities and assumptions, and if you are migrating from a legacy system which you don’t own or support, you’ll need to review the terms of your agreement. We’ve seen far too often that customers are at the mercy of a third party to get their data in a meaningful output format, and that third party doesn’t really have the incentive to help since they’ll be losing a customer.
The second thing I’d say is to identify data gaps early on. This should be one of the first priorities that you look at, and it could even start prior to implementation: making sure you’ve got all the data that you need from a lease administration and accounting standpoint, because you may need to engage an abstraction partner, as we talked about earlier, or create data that you don’t have today.
And along the lines of data management, I would say utilize a robust set of data quality tools, including data quality profiler tools, engage in review workshops with the business, and make sure you’re doing some sort of data quality indexing and automated validation scripting, so that the data is as accurate and high quality as possible. These steps will reduce the amount of wasted effort and the user fatigue that we too often see during the testing phase.
And then lastly, I’d say make sure you understand and plan for the dependencies between your data. A lot of times you’ll see that certain types of data are reliant on other data sets, so make sure that’s all preplanned and that you know when each set is needed, because all of that will affect the staging of the data and the timing of loads, et cetera. A well-planned approach pays major dividends; we’ve seen that every time we engage in a project like this.
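One way to make those dependencies explicit is to declare which data sets each load depends on and derive the load order from that. The sketch below uses Python’s standard-library topological sort with hypothetical data-set names; the actual dependencies will vary by system design.

```python
from graphlib import TopologicalSorter

# Hypothetical dependencies: each data set lists what must already be loaded before it.
DEPENDENCIES = {
    "locations": set(),
    "leases": {"locations"},
    "payment_schedules": {"leases"},
    "historical_payments": {"leases", "payment_schedules"},
    "documents": {"leases"},  # ancillary -- could also be deferred past go-live
}

# Derive a valid load order: everything a data set depends on is loaded first.
load_order = list(TopologicalSorter(DEPENDENCIES).static_order())
print(load_order)
# e.g. ['locations', 'leases', 'payment_schedules', 'documents', 'historical_payments']
```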
Bart: Yeah, well you can’t overstate that, and you’ve clearly given us and our listeners a lot to think through. Based on both the survey data that we discussed initially and our own experience helping clients down this road to compliance, it’s clear that data is the number one challenge and the highest-risk area that companies are facing. So be sure to check out Tango’s Road to Compliance webpage for additional tools and tricks of the trade, if you will, as it relates to data and the other stages of the overall road to compliance.
Well, I’d like to thank Aman Gill for spending time today sharing his experience and his lessons learned. I’m sure our listeners have found it to be very valuable, I know I did, so thanks Aman.