
It all comes down to trust in data

Here’s one for you: How many fintech companies does it take to reduce the cost to originate a loan? There’s no great punchline to throw in here — only the sobering fact that it cost a record high of $13,171 to originate a loan in the first quarter of 2023, according to Fannie Mae’s recent lender sentiment survey.

Still, the FHFA set out to answer this question by gathering more than 60 companies in Washington, D.C. for the inaugural Velocity TechSprint in July. This hackathon of sorts had a single goal: determine the best way to use data effectively to reduce loan costs and make lending more attainable and fair for consumers.

I was one of 80 participants assembled into 10 teams. Each team contained a cross section of lenders, fintechs, consultants and others united by a common goal. We had three days to engineer a viable solution to some of the biggest challenges in lending — oh, and boil down those three days of solutioning into a five-minute pitch in front of industry judges. No pressure there.

This was very much an exercise for optimists, innovators and maybe even dreamers. Loan costs have risen every quarter since the first quarter of 2020. So what could really be done in three days that hasn't been tried in three years? We have watched investment in technology solutions rise and fall at record levels over the past few years amid record loan volumes; many of those solutions promised to automate a better borrower experience and deliver shorter closing times. But the stubborn fact remains that transformative change has yet to materialize.

Still, that evidence made the idea of locking arms with industry leaders and working with competitors even more compelling. People arrived ready to ideate, compete and cooperate.

A few things became apparent within our team on the first day of working together: we were ready to attack every aspect of the loan lifecycle to make it better. From consumer financial readiness to loan servicing years after close, everything was on the table. Our collective knowledge of how the whole process fits together was exciting. To make a big impact, we had to have a big solution, right?

It wasn't until the second day that the truth finally became clear: We only had time to flesh out and describe one good idea, not the more than five we had packaged into one big solution. As the team debated and consumed an intense amount of coffee, we realized that every idea we wanted to build upon was missing one key component the industry has yet to solve: data trust.

Lenders and other stakeholders spend countless hours checking, verifying, rechecking and reverifying the same data. That data usually comes in the form of a document, which gets sent around to various stakeholders, sometimes with accompanying structured data, sometimes not. The data gets re-extracted over and over again. There's an industry-wide inability to easily tell whether the data or document has changed since it was last checked, or whether it came from the original, direct source.

The mortgage industry loves the phrase "checking the checker," because that is common practice even when GSE automated underwriting systems are fully in use. Our team set out to solve this data trust issue and give lenders a way to check an authoritative source to verify whether data has changed since the last delivery, instead of having to reverify all of it from scratch.
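The mechanics of that kind of check are not exotic. Here is a minimal sketch, purely for illustration and not anything a sprint team actually built, of how a lender could compare a cryptographic fingerprint of newly delivered data against the fingerprint recorded at the last verification; the field names and values are hypothetical:

```python
# Illustrative only: detect whether delivered loan data has changed since the
# last verification by comparing fingerprints, so only changed data needs to
# be reverified against the authoritative source.
import hashlib
import json


def fingerprint(record: dict) -> str:
    """Return a stable SHA-256 fingerprint of a data record."""
    # Serialize with sorted keys so the same data always hashes the same way.
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


# Fingerprint stored when the data was last verified (hypothetical record).
last_verified = fingerprint({"borrower": "Jane Doe", "monthly_income": 8500})

# Newly delivered copy of the same record.
incoming = {"borrower": "Jane Doe", "monthly_income": 8500}

if fingerprint(incoming) == last_verified:
    print("Data unchanged since last verification -- no need to reverify.")
else:
    print("Data changed -- reverify against the authoritative source.")
```

The hard part, of course, is not the hashing; it is agreeing on who holds the authoritative fingerprint, which is exactly the standardization gap the sprint kept circling back to.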

It turns out we were not the only team that arrived at this conclusion — at least half of the pitches featured some aspect of data trust. Whether the focus was on providing better ways for consumers to control and securely share their own financial data, or on enabling lenders to more efficiently consume new alternative sources of credit data, data trust was a central theme.

There were some strong cases made for using blockchain and NFTs to provide a tokenized way of securely sharing and trusting consumer data, but in the end it wasn't the lack of technology that was identified as the biggest speed bump; it was the lack of standardization and a central authority.

Which leads us to one of the most surprising themes of the week: fintechs asking for government involvement. There seemed to be a common realization that healthy cooperation between the public and private sectors was needed to create a major change to the status quo.

Yes, I realize that the event was hosted by the FHFA, so maybe this isn't surprising. The cold hard truth is this: The need for centralized data trust in an ecosystem as complex and regulated as the mortgage industry is beyond what any one innovator can bring about quickly. Some sprint teams proposed cooperatives of public and private organizations, while others asked for direct government agency involvement and mandates.

I was reminded of the recent plea from generative AI companies asking for government regulation. Interestingly enough, reducing the risk of bias in generative AI models comes down to data trust as well, so it appears we are onto something here.

I came away from the event encouraged once again in the mortgage industry’s willingness to work together to solve big problems, but the real test is what happens next.
