Unequal access to mortgage lending: discrimination in a transforming industry
For many, owning a house that they can call their own is a fundamental part of their personal life plan. It is often seen as the basis for starting a family and as a prerequisite for membership in a community, especially in rural areas. On the economic side, homeownership is by far the most important source of wealth accumulation for households, and oftentimes the only one [1]. These arguments only emphasize what should be clear from the outset: access to homeownership must be equitable and free of discrimination. Unfortunately, a large body of scientific evidence suggests that this is often not the case.
To keep the discussion within scope, we focus here on a single aspect that is nonetheless central to almost every house purchase: mortgage lending. Moreover, we examine racial and ethnic discrimination and set aside discrimination based on gender or sexual orientation, which is less well understood in the housing market context.
Depending on the country, it is common practice that 80% or more of the purchase price of a house is covered by a mortgage loan. In the Netherlands, the official limit on the so-called “loan-to-value” ratio is as high as 100%. Naturally, before extending loans of this size, banks carry out a comprehensive check of a borrower’s financial solvency, which typically covers income, wealth and employment. In many cases, however, banks have also illegally considered an applicant’s race or ethnicity in their decision: an influential study from the United States, conducted in the 1990s, reported that among applicants with similar financial characteristics, Black or Hispanic candidates were around 8 percentage points more likely to be rejected than White candidates [2]. In another study, carried out in the early 2000s, professional actors of different racial and ethnic backgrounds were sent to inquire about mortgages at various institutions in the pre-application stage. The results were, once again, striking: compared to Whites, minority “applicants” received less information about mortgage products, were given less coaching, and were less likely to receive a follow-up meeting [3].
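To make the ratio concrete, here is a minimal sketch of how a loan-to-value ratio is computed; the figures are hypothetical and chosen purely for illustration.

```python
def loan_to_value(loan_amount: float, property_value: float) -> float:
    """Return the loan-to-value (LTV) ratio as a percentage."""
    return 100 * loan_amount / property_value

# A 320,000 EUR mortgage on a 400,000 EUR house corresponds to an 80% LTV;
# under the Dutch 100% cap, the loan could cover the full purchase price.
print(loan_to_value(320_000, 400_000))  # 80.0
```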
Evidence of this kind should elicit political or legal responses. And indeed, some lending institutions have faced consequences for their discriminatory practices, as a $175 million settlement paid by the major US bank Wells Fargo shows [4]. However, from today’s perspective, interpreting the historical evidence presented above is problematic for a simple reason: the mortgage industry has been digitized over the last two decades. In the 20th century, a Black applicant ran the risk of being matched with a racist loan officer who rejected the application purely out of personal bias. In 2022, this loan officer is no longer employed. Instead, loan approval decisions rest on so-called credit scoring algorithms, which evaluate an applicant’s creditworthiness mechanically from financial input variables. In fact, some new providers even offer an online application process that involves no human interaction at all [5].
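To illustrate what such an algorithm does, here is a deliberately simplified sketch: a logistic regression that estimates default risk from a handful of financial inputs and approves the loan if the risk falls below a cutoff. The features, figures and threshold are assumptions made for illustration, not any lender’s actual model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: [income in kEUR, debt-to-income ratio, years employed]
X_train = np.array([
    [60, 0.25, 8],
    [35, 0.45, 2],
    [80, 0.20, 12],
    [28, 0.55, 1],
    [50, 0.35, 5],
    [42, 0.50, 3],
])
y_defaulted = np.array([0, 1, 0, 1, 0, 1])  # 1 = borrower defaulted

model = LogisticRegression().fit(X_train, y_defaulted)

# Score a new applicant and approve if the estimated default risk is below 20%.
applicant = np.array([[45, 0.40, 4]])
risk = model.predict_proba(applicant)[0, 1]
print(f"estimated default risk: {risk:.2f}")
print("approve" if risk < 0.20 else "reject")
```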
The question then becomes, of course, whether digitization puts an end to unequal access in the mortgage industry. Unfortunately, the answer might be no. In a recent study, researchers compared several credit scoring algorithms in how they evaluate applicants from different racial and ethnic groups. It turned out that the most technologically sophisticated algorithms were also the least favorable for minority applicants [6]. Interpreting these results is delicate, however, and reflects a dilemma that exists across industries: can decisions made by algorithms that have no access to racial or ethnic information nonetheless be labelled discriminatory? Banks would dispute any such accusation, arguing that the algorithms they employ simply propose decisions that are rational from a financial perspective because they minimize the likelihood of a loan default.
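The following simulation sketches why this question is so hard to answer. The scoring model below never sees group membership; it only sees income and a neighborhood variable. Yet because the neighborhood variable is correlated with group membership, approval rates end up differing sharply between the two groups. All data here are simulated, and the variables and parameters are assumptions chosen for illustration only.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
group = rng.integers(0, 2, n)  # 0 = majority, 1 = minority (never shown to the model)

# Simulated inputs: a neighborhood variable correlated with group membership
# (reflecting, e.g., historical segregation) and income.
neighborhood = rng.normal(loc=1.0 * group, scale=1.0)
income = rng.normal(loc=55 - 10 * group, scale=10)

# Simulated default outcomes driven only by the two financial inputs.
logit = -2.0 + 0.6 * neighborhood - 0.02 * (income - 50)
defaulted = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# The model is "race-blind": its inputs are income and neighborhood only.
X = np.column_stack([income, neighborhood])
model = LogisticRegression().fit(X, defaulted)
approved = model.predict_proba(X)[:, 1] < 0.15  # illustrative risk cutoff

for g in (0, 1):
    print(f"group {g}: approval rate {approved[group == g].mean():.2f}")
```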
The increasing complexity of digital technologies in the mortgage industry has thus also made detecting discrimination more complex. A clear-cut solution to this issue is elusive. By way of conclusion, however, two measures seem particularly relevant. First, policymakers need to devote resources to understanding the risks of credit scoring algorithms and to designing sensible anti-discrimination legislation for the digital era. The European Union took a step in this direction within the framework of the General Data Protection Regulation (GDPR), although its effectiveness with respect to anti-discrimination has been contested [7]. Second, banks and other lending institutions must be obliged to cooperate in preventing discriminatory outcomes. This can be achieved by disclosing their credit scoring algorithms or by conducting so-called algorithmic audits, in which the behavior of an algorithm is thoroughly tested, especially with respect to its treatment of protected groups.
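As an illustration of what one step of such an audit might check, the sketch below compares approval rates produced by a scoring algorithm across two groups and flags a large disparity. The data, the group labels and the 80% (“four-fifths”) threshold are assumptions for illustration; real audits rely on richer metrics and actual application data.

```python
import pandas as pd

# Hypothetical audit sample: each row is an application with its group label
# and the decision produced by the credit scoring algorithm under review.
audit_sample = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "A", "B", "B", "B", "B", "B"],
    "approved": [1,   1,   0,   1,   1,   1,   0,   0,   1,   0],
})

rates = audit_sample.groupby("group")["approved"].mean()
impact_ratio = rates.min() / rates.max()

print(rates)
print(f"disparate impact ratio: {impact_ratio:.2f}")
if impact_ratio < 0.8:
    print("flag: approval-rate disparity exceeds the four-fifths rule of thumb")
```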
Discrimination in lending has proven to be a complex and evolving issue. While there is hope that some of its most blatant forms have been eradicated, more subtle forms may still persist. It is clear, however, that securing equitable access to the dream of homeownership is worth every effort.
References:
[1] https://www.ecb.europa.eu/pub/...
[2] Munnell, A. H., Tootell, G. M., Browne, L. E., & McEneaney, J. (1996). Mortgage lending in Boston: Interpreting HMDA data. The American Economic Review, 86(1), 25-53.
[3] Ross, S. L., Turner, M. A., Godfrey, E., & Smith, R. R. (2008). Mortgage lending in Chicago and Los Angeles: A paired testing study of the pre-application process. Journal of Urban Economics, 63(3), 902-919.
[4] https://www.reuters.com/articl...
[5] https://pressroom.rocketmortga...
[6] Fuster, A., Goldsmith‐Pinkham, P., Ramadorai, T., & Walther, A. (2022). Predictably unequal? The effects of machine learning on credit markets. The Journal of Finance, 77(1), 5-47.