By: Cyrus Mostaghim

DISCLAIMER: While the author is an employee of the Consumer Financial Protection Bureau (“CFPB”), this article’s contents reflect the author’s thoughts as a private citizen, not as an employee or representative of the CFPB.  The author wrote this article using only publicly available information and without the use of any CFPB resources.  The article’s contents should not be interpreted to be associated with the CFPB in any manner or construed in any way to be a representation or statement from the CFPB.

Recently, the Student Borrower Protection Center (“SBPC”) released a damning report claiming that the FinTech company Upstart’s Artificial Intelligence (“AI”) engaged in discrimination through its use of non-traditional lending data.[1]  In the report, SBPC asserted that Upstart’s AI quoted higher interest rates to hypothetical individuals who attended Historically Black Colleges and Universities and other minority-serving institutions than to otherwise identical individuals from a school with a smaller minority population, such as NYU.[2]  In reality, SBPC staff posed as these hypothetical individuals, submitting online applications to collect the data for SBPC’s report.[3]

Upstart was the first entity to receive a No-Action Letter (“NAL”), in 2017, from the Consumer Financial Protection Bureau (“CFPB”).[4]  Under this arrangement, Upstart regularly reported lending and compliance information to the CFPB, allowing both parties to better understand how Upstart’s AI works in the real world.[5]  Additionally, the CFPB stated that it did not intend to initiate any supervisory or enforcement action against Upstart for violations of the Equal Credit Opportunity Act.[6]  While Upstart and the CFPB dispute SBPC’s claims and its methods of obtaining data, the claims are noteworthy because they capture the financial industry’s broader concerns about FinTech and its regulation.[7]  These concerns include the stifling effect of regulation on innovation, the cost-benefit tradeoff of using AI to innovate financial services, and the risk that using non-traditional data leads to proxy discrimination.[8]

The business and financial worlds are inextricably intertwined.[9]  Businesses function atop the financial system, with both depository and non-depository financial institutions facilitating access to capital and credit through a system that maximizes investment returns.[10]  Unlike ordinary corporations, financial institutions are subject to additional, more restrictive regulations.[11]  Regulations unique to finance generally focus either on ensuring the financial system’s safety and soundness or on fair lending and consumer protection.[12]  Sometimes these regulations create barriers to entry and hinder innovation because institutions fear potential penalties from regulators.[13]  The FinTech industry is an example of this dynamic.

AI in finance started with machine-learning algorithms that made basic rule-based decisions and created efficiencies in the lending process for both consumer and commercial products.[14]  Early examples include determining a loan qualification amount and an interest-rate range using the same data points and procedures a human underwriter would.[15]  Now, AI can find new and better correlations among the traditional data points behind a decision.[16]  AI is even using non-traditional data, as Upstart’s AI does, to make inferences and decisions about consumer and commercial financial services.[17]  The hope is that AI will increase access to finance, thus contributing to economic growth.[18]  However, training on datasets of human decisions is problematic because AI is only as good as the data used to teach the algorithm.[19]  Additionally, AI can draw improper correlations from data, as when Facebook’s AI moderator removed legitimate posts about the coronavirus, and thereby discriminate unintentionally.[20]  Even worse, discrimination can happen if an engineer’s unconscious biases exist in the AI’s coding, if discrimination is intentionally built into the programming, or if the coding is hacked to discriminate.[21]  Proxy discrimination occurs when a decision disadvantages a protected class based on facially neutral factors that correlate with protected statuses, and, in financial regulation, the institution can be liable for the resulting damages.[22]  This liability exists because the sovereign-responsibility theory holds an algorithm’s user accountable for the legal liability the algorithm creates.[23]
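The mechanics of proxy discrimination can be made concrete with a toy sketch.  The rule below never consults a protected status, yet still quotes one group higher average rates because its single “neutral” input correlates with group membership.  Every name and number here is invented for illustration; no resemblance to Upstart’s actual model or SBPC’s data is implied.

```python
# Hypothetical sketch of proxy discrimination: a pricing rule that never
# sees a protected status can still penalize it through a correlated input.
# All data and rates below are invented for illustration.

applicants = [
    # (school_cohort_default_rate, group) -- fabricated data in which the
    # "neutral" cohort statistic is strongly correlated with group membership
    (0.08, "A"), (0.09, "A"), (0.07, "A"),
    (0.03, "B"), (0.02, "B"), (0.04, "B"),
]

BASE_RATE = 0.06  # hypothetical base interest rate


def quoted_rate(cohort_default_rate: float) -> float:
    """Facially neutral rule: price off the school cohort statistic only."""
    return BASE_RATE + cohort_default_rate  # group label never consulted


def avg_rate(group: str) -> float:
    """Average quoted rate for a group, computed after the fact."""
    rates = [quoted_rate(x) for x, g in applicants if g == group]
    return sum(rates) / len(rates)


# Despite never using group labels, the rule quotes group A a higher
# average rate than group B, because the input proxies for the group.
print(round(avg_rate("A"), 4), round(avg_rate("B"), 4))
```

The point of the sketch is that removing the protected attribute from the inputs does not remove the disparity; that is why the article's sources treat auditing outcomes, not just inputs, as essential.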

The CFPB created its NAL policy, which it pioneered with Upstart, and a regulatory sandbox to address the financial industry’s concerns about liability from innovation.[24]  The regulatory sandbox appears to have evolved from the NAL policy.[25]  While sandboxes can be constructed in different ways to encourage innovation, the CFPB’s sandbox provides participating financial institutions certain safe-harbor protections from statutory violations during the testing period, which mitigates fears of liability costs, and requires specific data reporting.[26]  The reported data helps the CFPB understand how the AI works, since regulators cannot audit an AI’s programming.[27]

Programs like the CFPB’s sandbox are necessary to mitigate industry concerns about liability in developing AI, so long as the sandbox serves a specific goal beyond merely encouraging innovation.[28]  The U.K.’s sandbox structure, adopted in 2016, is credited as a significant factor in London becoming the preeminent FinTech hub in the world.[29]  However, as of 2017, some U.S. federal regulators indicated that the U.S. was unlikely to adopt a structure similar to the U.K.’s sandbox.[30]  If U.S. officials have not changed their stance, they may want to reconsider, or at least extract best practices from the U.K.’s model to incorporate into the various U.S. sandboxes.[31]  Additionally, a regulatory sandbox should not interfere with consumer protection or financial stability.[32]  At minimum, the sandbox should require remediation of the AI’s direct effects on an impacted individual, e.g., reduction of the interest rate, crediting of the overpayment toward the principal, and recalculation of the remaining balance.[33]
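The remediation steps just listed can be sketched as simple arithmetic: reprice the interest already paid at the corrected rate, credit the overpayment to principal, and recompute the balance.  This is a minimal simple-interest approximation with invented figures; real consumer loans amortize, so an actual remediation would rerun the amortization schedule from exact payment records.

```python
# Hypothetical remediation sketch for an impacted borrower. Simple-interest
# math and all figures are invented for illustration only; real loans
# amortize and require the actual payment history.

def remediate(principal: float, charged_rate: float, fair_rate: float,
              years_elapsed: float) -> dict:
    """Reprice past interest, credit the overpayment, recompute the balance."""
    # Interest overpaid because the AI quoted too high a rate.
    overpaid_interest = principal * (charged_rate - fair_rate) * years_elapsed
    # Credit the overpayment toward the outstanding principal.
    new_balance = principal - overpaid_interest
    return {
        "overpaid_interest": round(overpaid_interest, 2),
        "remaining_principal": round(new_balance, 2),
        "go_forward_rate": fair_rate,  # reduced rate applies going forward
    }


# Example: a $10,000 balance charged 12% for 2 years when 9% was fair.
result = remediate(10_000, 0.12, 0.09, 2)
print(result)
```

Even this toy version shows why the article insists on all three steps together: cutting the rate going forward without crediting past overpayments would leave the borrower still harmed.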

Without sandboxes to mitigate the fear of legal liability from enforcement actions, civil penalties, or private claims where a private right of action exists, that fear will stifle innovation.[34]  The fear is justified because the cost of damages from a violation can bankrupt a startup.[35]  Additionally, without sandboxes, businesses and regulators lose a tool that helps them learn how AI functions in reality, and consumers and businesses have less access to finance, which in turn impacts the economy.[36]


[1] Student Borrower Protection Ctr., Educational Redlining 15-19 (Feb. 2020), https://protectborrowers.org/wp-content/uploads/2020/02/Education-Redlining-Report.pdf [hereinafter “Upstart Report”]. 

[2] Id. 

[3] Lender Disputes Accusations of Discrimination Based on College, Am. Banker Ass’n (Feb. 9, 2020), https://asreport.americanbanker.com/articles/lender-disputes-accusations-of-discrimination-based-on-college [hereinafter “Upstart Claim Dispute”].

[4] CFPB Announces First No-Action Letter to Upstart Network, Consumer Fin. Protection Bureau: Newsroom (Sept. 14, 2017), https://www.consumerfinance.gov/about-us/newsroom/cfpb-announces-first-no-action-letter-upstart-network/ [hereinafter “Upstart NAL Announcement”].

[5] Id. 

[6] Id. 

[7] Upstart Claim Dispute, supra note 3.  See Upstart Report, supra note 1 (demonstrating proxy discrimination via claims); Hilary J. Allen, Driverless Finance, 10 Harv. Bus. L. Rev. (forthcoming 2020) (manuscript at 13), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3366016 (discussing business efficiencies of AI algorithms) [hereinafter “Allen Driverless Finance”]; Hilary J. Allen, Regulatory Sandboxes, 87 Geo. Wash. L. Rev. 579 (2019) (discussing FinTech’s role in the financial industry and cautioning against promoting innovation via financial regulation only for innovation’s sake) [hereinafter “Allen Regulatory Sandboxes”]; Frank Pasquale, Data-Informed Duties in AI Development, 119 Colum. L. Rev. 1917, 1925 (2019) (discussing development and advancement of AI algorithms); Anya Prince & Daniel Schwarcz, Proxy Discrimination in the Age of Artificial Intelligence, 105 Iowa L. Rev. (forthcoming 2020) (manuscript at 12, 18-19, 29), https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3347959 (defining proxy discrimination as discrimination against a protected class based on factors that are not protected statuses, and providing AI’s history, evolution, general business impact, and risks) [hereinafter “AI Proxy Discrimination”]. 

[8] See Upstart Report, supra note 1 (demonstrating proxy discrimination via claims); Allen Driverless Finance, supra note 7; Allen Regulatory Sandboxes, supra note 7; Pasquale, supra note 7; AI Proxy Discrimination, supra note 7.

[9] See John Armour et al., Principles of Financial Regulation 24-26 (1st ed. 2016) (discussing multiple roles financial institutions play in business and the economy). 

[10] Id. 

[11] See Hilary J. Allen, Financial Stability Regulations as Indirect Investor/Consumer Protection Regulation:  Implications for Regulatory Mandates and Structures, 90 Tul. L. Rev. 113 (2016) (arguing that the purpose of financial stability regulation is safety and soundness of the financial system). 

[12] See id.  See generally Real Estate Settlement Procedures Act of 1974 (RESPA), 12 U.S.C. §§ 2601-2617; Home Mortgage Disclosure Act of 1975 (HMDA), 12 U.S.C. §§ 2801-2810; Community Reinvestment Act (CRA), 12 U.S.C. § 2901; Truth in Lending Act of 1968 (TILA), 15 U.S.C. § 1601; Fair Credit Reporting Act (FCRA), 15 U.S.C. § 1681; Home Ownership and Equity Protection Act of 1994 (HOEPA), 15 U.S.C. § 1602; Equal Credit Opportunity Act (ECOA), 15 U.S.C. § 1691; Fair Housing Act (FHA), 42 U.S.C. §§ 3601-3631. 

[13] See Allen Regulatory Sandboxes, supra note 7 at 579 (discussing FinTech’s role in financial industry); Hilary J. Allen, Sandbox Boundaries, 22 Vand. J. Ent. & Tech. L. (forthcoming 2020) (manuscript at 4), https://ssrn.com/abstract=3409847 (discussing creation of sandboxes to address FinTech’s concerns and promote innovation) [hereinafter “Allen Sandbox Boundaries”]. 

[14] See Allen Driverless Finance, supra note 7; AI Proxy Discrimination, supra note 7 at 18-19 (providing history of AI algorithms, evolution, general business impact, and risks). 

[15] See id.

[16]  See Upstart Report, supra note 1; Allen Driverless Finance, supra note 7; Jack M. Balkin, 2016 Sidley Austin Distinguished Lecture on Big Data Law and Policy:  The Three Laws of Robotics in the Age of Big Data, 78 Ohio St. L.J. 1217, 1239 (2017) (expressing concerns about AI and its impact on society); AI Proxy Discrimination, supra note 7. 

[17] See id.

[18] See id.

[19] Chris DeBrusk, The Risk of Machine-Learning Bias (and How to Prevent It), MIT Sloan Mgmt. Rev. (Mar. 26, 2018), https://sloanreview.mit.edu/article/the-risk-of-machine-learning-bias-and-how-to-prevent-it/ (stating risk of building bias into AI algorithms).  See Allen Driverless Finance, supra note 7. 

[20] See Paresh Dave, Social Media Giants Warn of AI Moderation Errors as Coronavirus Empties Offices, Reuters (Mar. 16, 2020), https://www.reuters.com/article/us-health-coronavirus-google-idUSKBN2133BM (demonstrating outcome when AI makes improper correlations). 

[21] See Danielle Keats Citron & Frank Pasquale, The Scored Society: Due Process for Automated Predictions, 89 Wash. L. Rev. 1, 13-14 (2014) (discussing how AI can hide discrimination and how biases can exist in coding) [hereinafter “AI Due Process”]; Chris DeBrusk, The Risk of Machine-Learning Bias (and How to Prevent It), MIT Sloan Mgmt. Rev. (Mar. 26, 2018), https://sloanreview.mit.edu/article/the-risk-of-machine-learning-bias-and-how-to-prevent-it/ (stating risk of building bias into AI algorithms).

[22] See AI Due Process, supra note 21 at 19 (defining sovereign responsibility/accountability); AI Proxy Discrimination, supra note 7 at 12, 29 (defining proxy discrimination and how it happens).

[23] Id. 

[24] See CFPB Finalizes Policy to Facilitate Consumer-Friendly Innovation, Consumer Fin. Protection Bureau: Newsroom (Feb. 18, 2016), https://www.consumerfinance.gov/about-us/newsroom/cfpb-finalizes-policy-to-facilitate-consumer-friendly-innovation/ (announcing original NAL policy); CFPB Issues Policies to Facilitate Compliance and Promote Innovation, Consumer Fin. Protection Bureau: Newsroom (Sept. 10, 2019), https://www.consumerfinance.gov/about-us/newsroom/bureau-issues-policies-facilitate-compliance-promote-innovation/ (announcing updated NAL policy and final sandbox policy) [hereinafter “CFPB Policy Announcements”]; Upstart NAL Announcement, supra note 4. 

[25] See id.

[26] Bureau of Consumer Financial Protection, Docket No. CFPB-2018-0042, Policy on the Compliance Assistance Sandbox 4 (2019), https://files.consumerfinance.gov/f/documents/cfpb_final-policy-on-cas.pdf [hereinafter “CFPB Sandbox Policy”]; see Allen Sandbox Boundaries, supra note 13; CFPB Policy Announcements, supra note 24. 

[27] CFPB Sandbox Policy, supra note 26.  See Allen Sandbox Boundaries, supra note 13; AI Due Process, supra note 21 at 10 (stating inability to audit AI); CFPB Policy Announcements, supra note 24.

[28] See Allen Regulatory Sandboxes, supra note 7 at 580-81 (providing benefits of regulatory sandboxes, distinguishing them from other regulatory exemptions, postulating a broader purpose for sandboxes, and noting that the U.K.’s sandbox is credited with making London the world’s foremost FinTech hub). 

[29] See id.

[30] Michael J. Bologna, Fed Official Dismisses ‘Regulatory Sandboxes’ for Fintech, Bloomberg Law: Banking Law News, (Sept. 18, 2017), https://news.bloomberglaw.com/banking-law/fed-official-dismisses-regulatory-sandboxes-for-fintech.

[31] See Allen Regulatory Sandboxes, supra note 7 at 580-81 (providing benefits of regulatory sandboxes, distinguishing them from other regulatory exemptions, postulating a broader purpose for sandboxes, and noting that the U.K.’s sandbox is credited with making London the world’s foremost FinTech hub).

[32] See id. 

[33] See id.

[34] See id.  See also Allen Sandbox Boundaries, supra note 13; AI Due Process, supra note 21 at 10 (stating inability to audit AI); AI Proxy Discrimination, supra note 7; Balkin, supra note 16.

[35] See id.

[36] See id.
