High-Risk Enforcement Areas for Financial Services Providers: AI, Lending, and Privacy

Andrea Perez | July 30, 2025

Did you know that U.S. federal regulators issued approximately 173 public enforcement actions against financial services providers in 2024? Over 35% of those actions resulted in some form of monetary penalty, restitution, or forfeiture, ranging from less than $10,000 to as much as $450 million. Although the regulatory landscape is shifting and some pending rules remain uncertain, regulatory agencies continue to carry out their responsibilities of ensuring compliance with existing laws and protecting consumers. The 44 enforcement actions issued from the start of 2025 through May are a clear reminder that while compliance challenges are constant and ever changing, federal regulators continue to reaffirm their enforcement focus.

Continued regulatory enforcement activity reflects heightened scrutiny of specific risk areas where evolving technologies, shifting customer expectations, and aggressive business practices intersect. Financial institutions that fail to adapt quickly enough may find themselves at the center of these enforcement actions. Among the most common and pressing issues driving recent regulatory action are three high-risk areas that demand focused compliance attention:

  1. Artificial intelligence (AI)-driven lending discrimination
  2. Predatory small-dollar lending
  3. Data privacy violations in digital financial services

For each high-risk area, we will cover the key concern and present best practices and recommendations for consideration within your financial institution’s compliance programs.

1. AI-Driven Lending Discrimination: Is It Preventable?

AI-driven lending discrimination occurs when automated systems, such as machine learning models used in credit scoring or loan underwriting, make decisions that result in unfair treatment of certain groups, often without clear or intentional bias from the lender. When traditional lenders partner with fintechs, complex and frequently proprietary algorithms can leave the lender in a vulnerable position because it cannot fully explain the rationale behind credit denials.

In a report issued by the United States Government Accountability Office, federal financial regulators emphasized that existing laws and regulations apply to financial services, regardless of whether those services involve AI. Lenders using artificial intelligence or complex algorithms must still comply with the Equal Credit Opportunity Act (ECOA) and its implementing Regulation B, which require lenders to:

  • Avoid discrimination based on race, gender, age, national origin, or other protected characteristics.
  • Provide specific and understandable reasons for adverse actions (e.g., loan denials).

Adverse action notices, specifically notices regarding denial of a credit application, must include specific and accurate reasons for the denial, even when decisions are made by complex algorithms that are not easily interpretable by humans. Since 2019, federal financial regulators have reviewed several financial institutions' use of AI and found that institutions often provided limited information about their efforts to assess bias and address fair lending concerns related to AI and machine learning.

Key Concern: Consumers may be denied credit based on data points they do not understand or control (e.g., social media behavior, location data), without clear explanations. A 2023 Consumer Financial Protection Bureau (CFPB) study found that over 60% of AI-based credit decisions lacked explainable reasoning when reviewed by compliance teams. Algorithms trained on biased data can amplify existing inequalities, even if the model itself is technically neutral. This can happen because complex lending models can include historical lending data that reflects past discrimination, and proxy variables such as zip codes or education levels that correlate to race or income.

Recommended Practices:

  • Conduct fairness and bias audits on AI models and regularly test models for disparate impact across demographic groups.
  • Use diverse and representative data sets. Maintain documentation of model inputs, logic, and decision pathways.
  • Use explainable AI (XAI) frameworks to ensure transparency.
  • Train staff to interpret and communicate AI-driven decisions clearly; include a manual review for high-risk decisions.
  • Ensure consumers receive clear, specific reasons for credit denials.
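One common screen used in disparate-impact testing is the "four-fifths" (80%) rule: each group's approval rate should be at least 80% of the highest group's rate. The sketch below is a minimal, hypothetical illustration of that audit step; the function name, data shape, and 0.8 threshold are assumptions for illustration, not a regulatory formula.

```python
from collections import defaultdict

def adverse_impact_ratios(decisions, threshold=0.8):
    """decisions: iterable of (group, approved) pairs.
    Returns each group's approval rate relative to the best-performing
    group, plus the groups falling below the four-fifths screen."""
    totals, approvals = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        approvals[group] += int(approved)

    rates = {g: approvals[g] / totals[g] for g in totals}
    best = max(rates.values())
    ratios = {g: rates[g] / best for g in rates}
    flagged = {g: r for g, r in ratios.items() if r < threshold}
    return ratios, flagged

# Hypothetical audit sample: group A approved 80%, group B approved 55%.
decisions = ([("A", True)] * 80 + [("A", False)] * 20
             + [("B", True)] * 55 + [("B", False)] * 45)
ratios, flagged = adverse_impact_ratios(decisions)
print(ratios)   # group B's ratio is 0.55 / 0.80 = 0.6875
print(flagged)  # group B falls below the 0.8 screen and warrants review
```

A ratio below the screen is not proof of discrimination, but it is the kind of documented, repeatable test that supports the fairness audits recommended above.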

2. Predatory Small-Dollar Lending

Small-dollar loans are typically short-term loans of less than $1,000, often marketed as payday loans, auto title loans, or installment loans. While these products can provide quick access to cash, and have become a popular credit-building product, predatory versions of these loans trap consumers in cycles of debt. The CFPB continues to monitor small-dollar lending practices, especially those involving:

  • Triple-digit APRs (e.g., 300%+ annual interest).
  • Automatic rollovers or refinancing without affordability checks.
  • Aggressive collection tactics.

Key Concern: These practices target low-income consumers. In 2024, the CFPB received over 12,000 complaints related to small-dollar loans; 35% of them involved unclear repayment terms or unexpected fees. In 2025, the CFPB reaffirmed its focus on protecting vulnerable consumers from exploitative lending practices, especially in underserved communities.

Recommended Practices:

  • Cap APRs and clearly disclose total repayment costs; be aware of state usury laws.
  • Avoid automatic rollovers or require explicit consumer consent.
  • Provide hardship options and transparent repayment plans.
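To see why disclosing total repayment cost matters, consider how quickly a modest flat fee on a short-term loan annualizes into a triple-digit rate. The sketch below uses the simple fee-based annualization often cited for single-payment payday loans; the loan terms shown are hypothetical examples, not actual product figures.

```python
def payday_loan_disclosure(principal, fee, term_days):
    """Compute the two figures a clear disclosure should state:
    the total amount due and the fee annualized as a rate."""
    total_repayment = principal + fee
    # Annualize the fee as a share of principal over the loan term.
    apr = (fee / principal) * (365 / term_days)
    return total_repayment, apr

# Hypothetical example: $500 borrowed for 14 days with a $75 fee.
total, apr = payday_loan_disclosure(principal=500.0, fee=75.0, term_days=14)
print(f"Total due: ${total:.2f}")  # Total due: $575.00
print(f"APR: {apr:.0%}")           # a 15% fee over 14 days annualizes to ~391%
```

A $75 fee may look small next to a $500 advance, which is precisely why regulators expect the annualized rate and total repayment cost to be stated plainly.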

3. Data Privacy in Digital Financial Services

Data privacy in digital financial services refers to the collection, use, sharing, and protection of consumers’ personal and financial data by fintechs, banks, neobanks, digital wallets, and embedded finance platforms. As financial services become more digitized, the volume and sensitivity of data being handled has increased dramatically. With the rise of digital wallets, neobanks, and embedded finance, the CFPB is cracking down on:

  • Unauthorized data sharing.
  • Inadequate consumer consent.
  • Poor data security practices.

Key Concern: These practices can violate the Dodd-Frank Act’s UDAAP provisions and may also conflict with state privacy laws like the California Consumer Privacy Act (CCPA) or Virginia Consumer Data Protection Act (VCDPA). A 2024 CFPB audit found that 1 in 4 fintech apps failed to obtain proper consent before collecting sensitive financial data, and over 9,000 complaints were filed in 2024 related to digital financial services and data misuse.

Recommended Practices:

  • Implement clear, opt-in consent mechanisms.
  • Regularly audit third-party data sharing agreements.
  • Encrypt all sensitive data and conduct penetration testing.
  • Allow users to view, download, and delete their data easily.
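The core of an opt-in consent mechanism is the default: absent an explicit, recorded, unrevoked opt-in, data sharing is denied. Below is a minimal in-memory sketch of that deny-by-default pattern; the class name, fields, and purposes are illustrative assumptions, and a production system would need durable, auditable storage.

```python
from datetime import datetime, timezone

class ConsentLedger:
    """Hypothetical opt-in consent store: sharing is denied unless an
    affirmative, timestamped opt-in exists and has not been revoked."""

    def __init__(self):
        self._records = {}  # (user_id, purpose) -> consent record

    def record_opt_in(self, user_id, purpose):
        self._records[(user_id, purpose)] = {
            "granted": True,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }

    def revoke(self, user_id, purpose):
        self._records.pop((user_id, purpose), None)

    def may_share(self, user_id, purpose):
        # Deny by default: no record means no sharing.
        return self._records.get((user_id, purpose), {}).get("granted", False)

ledger = ConsentLedger()
print(ledger.may_share("u1", "marketing"))  # False: nothing recorded yet
ledger.record_opt_in("u1", "marketing")
print(ledger.may_share("u1", "marketing"))  # True: explicit opt-in on file
ledger.revoke("u1", "marketing")
print(ledger.may_share("u1", "marketing"))  # False again after revocation
```

Keying consent by purpose, not just by user, is what lets the same ledger back the audit of third-party sharing agreements recommended above.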

The Key Takeaway

Modern financial practices must align with long-standing consumer protection principles. Our Guidepost team regularly works with financial institutions navigating the intersection of innovation and regulatory compliance. From these engagements, we have seen firsthand how enforcement actions often arise from partnerships with fintechs and digital lenders, particularly when oversight of emerging technologies or complex product structures is insufficient. We have helped financial institutions prioritize fairness, enhance transparency, and proactively audit their practices.

Whether your financial institution is just beginning to modernize its product and service offerings or has already taken the leap, keeping pace with modernization, growth, and profitability can become overwhelming and difficult to navigate. Let our regulatory compliance experts guide you.

Andrea Perez

Associate Director

Andrea Perez has more than 15 years of experience as a senior regulatory examiner analyzing the effectiveness of compliance programs within state-chartered financial institutions. She is highly proficient in state and federal compliance banking regulations; collaborating with federal agencies; and providing detailed written analysis of compliance-oriented evaluations. Ms. Perez is well-versed in lending regulations including the Homeowners Protection Act, the Truth in Lending Act, and the Community Reinvestment Act; as well as deposit and retail regulations; privacy laws; and Dodd-Frank.
