CFPB Cracks Down on Chatbots

The CFPB recently issued an “issue spotlight” addressing concerns with the use of chatbots. It has reportedly received “numerous” complaints from individuals who have tried to interact with this form of artificial intelligence when seeking answers from their financial institutions. Some chatbots have names; some try to respond using machine learning; and some dictate how a user may proceed by limiting the user’s options. Recently, even more advanced “generative chatbots” have been used for customer service needs. The CFPB notes:

Financial institutions are increasingly using chatbots as a cost-effective alternative to human customer service.

Chatbots may be useful for resolving basic inquiries, but their effectiveness wanes as problems become more complex.

Financial institutions risk violating legal obligations, eroding customer trust, and causing consumer harm when deploying chatbot technology.

The CFPB is concerned that chatbots are ineffective and unable to provide meaningful assistance, especially as questions become more complex. The CFPB does not believe institutions should use a chatbot “…as their primary customer service delivery channel when it is reasonably clear that the chatbot is unable to meet customer needs.” The point is that even chatbots must be in compliance. Here are some specific concerns noted by the CFPB:

1. Limited ability to solve complex problems
  • Difficulties in recognizing and resolving people’s disputes
  • Providing inaccurate, unreliable, or insufficient information
  • Failure to provide meaningful customer assistance

2. Hindering access to timely human intervention

3. Technical limitations and associated security risks
  • System reliability and downtime
  • Security risks posed by impersonation and phishing scams
  • Keeping personally identifiable information safe

4. Risks associated with the integration of deficient chatbots
  • Risk of noncompliance with federal consumer financial law
  • Risk of diminished customer service and trust when chatbots reduce access to individualized human support agents
  • Risk of harming people

Ultimately, institutions using chatbots in any form need to understand the limitations of the technology and the reality that the institution may be held responsible for any resulting harm.

Consulting Resources!

Published
2023/06/30


Diane Dean

Diane joined Banker’s Compliance Consulting with over 10 years of compliance experience and over 15 years of experience within the financial industry. Diane is a Certified Regulatory Compliance Manager (CRCM) and has a Bachelor’s Degree in Sociology with a concentration in Criminal Justice. She is a graduate of the Schools of Banking Compliance School and has participated in various other training opportunities throughout her career. Diane understands firsthand the struggles banks face in building and maintaining successful compliance programs. Her experience and common-sense approach to consumer compliance are a great asset to our clients. Diane and her husband have two kids who keep them busy. She enjoys running and other sports and is a big Bugs Bunny fan! She’s a bit crazy in that she does enjoy reading some of these regulations and she’s a “crazy cat lady!” Her cat tales are hilarious!
