Regulatory Update

ChatGPT and Financial Services Compliance: Top 10 Questions

by Robert Cruz

As ChatGPT continues its ascent as one of the fastest-adopted technologies in history, more questions than answers are arising about whether and how it can be used by businesses — including firms in highly regulated industries such as financial services.

For many, it's yet another entry on the growing list of improperly vetted technologies, placing it in the category of prohibited tools until policies and controls can be investigated and implemented.

However, a list of frequently asked questions is now emerging regarding the potential use of ChatGPT by regulated businesses. So, here's our list of questions that firms should ask, none of which were authored, edited, or inspired by the OpenAI algorithm itself.

1. What is ChatGPT and how does it work?

ChatGPT, owned by the firm OpenAI and licensed by global firms including Microsoft, is an artificial intelligence engine that responds to prompts based on a probabilistic analysis of the words and phrases that tend to appear together in a given context. For our purposes here, we will focus on its ability to generate unstructured content rather than on writing code or other uses. While we focus on ChatGPT, this list also applies to other variants, including Bard (Google), Einstein GPT (Salesforce), and LLaMA (Meta).
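For a concrete sense of the prompt-and-response mechanics, here is a minimal sketch of submitting a prompt to a hosted chat model through the OpenAI Python SDK. The model name, system message, and prompt are illustrative placeholders rather than recommendations, and the returned text is exactly the kind of unstructured content discussed throughout this post.

```python
# Minimal sketch: sending a prompt to a hosted chat model via the OpenAI
# Python SDK (v1.x interface). The model name and prompt are placeholders;
# the API key is read from the OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # illustrative choice of hosted model
    messages=[
        {"role": "system", "content": "You are a drafting assistant."},
        {"role": "user", "content": "Summarize Regulation Best Interest in two sentences."},
    ],
)

# The generated answer is unstructured text, much like an email draft or memo.
print(response.choices[0].message.content)
```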

2. Is the output from ChatGPT accessible to the external market?

From a regulatory perspective, market-facing generative AI content is no different from any other source of information. Firms must meet regulatory obligations regarding recordkeeping, storage and oversight if ChatGPT-generated content pertains to their business. Firms also need to carefully consider the fact that these tools "may occasionally generate incorrect information." That lack of reliability should immediately raise warning flags about the completeness and accuracy of the information fed into the model, along with the possibility that it contains PII or other data subject to privacy rules, informational bias, or restricted material.

3. What specific regulations apply to those providing ChatGPT generated content?

Regulatory obligations differ for RIAs, broker-dealers and other financial services entities. So, firms leveraging ChatGPT for market-facing engagement will need to pay particular attention to the compliance obligations specific to them, such as Regulation Best Interest (Reg BI), the Marketing Rule, new cybersecurity requirements, and more.

4. Can ChatGPT output used internally be considered a business record?

Consider, for example, an employee who uses ChatGPT as one of the sources when researching a new financial product. As always, content and context are determinative in assessing whether information represents value or risk to the business and is therefore subject to the firm's internal retention policy or falls within the recordkeeping requirements of FINRA 4510 or SEC 17a-4. At a minimum, firms should treat ChatGPT output as they do any other information or resource that is incorporated into a customer service, product planning or other decision support application.
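To illustrate the "content and context" point, the sketch below wraps generated output in record metadata at the point of use so a retention decision can be made downstream. The field names, contexts, and classification rule are assumptions made for illustration, not a statement of what FINRA 4510 or SEC 17a-4 require.

```python
# Illustrative sketch: attach business context to generated output so a
# downstream retention process can decide whether it is a business record.
# Field names, contexts, and the classification rule are assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

RECORD_CONTEXTS = {"client communication", "product research", "decision support"}

@dataclass
class GenerativeAIOutput:
    prompt: str
    output: str
    author: str
    business_context: str  # e.g., "product research" or "personal"
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def is_candidate_business_record(self) -> bool:
        # Context is determinative in this sketch: anything feeding client-facing
        # work or decision support is routed to the firm's retention process.
        return self.business_context in RECORD_CONTEXTS

item = GenerativeAIOutput(
    prompt="Outline considerations for a new structured note product.",
    output="...model-generated text...",
    author="analyst@example-firm.com",
    business_context="product research",
)

if item.is_candidate_business_record():
    print("Route to the firm's archive under its retention policy.")
```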

5. Are firms more likely to adopt a publicly accessible or a closed/private version of ChatGPT?

As with any machine-aided content delivery method, the better you understand the data sources fed into a model, the easier it will be to explain to regulators if an issue arises with the output. Given the highly regulated nature of financial services, as well as the privacy, security, and potential ethical considerations raised above, it appears much more likely that firms will embrace closed/private variants where they can further tune models with familiar data sources that are owned or controlled by the firm.

6. What methods are currently available to capture ChatGPT content?

Few automated approaches are currently available in the market to proactively capture and preserve historical records of the generative AI content made available by OpenAI, Microsoft, Google, Salesforce, or Meta. The speed at which that changes will depend on whether and how those providers prioritize the use of these capabilities by regulated businesses.
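In the absence of vendor-provided connectors, one interim approach is for the firm to interpose its own capture layer between users and the model. The sketch below shows the general idea with a generic wrapper; the append-only JSONL file is a stand-in for whatever archiving system a firm actually uses, and the model call itself is faked.

```python
# Sketch of a capture layer: every prompt/response pair is preserved in an
# append-only log before the response is returned to the user. The JSONL file
# stands in for a real archiving system; the model call below is a fake.
import json
from datetime import datetime, timezone
from typing import Callable

def with_capture(send_prompt: Callable[[str], str],
                 archive_path: str = "genai_capture.jsonl") -> Callable[[str], str]:
    """Wrap any prompt-sending function so its traffic is recorded."""
    def wrapper(prompt: str) -> str:
        response = send_prompt(prompt)
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prompt": prompt,
            "response": response,
        }
        with open(archive_path, "a", encoding="utf-8") as f:
            f.write(json.dumps(entry) + "\n")
        return response
    return wrapper

def fake_model(prompt: str) -> str:  # stand-in for a real model call
    return "generated text for: " + prompt

ask = with_capture(fake_model)
print(ask("Draft an internal summary of our money market fund lineup."))
```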

7. How could ChatGPT impact supervisory review and conduct oversight?

Even once the challenge of capturing ‘complete and accurate’ historical records of ChatGPT-generated content has been addressed, ChatGPT will remain challenged on explainability, accuracy, and its lack of post-2021 training data, particularly given the absence of explicit regulatory guidance. This is in stark contrast to machine learning systems that were designed for complex regulatory environments and have been researched and built from the ground up for quality, consistency and explainability — all of which financial services firms require.
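For context on what supervisory review of generated content might look like in practice, the toy sketch below applies the lexicon-style flagging many supervision programs already use for email and chat. The terms are purely illustrative; a real program would use the firm's own lexicons, policies, and escalation workflow.

```python
# Toy sketch: lexicon-based flagging of captured, model-generated text for
# supervisory review. The flag terms are illustrative only.
FLAG_TERMS = {"guaranteed return", "risk-free", "cannot lose", "act now"}

def flagged_terms(generated_text: str) -> list[str]:
    text = generated_text.lower()
    return [term for term in FLAG_TERMS if term in text]

sample = "This strategy offers a guaranteed return with essentially no downside."
hits = flagged_terms(sample)
if hits:
    print(f"Escalate for supervisory review; flagged terms: {hits}")
```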

8. How might its use impact litigation and e-discovery?

We’ve only seen the first potential case arise, in which ChatGPT wrongly associated a person with criminal activity, leading to potential litigation against OpenAI. More will follow, most likely around cases of inaccuracy. Beyond that, if ChatGPT is used to produce content that is harmful to someone else, either side could produce that content to support its claim. As with regulatory disputes, it is highly unlikely that a court would look favorably on a party that lacked the ability to discover or produce a historical record that was material to the matter.

9. What additional regulation can we expect to govern the use of ChatGPT?

There are growing calls to pause AI development until we better understand its implications. As a result, action by regulators is increasingly likely — and welcomed by many — to build safeguards against unethical use and to prevent those intent on wrongdoing from leveraging the technology. This includes the European Union's AI Act, proposed in 2021, the Blueprint for an AI Bill of Rights published by the Biden Administration, as well as additional guidance expected from both the Securities and Exchange Commission (SEC) and FINRA around supervisory controls, data governance, model risk management, and data privacy protection.

10. What do we expect Financial Services firms to do first with ChatGPT?

The first move made in financial services for any new communications technology is prohibition until risk and security assessments can be completed. However, generative AI is moving faster than regulation, faster than our understanding of the ethical, privacy, and security implications, and faster than policy enforcers can effectively stop employees who are already using it.

Given the scrutiny firms face regarding the use of prohibited communications tools, expect them to remain hyper-focused on the lowest-risk applications where ChatGPT can be approved for business use, such as enabling more effective search and retrieval of information or other internal decision-support use cases with closed/private AI models.
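As one minimal sketch of that kind of low-risk, internal use case, the snippet below ranks a handful of firm-owned documents against an employee question using TF-IDF similarity via scikit-learn; in a closed/private deployment, the top results could then be handed to a model as grounding context. The corpus and query are invented for illustration.

```python
# Sketch of an internal search-and-retrieval use case: rank firm-owned
# documents against an employee question. Documents and query are invented;
# in a closed/private deployment the top hits could ground a model's answer.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Policy on approved communications channels and archiving requirements.",
    "Procedures for opening retail brokerage accounts.",
    "Summary of Regulation Best Interest obligations for recommendations.",
]
query = "What are our obligations when recommending a product to a retail client?"

vectors = TfidfVectorizer().fit_transform(documents + [query])
scores = cosine_similarity(vectors[-1], vectors[:-1]).ravel()
best = int(scores.argmax())

print(f"Most relevant internal document: {documents[best]} (score {scores[best]:.2f})")
```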
