The SEC’s Initial Proposal to Regulate AI in Financial Markets

The SEC’s Recent Regulatory Proposal

Recognizing the need to balance innovation and oversight, on July 26, 2023, the U.S. Securities and Exchange Commission (SEC) proposed an initial set of regulations aimed at addressing the potential conflicts of interest that could arise from financial services companies’ use of predictive data analytics, artificial intelligence, and related technologies (collectively referred to as PDA). 

The SEC’s proposed regulations seek to address the challenges posed by AI while fostering innovation and maintaining market integrity.  If approved, the rules would apply to any registered broker-dealer and any investment adviser registered under section 203 of the Investment Advisers Act of 1940, including their associated persons, that currently use or may in the future use such technologies in their interactions with investors. 

The proposed rules would require broker-dealers and investment advisers to eliminate or neutralize the effect of any conflict of interest inherent in any PDA they use that places the firm’s interest ahead of the interests of investors, by doing the following: 

  • Written Policies and Procedures – The proposed rules would require firms to document compliance by adopting, implementing, and maintaining written policies and procedures that evaluate any use or potential use of PDA and other covered technologies, identify whether that use involves a conflict of interest in which the firm puts its interests ahead of investors’, and, if so, take steps to eliminate or neutralize the effect of those conflicts of interest.
  • Eliminate or Neutralize Conflicts of Interest – While the SEC noted that the process for achieving compliance with the rule is risk-based rather than one-size-fits-all, it did propose that every firm’s compliance program include a written description of the process by which the firm (i) evaluates the use or potential use of a covered technology in any investor interaction, (ii) identifies whether any conflicts of interest exist that place the firm’s interest ahead of investors’, and (iii) takes steps to eliminate or neutralize the effect of any such conflicts of interest.  In addition, firms must review these components at least annually to ensure their adequacy. 
  • Maintain Books and Records Documenting Compliance – According to the proposed rules, firms must also maintain records of their evaluations, including (i) a list of all covered technologies used in investor interactions and the dates each was first implemented and materially modified, and (ii) the dates of any testing of a covered technology, any actual or potential conflicts of interest identified in that testing, and any resulting modifications or restrictions.

The SEC defines Predictive Data Analytics as an analytical, technological, or computational function, algorithm, model, correlation matrix, or similar method or process that optimizes for, predicts, guides, forecasts, or directs investment-related behaviors or outcomes.  Artificial Intelligence is swept into this definition.  The SEC specified that, while the proposed definition is technology-agnostic, it is designed to capture many PDA-like technologies currently in existence, such as AI, machine learning, deep-learning algorithms, neural networks, natural language processing, and large language models (including generative pre-trained transformers), whether developed in-house or licensed from third parties.

The proposed rules are currently out for public comment, with the comment period ending October 10, 2023, to allow interested parties and the general public time to address the areas where the SEC specifically requested comment, as well as to make suggestions or raise other concerns and considerations. Following the public comment period, the SEC will reevaluate the proposed rules before adopting any final rules.

Disagreements within the SEC

The Commission voted 3-2 along party lines to issue the proposal, with the two Republican commissioners, Hester Peirce and Mark Uyeda, dissenting.  They criticized the proposal as unnecessary in light of existing requirements, including brokerages’ disclosure obligations, as unduly burdensome on companies, and as likely to stifle the use of new technologies, and warned that it could offer hackers a roadmap to their targets’ vulnerabilities.

Commissioner Mark Uyeda called the proposed AI rules “breathtakingly broad” and “wholly unnecessary.” The proposal’s “regulatory vagueness and considerable compliance challenges” may discourage innovation on Wall Street if it is approved, he said.

Trade Groups Calling for Extension of Comment Period

In response to this proposed rule, a coalition of 16 industry trade groups is calling on the SEC to extend the comment period, citing the proposal’s potential unintended consequences and far-reaching implications.

In a joint letter sent to the SEC last week, the trade groups explained their concerns: “We are concerned that the proposal demonstrates a fundamental misunderstanding of how firms rely upon technology in a myriad of ways to benefit investors, both directly (e.g., by amplifying reporting speed and capabilities) and indirectly (e.g., by allowing investment advisers and broker-dealers to enjoy efficiencies and thereby reduce costs).  An extension of the comment period is essential to allow the public a reasonable period of time to assess the widespread coverage and implications of the proposal for the markets and investors, as well as the many unintended negative consequences the proposal may have.”

Our Take

While the SEC’s recent proposal is a proactive step to address one of the challenges that AI could pose for the financial industry, implementing these proposals may require significant resources, potentially limiting the ability of smaller market players to comply.  Consequently, these regulations could stifle innovation and the use of technology, an outcome that could ultimately harm investors.

In addition, the rapid pace of technological advancement raises questions about the adaptability of a static regulatory framework. As AI systems evolve, regulations must remain flexible enough to accommodate new developments without stifling innovation or creating undue burdens.

This doesn’t mean we are against regulations on the use of AI by financial services firms that promote transparency, accountability, and fairness.  Our biggest concern about the current proposal is that it is extremely broad and, in some ways, redundant.  In addition, the proposal does not appear adaptable to the dynamic nature of AI technology. Collaboration between regulatory bodies, financial institutions, and AI experts will be essential in shaping a regulatory framework that safeguards market stability while harnessing the transformative potential of AI.

In the end, the journey toward effective AI regulation in the financial sector is a complex one, requiring careful consideration of the ever-evolving interplay between technology, ethics, and regulatory oversight.  In our view, the SEC’s first step toward addressing the various regulatory concerns surrounding the use of artificial intelligence in the financial services market seems to have missed the mark.

About Author

Mike Mayhew is one of the leading experts on the investment research industry. In addition to founding Integrity Research, Mike is on the board of directors of Investorside Research Association, the non-profit trade association for the independent research industry, and a frequent speaker on research industry trends and developments. Mike has over thirty years of research industry experience. Email: Michael.Mayhew@integrity-research.com
