It’s time for structured oversight of AI in healthcare

How AI and other new technologies present novel compliance and enforcement risks and opportunities to the pharmaceutical industry.

The rapid development and growing accessibility of artificial intelligence (AI), and the expanding use of AI in the life sciences sector, are becoming increasingly important factors in the continuing evolution of both pharmaceutical industry compliance and government enforcement efforts.

But the use of AI comes with unanswered questions over its incorporation into both business operations and corporate compliance programs. If they haven’t already, pharmaceutical manufacturers would be wise to begin tackling oversight of AI and consider the limited guidance on the use of AI in compliance programs from authorities like the US Department of Justice (DOJ).

Given repeated statements from enforcement agencies like DOJ and the Department of Health and Human Services Office of Inspector General (HHS OIG) that they are investing resources in their own data-driven investigative techniques, pharmaceutical companies risk falling behind the government if they don’t also take initial steps to improve the effectiveness of their compliance programs by incorporating AI.

DOJ prosecutorial guidance

In September 2024, the DOJ Criminal Division issued an update to its Evaluation of Corporate Compliance Programs (ECCP) document, the Criminal Division’s guidance for federal prosecutors on the factors they should use to evaluate the effectiveness of corporate compliance programs. Although the ECCP is designed to inform the federal government’s charging decisions and resolutions in criminal cases, the questions DOJ asks prosecutors to consider reflect the underlying compliance principles on which corporations design, implement, and evaluate their compliance programs.

Guidance issued by enforcement agencies typically lags new technologies, but DOJ appears to be trying to get out in front of – or at least keep pace with – the emergence of AI. The September 2024 update to the ECCP focuses extensively on new technologies in general and AI in particular, not only with respect to how AI is deployed in a company’s business operations, but also how AI is incorporated into a company’s compliance program to make the program more effective.

The ECCP defines AI broadly and states that “no system should be considered too simple to qualify as a covered AI system due to lack of technical complexity”, including but not limited to machine learning and generative AI systems that operate with or without human oversight.

Expectations on the use of AI in business operations  

In determining whether a compliance program is appropriately designed to detect and prevent the type of misconduct most likely to occur at a particular company, the ECCP historically has directed prosecutors to consider traditional risk factors such as the industry sector in which the company operates, the regulatory landscape related to that industry sector, the competitiveness of the market, and a company’s potential clients and business partners.

With the September 2024 update, however, the ECCP now instructs prosecutors to also consider: AI and other new and emerging technologies that a company and its employees use to conduct company business; whether the company has conducted a risk assessment with respect to the use of those technologies; and whether the company has taken appropriate steps to mitigate any corresponding risks and ensure compliance with its own code of conduct and all applicable laws, specifically including any impact on the company’s ability to comply with criminal laws.

When prosecutors assess whether a company has implemented adequate controls around the use of AI in business operations, the ECCP suggests that prosecutors ask, among other technology-related questions:

  • whether the management of risks related to use of AI and other new technologies is integrated into broader enterprise risk management strategies;
  • what the company’s approach is to governance regarding the use of new technologies such as AI in its commercial business;
  • whether controls exist to ensure that AI is used only for its intended purposes;
  • how the company curbs potential negative or unintended consequences resulting from the use of AI; and
  • how the company trains its employees on the use of AI and other emerging technologies.

It’s likely that few pharmaceutical industry compliance programs have attempted to identify whether and how AI is being used in the company’s business operations, let alone assessed whether the company has answers to the questions posed by the ECCP. It would nevertheless be prudent for pharmaceutical industry compliance professionals to use the sections of the ECCP related to emerging technologies to guide compliance program development and operations.

The DOJ is not likely to look favorably on companies that have barreled ahead with the use of AI or other new technologies without considering compliance risks and implementing commensurate controls, particularly if the technology is the source of, contributes to, or facilitates fraud. And despite the framework established by the ECCP for evaluating corporate compliance programs, it remains to be seen how government enforcement and regulatory agencies actually will assess how a company manages risk related to ethical use of AI and other new technologies. 

It will be necessary, therefore, for companies to be inventive in how they identify, assess, and mitigate such risks, at least in the short term.

Expectations on the use of AI in corporate compliance programs

Separately, the ECCP clearly communicates an expectation that compliance programs be designed and operated with AI in mind. The ECCP suggests that prosecutors assess whether a company is using new technologies such as AI in its compliance program, whether the compliance program is monitoring such technologies used by the business to evaluate whether they are functioning in a manner consistent with the company’s code of conduct, and the speed with which the company can detect and correct decisions made by AI that are inconsistent with the company’s values.

Again, it’s likely that few pharmaceutical industry compliance programs have thought about these issues, let alone begun to incorporate AI into the operations of the compliance program itself.

The ECCP underscores that stakeholders in the pharmaceutical industry have an opportunity to assess, design, and improve their oversight of business use of AI and other technologies while recognizing the uncertainty and challenges such technologies present. Effective monitoring and auditing is one of HHS OIG’s seven elements of an effective compliance program. Companies in the pharmaceutical industry will need to address how oversight of AI technologies (something HHS OIG has not yet specifically addressed in its compliance program guidance) fits into their broader auditing and monitoring functions.

Compliance programs will need to determine how AI technologies can be used to enhance compliance auditing and monitoring, how the compliance program effectively governs and monitors its own use of AI, and what risks are presented by a double-layered approach of AI-assisted monitoring of AI-assisted business operations.
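
To make the first of those questions concrete, consider a minimal, hypothetical sketch of one way a compliance team might use an off-the-shelf anomaly-detection model to flag outlier healthcare professional (HCP) payments for human review. The records, field names, and contamination setting below are illustrative assumptions only, not a method drawn from the ECCP or any agency guidance.

```python
# Hypothetical sketch: flagging outlier HCP payments for human compliance
# review. The records, column names, and contamination rate are illustrative
# assumptions, not an agency-endorsed method.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Illustrative transactional data a compliance program might already collect.
payments = pd.DataFrame({
    "hcp_id": [101, 102, 103, 104, 105, 106],
    "amount_usd": [250, 300, 275, 5200, 310, 260],
    "events_ytd": [2, 3, 2, 14, 3, 2],
})

# IsolationForest scores each record by how easily it can be isolated from
# the rest; fit_predict labels anomalous records -1 and typical records 1.
model = IsolationForest(contamination=0.1, random_state=0)
payments["flag"] = model.fit_predict(payments[["amount_usd", "events_ytd"]])

# Route flagged records to human reviewers rather than acting automatically,
# consistent with the ECCP's interest in how quickly a company can detect
# and correct AI-driven decisions.
for_review = payments[payments["flag"] == -1]
print(for_review[["hcp_id", "amount_usd", "events_ytd"]])
```

The design point is that the model only surfaces leads; escalation and any corrective action remain human decisions, which tracks the ECCP’s emphasis on oversight of AI outputs.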

The ECCP also instructs prosecutors to assess how other traditional elements of an effective compliance program have been adapted to new and emerging technologies, such as updating policies and procedures to address risks associated with the use of new technologies; ensuring compliance personnel have experience and qualifications appropriate for AI and other new technologies; and ensuring the compliance program has appropriate funding, resources, and access, including whether compliance personnel have knowledge of, and the means to access, all relevant data sources in a timely manner.

Data-driven enforcement

Pharmaceutical manufacturers risk falling behind law enforcement and regulators if they delay investing in these compliance opportunities. The industry has for years heard representatives from DOJ and HHS OIG threaten increased scrutiny and analysis of the wealth of data available to the government, especially from government healthcare programs like Medicare and Medicaid, to identify potential fraud and abuse.

There are some indicators that this increased focus on data is being used to at least initiate enforcement actions under laws like the False Claims Act (FCA). It’s unclear whether this emphasis on “home-grown” FCA investigations based on data analytics will remain a focus under the new US administration, but the tools and infrastructure remain available to DOJ and HHS OIG and likely will continue to be used.

The most recent FCA recovery statistics, however, show a continued rise in FCA whistleblower complaints, which may similarly be fueled by private parties’ use of AI tools such as large language models to analyze large sets of publicly available data.

Data touted as major contributor to combatting pandemic-related fraud

Although not directed at the pharmaceutical industry in particular, recent investigations and enforcement actions involving COVID-19 programs such as the Paycheck Protection Program (PPP) provide examples of the government’s expanding use of data analytics to combat fraud.

The Pandemic Response Accountability Committee’s (PRAC) Pandemic Analytics Center of Excellence (PACE), an analytics hub of data scientists and investigative analysts tasked with identifying potential fraud in data associated with pandemic relief programs, drew praise and bipartisan support from the Biden Administration and members of Congress. PACE marshaled vast amounts of resources and data through 47 Memoranda of Understanding with corresponding Offices of Inspector General and law enforcement agencies to support over 700 pandemic-related investigations.

In 2024, the Biden Administration and Congressional leaders expressed support for continuing, and even expanding, the PRAC and the next generation of PACE to cover all federal spending. Although it’s unclear at this point whether this effort will continue under the new US administration, it certainly is consistent with the administration’s many public announcements about eradicating fraud, waste, and abuse in government programs.

Regulators and law enforcement are unlikely to abandon the advancements and vast resources available to PACE, even as the country moves farther away from the COVID-19 pandemic. Indeed, DOJ representatives have as recently as last month touted “aggressive” continuing enforcement efforts under the FCA.

Key takeaways

Stakeholders in the pharmaceutical industry will have to navigate the incorporation of AI and related technologies into their businesses, including as part of their compliance programs. That may not be a simple endeavor, as the pharmaceutical industry will need to consider the regulatory implications of such technologies across regulated areas, and not just with respect to the ECCP. For instance, the industry will need to consider the FDA regulatory status and privacy implications of any integrated technologies. 

Yet, absent more definitive and formal direction to the industry from federal agencies, pharmaceutical companies should consider the questions the ECCP presents to federal prosecutors, along with any other preliminary guidance, when evaluating and developing internal controls governing the use of AI and other new technologies in their business operations and when deploying AI to enhance the effectiveness of compliance program operations.

To do otherwise could have serious consequences. Because data sharing and analytics across government agencies appear to be permanent tools for enforcement agencies such as DOJ and HHS OIG, pharmaceutical companies would be well advised to begin mitigating the risk by using similar AI and analytics-based strategies in their own compliance programs to identify the same indicators of fraud and preempt potential investigations or enforcement actions.
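
As a purely illustrative example of what such an analytics-based strategy might look like at its simplest, the sketch below screens prescriber-level claim volumes for statistical outliers. The dataset, column names, and 3.5 threshold are assumptions for illustration; this is not a representation of how DOJ or HHS OIG actually analyze program data.

```python
# Hypothetical sketch: screening prescriber-level claim volumes for
# statistical outliers as a first-pass fraud *indicator*. Column names,
# data, and the 3.5 threshold are illustrative assumptions.
import pandas as pd

claims = pd.DataFrame({
    "prescriber_id": ["A", "B", "C", "D", "E"],
    "monthly_claims": [120, 95, 110, 480, 105],
})

# Modified z-score based on the median and median absolute deviation (MAD),
# which is less distorted by the very outliers being screened for than a
# mean-based z-score would be.
median = claims["monthly_claims"].median()
mad = (claims["monthly_claims"] - median).abs().median()
claims["modified_z"] = 0.6745 * (claims["monthly_claims"] - median) / mad

# A modified z-score above 3.5 is a common rule of thumb for an outlier;
# a hit here is a lead to escalate for compliance review, not a finding.
indicators = claims[claims["modified_z"] > 3.5]
print(indicators)
```

A real program would of course layer legal review, context (specialty, patient panel size), and documented follow-up on top of any such screen; the point is only that the same kinds of indicators the government screens for can be surfaced internally first.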

Written by Morgan Lewis partner Scott A Memmott, who represents life science and healthcare organizations in government and internal corporate investigations, and associate Jonathan P York, who represents US corporate clients in government and internal investigations and complex litigation.

The views expressed are those of the authors. New leadership at US agencies such as DOJ and HHS means that vastly different priorities may now apply that do not align directly with the ECCP as it was published.