Industry Self-Regulation: A Path Forward for Governing Artificial Intelligence?

Dec 21, 2023 by Eric D. Reicin, President & CEO, BBB National Programs

I have written previously about the many ways that independent industry self-regulation can address some of our most vexing societal problems by encouraging businesses to act responsibly, always with an eye on enhancing consumer trust. Today, I want to discuss how industry self-regulation can be a key differentiator that helps businesses and nonprofits perform at their most effective level, especially when engaging with emerging technologies such as generative artificial intelligence (AI).

Indeed, industry self-regulation, with its speed and agility when adapting to market changes and consumer needs, was designed for these major transformational moments, such as the current rise of AI.

With the recent release of the Biden Administration’s Executive Order (EO) on Safe, Secure, and Trustworthy Artificial Intelligence, it is clear that managing the risks associated with AI is fast becoming a priority, with new guidance and rules likely over the coming years. In addition, the Office of Management and Budget (OMB) has released its direction to federal agencies on how to implement the EO. That directive could provide clues as to how private industry will be regulated down the road.

But the question is, as a private sector leader, what is your road map today?


Attributes of Industry Self-Regulation

Through industry self-regulation, businesses can come together to develop an appropriate road map, one that is attuned to market realities and reflects the accumulated judgment and experience of players representing a broad cross-section of industry views.

How do you ensure that such an approach will work? Well-constructed industry self-regulation has several key attributes:

  • The ability to tailor work done in other industries to a new business category. Government regulation tends to paint with a broader brush.
  • The opportunity to achieve compliance at levels equal to or greater than those achieved by government regulation. The threat of referral to the government for noncompliance is also an effective mechanism.
  • A transparent and objective process for rule-setting and accountability mechanisms. This allows the consumer to judge the system’s integrity, increasing public confidence in the process.


For example, in her speech at the 2014 Self-Regulation Conference, then-FTC Commissioner Maureen K. Ohlhausen said: “Successful self-regulatory initiatives share several features: clear requirements; widespread industry participation; active monitoring programs; effective enforcement mechanisms; procedures to resolve conflicts; a transparent process; responsiveness to a changing market and to consumers; and sufficient independence from direct control by industry.”


Considerations for Organizations Today

Despite the strengths of industry self-regulation, its promise has yet to be realized across many industries, including generative AI. That can change, and here is how organizations can begin thinking about industry self-regulation:

  • Do Not Wait: The time is now to work within your industry or as part of a cross-industry incubator effort and take steps towards developing realistic guidelines for AI.
  • Consider Existing Rules: There is no need to start from scratch. Consider what rules from around the world make sense for your industry and use those as models to lead in AI governance. 
  • Be Truthful About Claims of AI Governance: Existing truth-in-advertising rules apply.


A variety of global efforts focused on AI risks and opportunities have coalesced since the dawn of the ChatGPT era. In other parts of the world, such as the European Union and China, rules and regulations have been put in place to address the perceived risks of rapid AI implementation.


Looking Ahead 

Early AI legislative and regulatory efforts in the U.S. are found mostly in the states. According to a report from BSA | The Software Alliance, “State legislators introduced more AI-related bills–191–this year than in the previous two years combined, a 440% increase in the number of AI-related bills introduced in 2022.”

A 2021 paper by Michael Cusumano of MIT, Annabelle Gawer of the University of Surrey, and David Yoffie of Harvard Business School, entitled “Can Self-Regulation Save Digital Platforms?”, delves into the history of industry self-regulation to examine whether and how self-regulation can thrive in the digital age. While the piece was written before ChatGPT arrived, it is prescient in its description of the challenges and opportunities that arise when trying to regulate – or self-regulate – emerging technologies.

The authors underscore that self-regulation by an industry group works best when the sector consists of a small, relatively homogeneous set of interconnected or interdependent actors who face shared challenges. They conclude, “. . .the research suggests that a combination of self-regulation and credible threats of government regulation may yield the best results.”

As one LegalTech article (registration required) notes, “. . .some industry experts are pushing back on imposing legally binding obligations on AI. Instead, they see an ‘all-hands-on-deck’ method that also relies on ‘soft law’ principles—think guidelines, self-certification, and industry standards—as the way to holistically regulate the fast-evolving AI realm.”

Industry self-regulation through agreed-upon voluntary standards and guidelines should be part of the mix from the very start, especially when it comes to AI.

Originally published in Forbes.
