This dominant force can tame AI better than politicians

Artificial intelligence can be controlled by market forces far better than by regulation, because markets are the primary way companies and technology are held accountable.

New generative artificial intelligence (AI) systems have captivated the world's imagination with their promise and potential. AI's ability to analyze vast amounts of data and make autonomous decisions is a source of both awe and anxiety. People worry about bias in decision-making, the invasion of privacy, job displacement, and even the existential fear of machines becoming uncontrollable. How can we make sure AI benefits society?

The National Telecommunications and Information Administration (NTIA) has responded by seeking input on how to ensure that AI companies are "accountable" and how to develop "a productive AI accountability ecosystem." NTIA defines "AI" broadly enough to sweep in most significant software systems, including many that pre-date recent AI developments.

And yet, as we at the Center for Growth and Opportunity point out in our just-filed comments, NTIA’s broad sweep misses the primary way companies and technology are held accountable. 

AI companies, like those in every industry, are held accountable first and foremost by their customers acting within a competitive market system.

Business and consumer markets, reputational and financial markets, generally applicable laws, and societal norms all create feedback loops that align the interests of producers with those of their stakeholders. Competition for profit drives this alignment, spurring companies to innovate, develop valuable products and services, and thus benefit society.

Of course, this system isn't perfect. Sometimes, market feedback mechanisms fail. When a producer's actions affect third parties or information is asymmetric, we may need alternative mechanisms. But these alternatives should be the exception, not the rule, and they need to support, not replace, market accountability mechanisms.

NTIA's request doesn't acknowledge market-based accountability or identify where it falls short. Yet many of its proposed AI accountability mechanisms, such as transparency, certifications, and third-party audits, can and already do function within the market, both in AI and in other areas.

Consider, for example, products that certify the integrity of their supply chain with seals, or the "UL" certification marks on a wide variety of electrical home devices, or Yelp ratings of local services and restaurants.

In AI, one recent example is the Center for Industry Self-Regulation's Principles for Trustworthy AI in Recruiting and Hiring, together with its Independent Certification Protocols for AI-Enabled Hiring and Recruiting Technologies.

These were created to establish a global baseline standard for the use of AI applications in recruitment and hiring and to create a pathway to independent certification for such technology. Market-based approaches like these are vital to a thriving AI accountability ecosystem. 

It's important to remember that the NTIA, while influential in convening discussions, is not a regulatory body and lacks the power to impose binding rules. Therefore, the agency cannot afford to ignore market-based accountability, given its limited authority and the inherent complexity of AI. 

AI accountability is not just about regulation. It's about engaging with our existing market system, identifying gaps and cautiously implementing exceptions when necessary. It's about preserving the market's ability to self-correct and innovate.

Indeed, markets can help society weigh the inevitable trade-offs between different accountability goals, such as privacy and transparency, or accuracy and access.

NTIA’s discussion of AI accountability should be more than an exercise in identifying how governments might regulate companies or how companies might regulate themselves. It should also move beyond identifying hypothetical AI issues that might need regulation sometime in the future.

Instead, NTIA should take a holistic view, recognizing that accountability comes first from markets. It should focus on how to enhance these market-based accountability ecosystems in the era of AI rather than attempting to replace or undermine them.

If NTIA seeks to strengthen our oldest and strongest accountability mechanisms – those provided by markets – then it will be better able to address any potential harms while preserving the benefits of AI for society.
