Joe Biden Wants US Government Algorithms Tested for Potential Harm Against Citizens

Khari Johnson writes for WIRED about the new draft rules from the White House, which would require federal agencies to assess AI systems currently in use in law enforcement, health care, and other areas—and to shut down any algorithms doing harm.

The White House issued draft rules today that would require federal agencies to evaluate and constantly monitor algorithms used in health care, law enforcement, and housing for potential discrimination or other harmful effects on human rights.

Once in effect, the rules could force changes in US government activity dependent on AI, such as the FBI’s use of face recognition technology, which has been criticized for not taking steps called for by Congress to protect civil liberties. The new rules would require government agencies to assess existing algorithms by August 2024 and stop using any that don’t comply.

“If the benefits do not meaningfully outweigh the risks, agencies should not use the AI,” the memo says. But the draft memo carves out an exemption for models that deal with national security and allows agencies to effectively issue themselves waivers if ending use of an AI model “would create an unacceptable impediment to critical agency operations.”

The draft rules were released by the White House Office of Management and Budget two days after President Biden signed an executive order that amounted to a government-wide plan to increase government use of AI while also seeking to prevent harm from the technology. The need to keep people safe from AI was a major theme, with the order’s provisions including reporting requirements for the developers of large AI models and compute clusters.

The proposed OMB rules would add testing and independent evaluation of algorithms bought from private companies as a requirement of federal contracts, which the office can do in its role of coordinating departments with presidential priorities. They would ask government agencies to evaluate and monitor both algorithms in use and any acquired in the future for negative impacts on privacy, democracy, market concentration, and access to government services.

Read the full article at WIRED.

Image credit: Photo by Caleb Perez on Unsplash