A coalition of state attorneys general is warning that an executive order signed by President Biden last year on artificial intelligence could be used by the federal government to ‘centralize’ control over the emerging technology, and that this control could be used for political purposes — including censoring alleged ‘disinformation.’
‘The Executive Order seeks—without Congressional authorization—to centralize governmental control over an emerging technology being developed by the private sector. In doing so, the Executive Order opens the door to using the federal government’s control over AI for political ends, such as censoring responses in the name of combatting ‘disinformation,’’ the coalition of 20 attorneys general, led by Utah AG Sean Reyes, said in a letter to Commerce Secretary Gina Raimondo.
The order, which Biden signed in October, established new standards for AI safety and included moves to protect privacy as well as workers and consumers. Specifically, it requires developers to share safety test results and other information with the government.
‘In accordance with the Defense Production Act, the Order will require that companies developing any foundation model that poses a serious risk to national security, national economic security, or national public health and safety must notify the federal government when training the model, and must share the results of all red-team safety tests,’ the White House said at the time. ‘These measures will ensure AI systems are safe, secure, and trustworthy before companies make them public.’
The White House also said the order is aimed at protecting Americans from AI-enabled fraud by establishing standards and best practices to differentiate between AI-generated and authentic content.
However, in a letter to the Commerce Dept. responding to a request for information from the National Institute of Standards and Technology, the AGs say the order created a ‘gatekeeping function’ for the Commerce Department to supervise AI development and forces developers to submit to an ‘opaque and undemocratic process.’
‘We are further concerned that the Executive Order’s bureaucratic and nebulous supervisory process will discourage AI development, further entrench large tech incumbents, and do little to protect citizens,’ they say.
They also accuse the executive of creating a ‘governmental black box’ by failing to disclose how the federal government will use the information provided.
‘The reporting requirements appear to be merely a pretext for ensuring that the federal government can find out who is developing AI models, supervise that process, and exert pressure to bend those AI models to the administration’s liking,’ they say.
They also warn that the order will inject ‘partisan purposes’ into decision-making, including by forcing developers to prove their models can tackle ‘disinformation.’
‘NIST should not use its assignment under the Executive Order to push a partisan agenda of censorship,’ they say.
The attorneys general also say that the Defense Production Act contains no authority to regulate the development of AI, only to encourage its production, meaning the executive branch lacks the authority to regulate this technology.
The officials tell Raimondo that the issues relating to AI are ‘complex and important, but they must be addressed by our constitutional, democratic process, not by executive fiat.’ The Commerce Dept. did not respond to a request for comment from Fox News Digital.
‘While there is serious debate as to the best approach to regulate AI, one thing is clear: the Biden administration cannot simply bypass congressional authority to act here,’ Reyes told Fox News Digital. ‘Any regulation must comport with the Constitution, including only authorized executive action, as well as protecting against government censorship. As the administration proceeds to implement the White House AI Executive Order, we will remain vigilant on upholding the rule of law.’
Fox News’ Greg Norman contributed to this report.