An AI robot holds a press conference. The AI stated it could run the world better than humans. Photo: Video grab.

Summit and legislation tackle artificial intelligence, nukes and globalism

The United Nations and the Swiss government, concerned about the spread of artificial intelligence, held a summit this week to address how to control and benefit from it.

More than 3,000 experts from companies including Amazon, Microsoft and Google joined representatives of 40 U.N. agencies to discuss the technology and how it fits into a “globalist framework,” according to the event’s description. The meeting comes as the United States considers legislation to prevent AI from taking over the nation’s nuclear arsenal.

“This technology is moving fast,” said Doreen Bogdan-Martin, head of the International Telecommunication Union (ITU), the U.N.’s information and communications technology agency.

In the United States, that concern is uniting Republican and Democratic lawmakers, who proposed legislation in June to prevent AI from taking control of the U.S. nuclear arsenal should it independently determine that an attack is needed.

Rep. Ted Lieu (D-Calif.) has proposed an amendment to the defense policy measure for 2024 that would require the Pentagon to implement a system that ensures that “meaningful human control is required to launch any nuclear weapon.”

The amendment would require that a human make the final decision on launching a nuclear attack and on selecting a target whenever AI is involved in nuclear weapons deployment.

Bipartisan support for Lieu’s amendment indicates that legislators are increasingly concerned that AI could make such decisions as quickly as it can evaluate a situation.

“We’ve all seen the sci-fi movies where artificial intelligence takes over and fights wars,” said Republican Congressman Ken Buck, who says humans must be involved, “probably more than one, to make these decisions.”


Buck co-sponsors the Autonomous Artificial Intelligence Act with Lieu and Democratic Rep. Don Beyer, a bill to ensure a human being would always be in charge of nuclear targeting and any potential launch.

“The bill is important to make sure there are no accidents in the use of nuclear weapons and that there are humans that are making responsible decisions,” Buck said. He and Lieu have worked together before, having entered Congress in the same class and each having served as his party’s freshman class president.

Rep. Stephen Lynch (D-Mass.) also offered an amendment in February that “requires the Secretary of Defense, in carrying out any program, project, or other activity involving the use of artificial intelligence or autonomous technology, to adhere to the best practices set forth in the Political Declaration on Responsible Military Use of Artificial Intelligence and Autonomy issued by the Biden Administration in February 2023.”

The non-binding guidance cited in Mr. Lynch’s amendment states, among other things, that nations should “maintain human control and involvement for all actions critical to informing and executing sovereign decisions concerning nuclear weapons employment.”

“States should design and engineer military AI capabilities so that they possess the ability to detect and avoid unintended consequences and the ability to disengage or deactivate deployed systems that demonstrate unintended behavior,” it reads. “States should also implement other appropriate safeguards to mitigate risks of serious failures.”

According to the Government Accountability Office (GAO), the Department of Defense (DOD) is pursuing advanced AI capabilities. The GAO recommends that the DOD first establish AI acquisition guidance and only then explore AI use.

The GAO issued such recommendations to the three military branches, though they are not binding.

–Metro Voice and wire services
