Biden’s AI robocalls were made by a Democratic strategist: ‘With just a $500 investment, anyone could replicate them’

In a shocking revelation, veteran Democratic strategist Steve Kramer confessed to orchestrating fraudulent robocalls impersonating President Joe Biden that targeted New Hampshire voters last month.

What happened: Kramer admitted to sending an automated call to approximately 5,000 potential Democratic voters on the night of January 20, just two days before the New Hampshire primary. The call featured an AI-generated imitation of President Biden’s voice, produced using readily available online technology, The Hill reported.

According to an NBC News report last Friday, Kramer hired a magician, Paolo Carpenter, to create the robocall using AI technology. NBC News reportedly cited text messages, call logs and Venmo transactions.

Kramer, in a statement, stressed the urgent need for more stringent regulations to prevent similar incidents.

“With an investment of just $500, anyone could replicate my intentional decision,” Kramer said in the statement, urging immediate intervention from all regulatory authorities and platforms.

Kramer denied that he was instructed to make the robocalls by his then-client, Rep. Dean Phillips (D-Minn.), who is challenging Biden for the Democratic nomination. The Phillips campaign disavowed Kramer’s alleged involvement, saying his actions were unrelated to their campaign.

Carpenter, the magician allegedly hired by Kramer, confirmed to NBC that he created the call but did not distribute it. He said he was paid to carry out a task and had no ill intent.

The New Hampshire Attorney General’s Office is currently investigating the robocalls. The Federal Communications Commission banned the use of AI-generated voices in robocalls weeks after the fake Biden calls went out.


Why it matters: The robocall incident has sparked renewed concerns about the potential misuse of artificial intelligence in spreading election misinformation.

The New Hampshire attorney general’s office confirmed the existence of a robocall that imitated President Biden’s voice and advised voters to abstain from the primary election. According to a previous report, the incident was characterized as an illegal attempt to disrupt the primary.

There is growing concern about AI-generated deepfakes threatening the integrity of US elections. The White House has expressed alarm over the circulation of false images and voice alterations of public figures, including President Biden.

Photo by Trevor Bexon on Shutterstock



Designed by Benzinga Neuro
By Shivdeep Dhaliwal


The GPT-4-based Benzinga Neuro content generation system leverages Benzinga’s extensive ecosystem, including native data, APIs, and more to create rich, timely stories for you. Learn more.


