Marco Rubio Impostor Uses AI To Deceive High-Ranking Officials

An impostor is using an artificial intelligence voice changer to impersonate Secretary of State Marco Rubio and contact foreign and domestic officials, according to a July 3 State Department cable obtained by the Washington Post.
Authorities have not identified the individual behind the attempts, but believe he was likely trying to gain access to information or accounts.
So far he has contacted at least five officials, “including three foreign ministers, a U.S. governor, and a U.S. member of Congress.” It is unknown whether any of them responded to the AI-generated messages.
The impersonator first used a fake email address to create an account on Signal, a secure encrypted messaging app widely used by government officials. He then used the phone numbers associated with the officials’ accounts to contact them.
“The actor left voicemails on Signal for at least two targeted individuals and in one instance, sent a text message inviting the individual to communicate on Signal,” according to the cable. He also impersonated other State Department personnel using email.
The FBI had warned government officials in May about an “ongoing malicious text and voice messaging campaign” targeting them and their contacts with AI-generated voice messages in an attempt to obtain funds or information.
“If you receive a message claiming to be from a senior U.S. official, do not assume it is authentic,” the bureau said.
That month, someone breached the phone of White House Chief of Staff Susie Wiles and impersonated her in numerous calls to high-ranking officials.
The State Department says it will “carry out a thorough investigation and continue to implement safeguards to prevent this from happening in the future.”
Hany Farid, a professor at the University of California at Berkeley who specializes in analyzing deepfakes, told the Washington Post that phishing schemes targeting government officials do not require sophisticated knowledge and are often successful because officials can be careless about data security.
The rise of AI has made classic phishing schemes much easier. With an AI voice generator, all an attacker needs is 15 to 20 seconds of audio to “clone” someone’s voice and generate a recording of them saying whatever text is specified.
New AI-assisted phishing and impersonation schemes are popping up all over. In June, Canadian agencies warned citizens of similar AI phishing schemes targeting government officials and CEOs. Ukraine also issued a warning that Russian intelligence agents were impersonating its Security Service to recruit civilians for sabotage missions.
Over 50% of phishing emails are now generated by AI, according to a recent study by Barracuda and researchers from Columbia University and the University of Chicago.
Originally Published at Daily Wire, Daily Signal, or The Blaze