Few technologies in recent history have carried more potential to harm society than deepfakes.
The manipulative, insidious AI-generated content is already being weaponized in politics and will be pervasive in the upcoming U.S. presidential election, as well as in Senate and House races.
As regulators grapple with how to control the technology, incredibly realistic deepfakes are being used to smear candidates, sway public opinion and manipulate voter turnout.
University of California, Berkeley's School of Information Professor Hany Farid has had enough of all this.
He has launched a project dedicated to tracking deepfakes throughout the 2024 presidential campaign.
In its most recent entry, Farid's site provides three images of President Joe Biden in fatigues sitting at what looks to be a military command center.
The site also references the now infamous deepfake robocalls impersonating Biden ahead of the New Hampshire primary.
In another entry, the site points out that the shape of Trump's ear in a circulating image is inconsistent with several real reference images.
Over recent months, many other widespread deepfakes have depicted Trump being tackled by a half-dozen police officers; Ukrainian President Volodymyr Zelenskiy calling for his soldiers to lay down their weapons and return to their families; and U.S. Vice President Kamala Harris seemingly rambling and inebriated at an event at Howard University.
The harmful technology has also been used to tamper with elections in Turkey and Bangladesh, with countless others likely to come. Meanwhile, some candidates, including Rep. Dean Phillips of Minnesota and Miami Mayor Francis Suarez, have used deepfakes to engage with voters.
Beyond their impact on voters, deepfakes can be used as shields when people are recorded breaking the law or saying or doing something inappropriate.
Research has shown that humans can only detect deepfake videos a little more than half the time and phony audio 73% of the time.
Deepfakes are becoming ever more dangerous because images, audio and video created by AI are increasingly realistic, Farid noted.
Others offer more concrete and specific techniques for spotting deepfakes.
Deepfakes often fail to faithfully reproduce natural physics, such as lighting and reflections.
Look at whether glasses have too much glare, none at all, or whether the glare changes as the person moves. Pay attention to facial hair and whether it looks real. While deepfakes may add or remove mustaches, sideburns or beards, those transformations aren't always fully natural. Also watch the way the person blinks and the way their lips move, as some deepfakes are based on lip-syncing.
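The blink cue above can also be checked programmatically. One widely used heuristic, not mentioned in the article but included here as an illustration, is the "eye aspect ratio" (EAR) from blink-detection research: given six landmarks around an eye, the ratio of vertical to horizontal eye openness drops sharply when the eye closes, so a video with an unnaturally flat EAR signal may warrant suspicion. The landmark coordinates below are hypothetical; in practice they would come from a facial-landmark detector such as dlib or MediaPipe.

```python
# Minimal sketch of the eye-aspect-ratio (EAR) blink heuristic.
# Assumes six (x, y) eye landmarks ordered p1..p6 around the eye,
# as produced by common facial-landmark models (hypothetical values here).
from math import dist

def eye_aspect_ratio(eye):
    """Return the EAR for one eye; the value drops sharply when the eye closes."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = dist(p2, p6) + dist(p3, p5)  # two vertical eyelid distances
    horizontal = dist(p1, p4)               # eye width, corner to corner
    return vertical / (2.0 * horizontal)

# Hypothetical landmarks for an open eye and a nearly closed eye.
open_eye = [(0, 0), (1, 1), (3, 1), (4, 0), (3, -1), (1, -1)]
closed_eye = [(0, 0), (1, 0.1), (3, 0.1), (4, 0), (3, -0.1), (1, -0.1)]

print(eye_aspect_ratio(open_eye))    # 0.5
print(eye_aspect_ratio(closed_eye))  # 0.05
```

In a real pipeline, the EAR would be computed per frame and thresholded to count blinks; a face that never blinks, or blinks at a constant machine-like rate, is a red flag.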
This Cyber News was published on venturebeat.com. Publication date: Fri, 02 Feb 2024 00:43:04 +0000