Scams targeting Americans are surging. Since 2020, the FBI has received more than 4.2 million reports of fraud, amounting to $50.5 billion in losses.
Imposter scams in particular are on the rise in the age of artificial intelligence (AI). Criminals are using deepfakes, or media that is generated or manipulated by AI, to gain your trust and scam you out of your hard-earned money.
Deepfakes can be altered images, videos or audio. They may depict people you know, including friends and family, or public figures such as celebrities, government officials and law enforcement.
How To Detect a Deepfake
Look for Inconsistencies:
- Are any of the facial features blurry or distorted?
- Does the person blink too much or too little?
- Do the hair and teeth look real?
- Are the audio and video out of sync?
- Is the voice tone flat or unnatural?
- Does the image or video show odd or unnatural shadows or lighting?
Tips to Stay Safe
- STOP AND THINK. Is someone trying to scare you or pressure you into sending money or sharing personal information?
- VERIFY the legitimacy of people and requests by using trusted numbers, official websites and online reverse image/video search tools.
- CREATE CODEWORDS or phrases with loved ones to confirm identities.
- LIMIT YOUR DIGITAL FOOTPRINT. Photos, voice clips and videos can be used to train deepfake models.
- NEVER REPOST videos or images without verifying the source.