
How AI Technology is Making Financial Scams Harder to Detect

Learn how AI is changing financial scams.

Artificial Intelligence (AI) and Machine Learning (ML) technologies are revolutionizing how we live and work. In many ways, this is exciting. After all, there are near-daily announcements about how advancements in AI technology are changing the fields of healthcare, finance, education, and environmental protection.

Unfortunately, for all the positive impacts AI and ML technologies have had on our lives, there is a dark side. As the line between human and machine continues to blur, these life-saving technologies usher in a wave of increasingly sophisticated financial scams.

Using AI, scammers can now make shockingly convincing phone calls and FaceTime calls to unsuspecting victims. They pose as family members, close friends, financial institutions, or even government agents from the IRS. What’s more, experts warn that it’s no longer just the most gullible among us who are falling victim to such schemes. According to new data from the FTC, people in the United States reported losing almost $8.8 billion to fraud in 2022 alone, up more than 30% over the previous year.

A Hacker’s Toolbox

The best way to protect yourself is to be skeptical of phone calls and texts, but that’s not always enough. So, we’ve got a few more tips to help you safeguard yourself and your money. First, here are some of the tools today’s hackers are using to scam unsuspecting victims out of their money:

Basic Information

This isn’t new, but hackers need only a few basic details, like a relative’s name, your birth date, or places you have lived, to spin a convincing story. Unfortunately, much of that key information can be found online with very little effort. All it takes is a quick search of sites like BeenVerified, Spokeo, PeopleFinders, Ancestry.com, or Classmates.com. Most often, however, hackers pull vital information straight from your social media posts.

Caller ID Spoofing

Spoofing tools allow hackers to change the caller ID information on your phone screen, making it look like the call originates from a trusted source. With this technology, scammers can mimic government agencies (like the IRS), financial institutions (like Maps), or even your dear old Aunt Mary.

AI Voice Generators

Advancements in AI voice generation now allow hackers to impersonate almost anyone convincingly—all it takes is a short audio clip. The AI algorithms can analyze voice recordings and mimic the unique characteristics of a person’s speech patterns, tone, and intonation. They can then use these sophisticated voice morphing tools to create highly realistic impersonations, making it difficult for individuals to discern whether they are speaking to a real person or a scammer.

Deep Fake FaceTime Calls

With software not unlike a TikTok or Snapchat filter, creative scammers can generate deepfake videos or FaceTime calls that are hard to distinguish from the real thing.

Tech-Driven Authority and Organization Impersonation

Hackers can use modern AI technology to impersonate law enforcement agencies, government bodies, financial institutions, or major corporations. They may call and claim you are facing legal trouble, have outstanding debts, or need to resolve other urgent issues that require immediate action.

How to Protect Yourself

Whether you are a tech-smart teen or a grandparent with a landline, you need a combination of awareness, skepticism, verification, and knowledge to avoid AI phishing scams. Here are a few proactive steps you can take to protect yourself from falling victim to a savvy scammer’s tactics.

1. Keep your personal information private

Set your social media accounts to private and monitor what you post online. Google yourself (put quotation marks around your name for an exact match) to find out what personal data of yours is readily available online.

2. Establish a code word

Protect yourself from potential scammers in advance by setting up a family code word: a word or phrase only you and your family members know. If you get a suspicious call, ask the caller for the code word before you do anything else.

3. Look for inconsistencies

If the face of the person you are FaceTiming with seems asymmetrical, blurry, or warped, end the call and call them back yourself; do not allow them to initiate the callback. Likewise, if a loved one sounds strange on a phone call, doesn’t recall personal information, or says things that seem out of character, end the call and call them back.

4. Verify before you act

Don’t take immediate action if you get a call, text, or email from a loved one asking for money or help. First, verify the situation through a different contact method. If you can’t reach your loved one, try to contact them through a friend, coworker, or other family member.

5. Beware of urgent requests and unusual payment methods

Scammers often use fear, urgency, and a sense of panic to get what they want. These psychological techniques manipulate their targets into taking action before rational thought sets in. If a caller asks you to wire money, send cryptocurrency, buy gift cards, or give them your card numbers or PINs, consider it a red flag. If the caller asks you to send large sums of money through a courier or asks you to lie to your financial institution about why you are withdrawing a large sum, report the activity to your bank or credit union as well as the local police, the FBI, and the Federal Trade Commission (FTC).

6. Trust your instincts

If something feels off, assume that it is and take precautions. And if you suspect that you have been scammed, contact Maps directly at 503.588.0181 and report the activity.
