Key takeaways:

  • Fraudsters are using artificial intelligence (AI) to target Canadians, especially seniors, for identity theft and financial scams. Fraudsters can use AI to mimic voices, create fake personas or videos and clone official websites to deceive people.
  • Scammers are enhancing old tricks, like “grandparent scams” or romance scams, with AI technology, making them more convincing and effective.
  • Common scams targeting seniors that increasingly use AI include voice scams, grandparent scams and romance scams.
  • Tips for avoiding AI scams include staying skeptical, verifying identities, never sending money to people you’ve never met, securing devices and staying informed about common scams.

Gone are the days when fraudsters relied solely on telemarketing scams, robocalls or phishing emails. New technology and the rise of artificial intelligence (AI) have given them new ways to scam Canadians, especially seniors and vulnerable people.1

In a nutshell, artificial intelligence technology enables computers and machines to do things that once required human intelligence. For instance, this can include understanding language, recognizing patterns, making decisions and learning from experience.

Unfortunately, AI scammers are tapping into this technology to deceive or trick people. From fake phone calls with cloned voices to online scams using chatbots, cybercriminals are using AI to update and enhance their old methods. In too many cases, these tech-savvy techniques are proving to be highly effective.

Read on to learn some of the most common types of AI scams — and tips for how to avoid them. 

Common types of AI scams targeting seniors

Nowadays, it’s not just cyber fraud, identity theft and scams such as home improvement scams that seniors need to watch out for. AI scams are on the rise, and it’s important to be aware of the common types of these scams targeting seniors.

Voice scams using AI

When you received a scam phone call in the past, you might’ve heard a robotic voice, realized it was fake and hung up. 

But what if the voice on the other end of the line sounds human and like someone you know? This is an example of a deepfake voice scam enabled by AI technology. Fraudsters clone voices and trick people into handing over money or personal details. The scammers often know where their target lives or the names of family members.2

Security experts say this type of voice scam is easy, cheap and convincing, and it’s on the rise. Fraudsters only need a short audio clip, often pulled from social media, to copy someone’s voice.

Grandparent scams enhanced by AI

In traditional grandparent scams, a senior gets a call from someone who pretends to be their grandchild in trouble. The caller asks for money to deal with an emergency, maybe claiming to be in jail or stranded in another country but doesn’t want anyone else to know. Sometimes, a second person poses as a lawyer or police officer and joins the call to demand payment or bail money right away.

Deepfake technology takes grandparent scams to a new level. The calls are more convincing, and some scammers will even use fake AI-generated photos to help make their case.3

Fraudsters might also reach out via email, text or social media, pretending to be a grandchild asking for money or gift cards.4 In one variation, the imposter claims their phone is broken and that they need to share their “new” phone number. They then text or message through social media to ask for money to repair the phone or pay a bill.5

Romance scams with AI chatbots

More and more seniors are getting online to meet new people, especially if they’re looking to start dating again. But it also makes them targets for romance scams, which are running rampant, according to the Canadian Anti-Fraud Centre.

With romance scams, a scammer lures you into a cyber relationship through email or fake profiles on social media or dating sites. They seek to gain your trust and affection — and money. Scammers often go after seniors who are recently divorced or widowed, because this demographic may be vulnerable and have money on hand.

AI has made it easier for romance scams to look believable. Scammers are tapping into tools like voice simulators, face generators and deepfakes to create faux but realistic videos with existing images or footage.6 They may then use these phony videos or images to pose as loved ones or romantic interests, and to convince you to send money or share personal details.

Scammers also misuse AI chatbots, like ChatGPT, which can generate text responses that sound human. It makes romance scams even tougher to detect, as these chatbots can have longer conversations, express emotion, and change how they respond based on how you react. Gaining your trust and making sure you’re attached can help scammers succeed.7

How to detect and prevent AI scams

While AI scams are on the rise, there are a few things you can do to avoid these kinds of scams:

  • Stay skeptical. Don’t trust random calls or messages asking for personal details or money, even if they claim to be from a loved one. Be suspicious of anyone who’s pushy, demanding or in distress. Be wary of friendly or romantic advances from people you meet online, especially if they seem too good to be true or the relationship escalates quickly. When in doubt, hang up the phone or report the messages or account.

  • Verify identities. Call the person who allegedly contacted you and check out the story. Make sure you use a phone number you know for sure is correct — not the one the caller gives you. If the caller claims to be a law enforcement official, hang up and contact your local police directly. 

  • Establish a code word. Agree on a code word with your family and trusted friends, something that isn’t available on social media. If you receive an unusual voice or video call from someone claiming to be them, ask for the code word to confirm the call is legitimate, or reach out through a different means of communication to confirm it’s truly them contacting you.

  • Watch out for audiovisual red flags. Look closely for inconsistencies in lighting, shadows or reflections; unnatural movements, expressions or gestures; and mismatched lip-syncing. Also, be alert to audio inconsistencies, like background noises that don’t match the environment or an unnatural tone and pitch. Other signs that something isn’t right are misaligned or mismatched features (such as eyes, mouth or teeth), poor-quality visuals, blurriness or glitches.

  • Never send money. Scammers often demand that you send money through ways that make it hard to trace or retrieve the funds. For instance, they may ask you to wire money, send cryptocurrency or buy gift cards. Remember, the Canadian Criminal Justice System doesn’t allow someone to be bailed out of jail with cash or crypto.8

  • Converse with caution. Be wary of rapid-fire online chats, as AI-generated responses can come back very quickly. If an online chat feels weird or “off,” you may be talking to a chatbot. AI-generated text or chats often contain repeated phrases or answers that have nothing to do with what you’re talking about.9

  • Secure your devices. Protect your computer, smartphone and tablets with up-to-date security software. Use strong passwords and don’t click on links that seem suspicious.

  • Safeguard sensitive information. Don’t share personal details or documents with anyone over the phone, text message, social media, or the internet. Your financial institution will never ask for personal or financial information like account numbers, PINs, one-time passcodes or passwords through email or text message.

  • Be in the know. The Canadian Anti-Fraud Centre has a wealth of information about recent scams and fraud, including AI. Knowing the signs of fraud can help you spot and avoid a scam immediately.

  • Ask for help. If you’re not sure about something, ask a trusted family member or friend for advice. Don’t be afraid to report suspicious activity to the police — it might stop scammers from tricking others.

Impact of fraud on your finances and health

The impact of fraud on your finances can be significant. The Canadian Anti-Fraud Centre reported that seniors lost more than $9.2 million to emergency scams, also known as grandparent scams, in 2022, a huge uptick from $2.4 million in 2021.10

While the impact of financial losses can be substantial, there are other reasons to stay vigilant. Fraud can also affect your:

  • Emotional well-being: Falling for a scam can rattle anyone. When seniors learn they’ve been tricked, they may feel betrayed, embarrassed, or anxious.

  • Health: The stress of losing money and feeling vulnerable may make physical or mental health problems worse, or even trigger new ones.

  • Trust: Falling for a scam may make it hard to trust others. Seniors may withdraw from social interactions out of shame, making feelings of loneliness and isolation worse.

What to do if you've been scammed

If you or a loved one has been scammed, take these steps:

  • Call the authorities. If your gut tells you something isn’t right, contact local police right away. You can also file a report with the Canadian Anti-Fraud Centre.

  • Notify financial institutions. Tell your bank and credit card companies about the fraud to prevent further losses.

  • Get legal advice. Consult a lawyer to learn your rights and explore legal options.

  • Inform the credit bureaus. Ask TransUnion and Equifax to put a fraud alert on your credit report. Check your credit report on a regular basis.

  • Keep an eye on your accounts. Monitor your bank accounts, credit reports and other financial statements. Report any suspicious activity related to Scotiabank accounts right away.

  • Seek support. Reach out to family members, friends or support groups for emotional support. Remember, the blame primarily lies with the scammers.

Bottom line

If you’ve been scammed, you may feel embarrassed and want to ignore it. But it’s crucial to tell someone — reporting the scam can protect you and others from falling for the same trickery. 

Learn how to protect yourself from scams today