Requirement 5 of 8

Deepfakes

Explain what deepfakes are, how they can harm people, and what actions to take if someone is affected.



Requirement 5 discussion guide

Use these notes to explain what deepfakes are and why they can be harmful.

What makes a deepfake different

  • A deepfake is media that has been generated or altered by AI to make it look or sound like a real person did or said something they never actually did or said.
  • Deepfakes can involve video, audio, images, or even realistic-looking fake messages and interviews.
  • They are especially dangerous because people often trust photos, recordings, and video clips as evidence.

Understanding deepfakes

Key ideas

  • A deepfake is artificially generated or altered media made to look or sound real.
  • Deepfakes can harm reputation, spread false information, create confusion, or be used for bullying or fraud.
  • If someone is affected, save evidence, report it to a trusted adult or platform, avoid resharing it, and seek help from the right people.

Examples scouts can understand

  • A fake video that makes it look like a student said something embarrassing or harmful.
  • A fake audio clip that sounds like a parent, teacher, or leader giving instructions that were never actually given.
  • An altered photo that places a person's face into a scene they were never part of.
  • A fake celebrity or public figure video used to spread false information or scam people.
  • A fake voice message that tries to trick someone into sending money or sharing private information.
  • A deepfake made to bully, embarrass, or damage the reputation of someone at school or online.

Why these examples are harmful

  • They can make people believe false information.
  • They can damage trust between friends, families, schools, or organizations.
  • They can cause emotional harm, embarrassment, fear, or reputational damage.
  • They can be used for fraud, blackmail, harassment, or impersonation.

Warning signs

  • Lip movements or facial expressions that look slightly unnatural.
  • Audio that sounds strange, overly smooth, or mismatched with the situation.
  • Lighting, shadows, blinking, or head movement that seems inconsistent.
  • Content designed to create panic, urgency, or strong emotions before people have time to check whether it is real.

What to do if someone is impacted

  • Do not reshare or spread the deepfake further.
  • Save screenshots, links, usernames, timestamps, or other evidence.
  • Tell a trusted adult, such as a parent, counselor, or teacher, right away.
  • Report the content to the app, website, or service where it appeared.
  • If the content involves threats, harassment, or serious harm, seek help from the appropriate authorities or support channels.


What to do if someone is affected

Describe what actions to take if you or someone you know is affected by a deepfake.

Immediate steps

  • Do not reshare the deepfake or help it spread.
  • Save evidence such as screenshots, usernames, timestamps, and links.
  • Tell a trusted adult, such as a parent, counselor, or teacher, right away.

Follow-up actions

  • Report the content to the app or website where it appeared.
  • If there are threats, bullying, fraud, or serious harm, get help from the appropriate authorities or support channels.


