Could the AI system generate or disseminate deepfakes or synthetic media that mislead users, impersonate individuals, or cause harm?
- Generative AI systems can produce highly realistic audio, image, or video content that mimics real individuals or events. When created maliciously or shared without clear disclosure, such content, commonly known as deepfakes, can enable identity fraud, political manipulation, reputational damage, harassment, or the spread of disinformation.
- Even when no harm is intended, synthetic content can deceive users if it is not clearly labeled or detectable as synthetic, violating transparency principles and potentially eroding public trust. The risk is heightened in contexts such as journalism, education, political discourse, and public safety.
If you answered Yes, then you are at risk
If you are not sure, then you might be at risk too
Recommendations
- Apply persistent and tamper-resistant watermarks or metadata tagging to all AI-generated media (see the metadata sketch after this list).
- Inform users clearly and accessibly when they are viewing or interacting with synthetic content.
- Monitor outputs for impersonation or misuse risks, especially when names, likenesses, or real-world events are involved.
- Use or integrate deepfake detection tools to identify and flag manipulated content (see the triage sketch after this list).
- Establish policy and UX design patterns that discourage deceptive or malicious uses, and allow users to report suspected deepfakes.
- For deployers, ensure compliance with disclosure obligations (e.g. Article 50 of the EU AI Act) when publishing or distributing synthetic media.
- Where feasible, restrict or control access to generative features capable of identity simulation (e.g. voice cloning, face swapping) through friction, licensing, or tiered access.
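As a minimal illustration of the metadata-tagging recommendation above, the sketch below embeds provenance information in a PNG text chunk using Pillow. This alone is not tamper-resistant, since text chunks are easily stripped when an image is re-encoded, so a production system would combine it with robust invisible watermarking and signed provenance standards such as C2PA content credentials. The keys `AIGenerated` and `AIGeneratedBy` are illustrative choices for this sketch, not part of any standard.

```python
# Minimal sketch: tag a generated PNG with provenance metadata using Pillow.
# NOTE: PNG text chunks are NOT tamper-resistant; treat this as a labelling
# aid, not a substitute for robust watermarking or signed provenance (C2PA).
from PIL import Image
from PIL.PngImagePlugin import PngInfo

def save_with_provenance(img: Image.Image, path: str, model_name: str) -> None:
    meta = PngInfo()
    # Illustrative, non-standard keys chosen for this sketch.
    meta.add_text("AIGenerated", "true")
    meta.add_text("AIGeneratedBy", model_name)
    img.save(path, pnginfo=meta)

def read_provenance(path: str) -> dict:
    with Image.open(path) as img:
        # PNG text chunks written via pnginfo appear in the info dict.
        return {k: v for k, v in img.info.items() if k.startswith("AIGenerated")}

if __name__ == "__main__":
    sample = Image.new("RGB", (64, 64), "white")  # stand-in for generated output
    save_with_provenance(sample, "generated.png", "example-model-v1")
    print(read_provenance("generated.png"))
```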
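The detection recommendation can be wired into a simple triage step before content is published or distributed. The sketch below assumes an injected `scorer` callable that returns a manipulation-probability score between 0 and 1; the detector itself (a commercial API or an open-source model) is outside the scope of the sketch, and the 0.8 threshold is an illustrative assumption that would need calibration against the chosen tool.

```python
# Minimal sketch: route media through a pluggable deepfake scorer and flag
# items above a threshold for human review. The scorer is assumed to come
# from a real detection tool or service; it is not implemented here.
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class ReviewItem:
    path: str
    score: float   # manipulation probability reported by the detector
    action: str    # e.g. "hold for human review"

def triage_media(
    paths: Iterable[str],
    scorer: Callable[[str], float],  # injected detector: path -> score in [0, 1]
    threshold: float = 0.8,          # illustrative; calibrate per detector
) -> List[ReviewItem]:
    flagged = []
    for path in paths:
        score = scorer(path)
        if score >= threshold:
            flagged.append(ReviewItem(path, score, "hold for human review"))
    return flagged

if __name__ == "__main__":
    # Dummy scorer used only to make the sketch runnable end to end.
    dummy_scorer = lambda path: 0.9 if "suspect" in path else 0.1
    print(triage_media(["clip_ok.mp4", "clip_suspect.mp4"], dummy_scorer))
```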
Interesting resources/references
- Article 50 EU AI Act