As the Canadian government moves to criminalize the distribution of sexualized "deepfakes," advocates are urging more robust measures, including expedited removal processes for such images. The proposed Bill C-16 aims to amend laws against the non-consensual sharing of intimate images to encompass AI-generated visual representations, responding to concerns raised by police and victims' groups. Michelle Abel, a representative of the National Council of Women of Canada, highlighted the need for immediate action similar to the U.S. Take It Down Act, which requires social media companies to remove non-consensual images within 48 hours. While some provinces have adopted laws allowing victims of "revenge porn" to bring civil suits, the process for having images removed can be lengthy. Experts emphasize the need to hold online platforms accountable for swiftly addressing reported harmful content, as delays can compound the harm to victims.
Why It Matters
The issue of non-consensual sharing of intimate images has gained increasing attention globally, with legislation in other countries indicating a trend towards stricter controls on online harms. The U.S. Take It Down Act exemplifies a proactive approach, while Canada previously attempted to legislate similar measures under Bill C-63, which was halted. The proliferation of AI-generated content complicates the landscape, necessitating updated legal frameworks to protect victims effectively. As technology advances, the urgency of addressing these challenges becomes more pronounced, highlighting the need for comprehensive legal solutions that include prompt removal of harmful content from online platforms.