Advocates and victims call for AI deepfake porn legislation now.


Key Points:

  • Lack of federal legislation is allowing AI deepfake porn to ruin the lives of victims, mainly women and girls.
  • Advocates are pushing for federal laws to criminalize non-consensual deepfake pornography.

Lawmakers and advocates are calling for federal legislation to criminalize non-consensual AI-generated pornography, commonly known as deepfake porn. Victims, primarily women and girls, are being harmed by the spread of deepfake nude apps. According to Andrea Powell, director of the Image-Based Sexual Violence Initiative, the lack of clear legislation at the federal and state levels leaves law enforcement unable to act when victims seek help. Powell describes AI deepfake nude apps as virtual guns for men and boys.

The spread of AI-generated, sexually explicit content has accelerated rapidly, with deepfake sexual content online rising by over 400% between 2022 and 2023. Advocates stress the urgent need for federal regulation, as current state laws vary widely and lack consistency. Representative Joe Morelle has introduced the Preventing Deepfakes of Intimate Images Act, which aims to criminalize the dissemination of non-consensual deepfakes. Tech companies are also being pressed to take responsibility for non-consensual deepfake content on their platforms.

Victims and advocates are also taking matters into their own hands. Breeze Liu, a former venture capitalist who became a target of deepfake sexual harassment herself, created Alecto AI, an app designed to track and remove deepfake content online, and is now advocating for federal policy changes to combat AI deepfake pornography.

While some tech companies have updated their policies to address non-consensual deepfake content, advocates stress that federal legislation remains essential to effectively combat this issue and protect victims.