A couple of weeks ago, a bipartisan group of senators, including Sens. Joe Manchin and Shelley Moore Capito, introduced Senate Bill 4569: the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act — or the TAKE IT DOWN Act.
This bill takes aim at explicit images that depict, or resemble, real people and are posted online without the person’s consent. You may be familiar with “revenge porn” — when someone shares, usually online, sexually explicit or nude images of someone else without the subject’s permission. Those images are referred to as “nonconsensual intimate images,” or NCII. The TAKE IT DOWN Act covers not only traditional NCII, but also images generated or manipulated by artificial intelligence.
AI has added a whole new dimension to NCII-related crimes. The same programs that can create the already ethically questionable “deepfakes” of people doing or saying things they haven’t done can be taken one step further to create “deepfake pornography”: fake images, videos or audio of real people in sexually explicit scenarios that never occurred. And that technology is increasingly deployed against minors — sometimes by their peers.
Most states already have laws on the books to address traditional revenge porn (real images of real people), but criteria and penalties vary. Only 20 states have laws expressly addressing AI-generated NCII.
West Virginia could have been one of those 20 states, but the Legislature failed to pass any of the bills introduced this past year to tackle sexually explicit deepfakes, including a measure targeting AI images of minors. Bills that should have passed easily with bipartisan support were instead used as leverage for controversial legislation, ultimately killing all of them — popular and unpopular alike.
At the moment, the TAKE IT DOWN Act’s full text is not available for review. However, the senators backing the bill have released information about its provisions. The bill makes it unlawful for a person to knowingly publish NCII on social media and other online platforms; clarifies that consenting to the creation of an image is not the same as consenting to have it published; permits the good faith disclosure of NCII, such as to law enforcement, in narrow cases; requires websites to take down NCII within 48 hours of notice from the victim; and requires websites to make reasonable efforts to remove copies of the images.
In the case of NCII, federal legislation is desperately needed. The patchwork of state revenge porn laws means similar offenders in different states can receive vastly different penalties. It also means the recourse available to victims varies from state to state, offering more or less protection based on location. Since fewer than half of states have laws to address AI-generated NCII, it’s even more important to pass a federal law so that all victims have equal protection and all offenders face equal justice. We hope enough members of Congress agree and the TAKE IT DOWN Act passes swiftly into law.