
Check out the featured post and read more here: https://www.christianity.com/wiki/current-events/how-the-defiance-act-protects-the-vulnerable-in-the-age-of-ai.html
Growing up, my parents warned me never to post photos on the internet. Before the days of social media, this was easy enough to avoid. As I grew older, however, and got an Instagram account, I learned that I could post photos, but they should always be appropriate. As we all know, once something hits cyberspace, it's no longer "ours." It can be found virtually anywhere, even if we delete it.
Today, we live in a different world. With AI, social media, and our technological advancements and obsessions, graphic images can easily be found anywhere. Even when we're not looking for them, they often pop up. But imagine discovering explicit images of yourself online that you never took, consented to, or even knew existed. Scary, right?
According to Omny Miranda Martone, Founder and CEO of the Sexual Violence Prevention Association (S.V.P.A.), these trends have been growing for the past two years. Before then, non-consensual AI deepfakes were far more challenging to create and mass-promote. You needed a high-tech computer, coding skills, and hundreds of photos of the victim's face. While we may have glimpsed bits and pieces of this abuse through public figures, AI-generated deepfake pornography has now spread to ordinary, everyday individuals. Over 98% of deepfakes online are explicit. This isn't just technology drama; it's digital sexual violence that, Martone believes, needs to be taken seriously.
If every person bears God’s image, how should we respond when technology is used to distort, exploit, or weaponize that image — and what responsibility do we carry to advocate for laws that protect the vulnerable?
The Dangers of AI
First, we need to understand that while we shouldn't throw around assumptions or dismiss the benefits of technology, AI tools are being used to create realistic explicit images and videos of real people without their consent. The victims have included women, men, teens, public figures, and now private citizens. But the worst part about these images is that they spread rapidly and often anonymously, with no way to reclaim them or permanently take them down.
As a result, the harm victims suffer is wide and deep—emotional, relational, professional, and spiritual. Unlike traditional crimes, this violation feels invisible, yet permanent. "Now, anyone can make these images in seconds, and they only need one or two images of the victim's face," said Martone. Abusers can pull those images from company websites, LinkedIn, and social sites. It's evident that S.V.P.A.'s response isn't prudishness or panic. This is about justice, rights, consent, dignity, respect, and protection.
The Legal Gap
Currently, no clear federal civil recourse exists for victims of nonconsensual deepfake pornography. This is where S.V.P.A. has sprung into action. Why? Because (1) victims struggle to remove content or seek justice themselves, and (2) the absence of consequences fuels the growth of the problem. With technologies on the rise, our laws are often too slow to follow. Vulnerable people are paying the price, and they shouldn't have to.
In a plea for justice, S.V.P.A. urges individuals to tell Congress to pass the DEFIANCE Act. The act was drafted through work with victims and Congress to provide a civil right of action, enabling victims to seek justice. This bipartisan legislation aims to deter the creation and distribution of nonconsensual deepfake pornography. As of January 13th, 2026, it had passed unanimously in the Senate. The goal of this Act isn't to restrict freedom of speech, creativity, or technology, but just the opposite: to protect people from digitally manufactured sexual exploitation.