Technology
15 December 2025

Dominatrix Turned Tech Founder Fights Revenge Porn

After suffering intimate image abuse, Madelaine Thomas launches Image Angel to help victims reclaim privacy and hold abusers accountable.

Madelaine Thomas never imagined her life would take such a dramatic turn. At 37, the Monmouthshire native had spent a decade working as a professional dominatrix, finding empowerment and fulfillment in her role. But when clients began leaking her private, explicit images without her consent, the humiliation and distress pushed her in an entirely new direction—one that would put her at the center of a technological battle against revenge porn.

“These were beautiful pictures, I’m not ashamed of the pictures, I’m ashamed of the way that they were used against me by someone who I don’t know,” Madelaine told the BBC, her voice carrying the weight of someone who’s seen both sides of digital vulnerability. The experience was more than a personal violation; it was a wake-up call to the inadequacies of existing protections for victims of intimate image abuse—a crime that affects an estimated 1.42% of the UK’s female population annually, according to the Revenge Porn Helpline.

The emotional toll was immense. Public shaming and the sense of exposure left Madelaine angry enough to do something about it. “I think a lot of people will say, ‘you put a saucy picture out on the internet, what do you expect?’” she said. “I expect dignity, I expect respect, and I expect trust, and I don’t see why those are negotiable.”

So, she did what few would dare: she founded Image Angel, a tech company aimed squarely at combating revenge porn. Launched in 2025, Image Angel deploys invisible forensic watermarking technology to embed unique, undetectable identifiers into images whenever they’re accessed online. This watermark survives screenshots, edits, and even secondary photography, ensuring that if an image is later shared without consent, the abuser can be identified by a data recovery specialist.
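
For readers curious how per-copy tagging works in principle, the sketch below is a deliberately naive Python illustration: it hides a recipient identifier in an image's least-significant bits. Image Angel's actual watermark is proprietary and, unlike this toy, is engineered to survive screenshots, edits, and re-photography; the function names and the 32-bit identifier here are illustrative assumptions only.

```python
# Toy illustration only: Image Angel's real forensic watermark is proprietary and robust
# to screenshots and re-photography, which this naive least-significant-bit (LSB) embed
# is not. The sketch just shows the core idea of tagging each served copy with a hidden,
# per-recipient identifier. Requires Pillow and NumPy.
import numpy as np
from PIL import Image

def embed_recipient_id(src_path: str, dst_path: str, recipient_id: int) -> None:
    """Hide a 32-bit recipient identifier in the blue channel's least significant bits."""
    img = np.array(Image.open(src_path).convert("RGB"))
    bits = [(recipient_id >> i) & 1 for i in range(32)]       # LSB-first bit string
    flat = img[:, :, 2].reshape(-1)                           # blue channel, flattened
    flat[:32] = (flat[:32] & 0xFE) | np.array(bits, dtype=flat.dtype)
    img[:, :, 2] = flat.reshape(img.shape[:2])
    Image.fromarray(img).save(dst_path, format="PNG")         # lossless, keeps the bits

def recover_recipient_id(path: str) -> int:
    """Read the hidden identifier back out of a leaked copy."""
    img = np.array(Image.open(path).convert("RGB"))
    flat = img[:, :, 2].reshape(-1)
    return sum(int(flat[i] & 1) << i for i in range(32))

# Example: serve a uniquely tagged copy to each viewer, then trace a leak back to them.
# embed_recipient_id("original.png", "copy_for_user_1042.png", 1042)
# print(recover_recipient_id("leaked_copy.png"))  # -> 1042
```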

“This technology already exists in Hollywood, it already exists in sports broadcasting, so this is not brand-new technology, it’s just a new application and a new system,” Madelaine explained to the BBC. “And we’ve tested it, we’re partnering with a company that has 30 years’ experience in developing technology, so we know that this is solid, and what we now need to do is test it at scale.”

It’s a remarkable pivot for someone with no formal tech background. Madelaine admitted, “I know that it’s bizarre, it’s crazy to think that someone who was a dominatrix is now a founder of a tech company, but it took someone who has been through it to know the loopholes and the changes that needed to happen.” She credits countless sleepless nights, relentless research, and the willingness to “bug people” who understood the technology for her success.

Image Angel’s impact has already been recognized. The platform was recommended as best practice in Baroness Bertin’s independent pornography review earlier this year, and it has won several awards for innovation. One online platform has already adopted the technology, with talks underway for broader adoption across the digital landscape.

But Madelaine’s solution is just one part of a larger, multi-layered response to a growing crisis. Intimate image abuse, often called revenge porn, is a criminal offense in the UK, carrying penalties of up to two years in prison. Yet for many victims, the trauma extends far beyond the legal consequences for perpetrators. The digital age has made it easier than ever for private images to be distributed without consent, often leaving victims feeling powerless and exposed for years.

Kate Worthington of the South West Grid for Learning (SWGfL) Revenge Porn Helpline has seen firsthand the panic, distress, and self-blame that intimate image abuse causes. “If that self-blame is reinforced by a misinformed friend or service who says ‘well, why did you take those images in the first place?’ that self-blame can really be reinforced, so it’s really important that the response that somebody is provided with is that they have not done anything wrong,” she said. Worthington praised Madelaine’s efforts, emphasizing the need for a multi-layered approach to tackling tech-facilitated gender-based abuse. “No one tool is going to be able to tackle this alone, no one helpline, it needs to be this multi-layered response.”

SWGfL itself operates StopNCII.org, a global tool that generates hashes, or digital fingerprints, of intimate images and videos. These hashes are shared with participating companies so that matching content can be detected and removed before it spreads online, a complementary approach to Image Angel’s forensic watermarking.
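
As a rough illustration of that hash-and-match model, the sketch below uses the open-source imagehash library's perceptual hash. StopNCII's production pipeline uses its own hashing algorithms and, importantly, the image itself never has to be uploaded anywhere; only the hash is shared with participating platforms. The threshold and function names here are illustrative assumptions.

```python
# Minimal sketch of the hash-and-match idea behind tools like StopNCII.org, using the
# open-source `imagehash` library's perceptual hash. This is not StopNCII's actual
# algorithm; the distance threshold below is an arbitrary illustrative choice.
# Requires Pillow and imagehash.
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    """Compute a 64-bit perceptual hash that tolerates resizing and recompression."""
    return imagehash.phash(Image.open(path))

def matches(candidate_hash: imagehash.ImageHash,
            blocked_hashes: list[imagehash.ImageHash],
            max_distance: int = 8) -> bool:
    """True if the candidate is within `max_distance` bits of any hash on the block list."""
    return any(candidate_hash - blocked <= max_distance for blocked in blocked_hashes)

# A participating platform would check uploads against the shared hash list:
# blocked = [fingerprint("victim_supplied_image.jpg")]
# if matches(fingerprint("new_upload.jpg"), blocked):
#     ...  # flag for review or block the upload
```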

Madelaine’s platform goes further, integrating image recognition that scans not just mainstream social media but also the obscure forums where leaked images often surface. When a user uploads a copy of their leaked image, the system searches for identical or similar content across the web; when a match is found, automated takedown requests are sent to websites, social media platforms, and hosting providers. The process is designed to be user-friendly, prioritizing victim anonymity and data security at every turn.
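
The takedown step can be sketched in the same hedged spirit: the snippet below shows one plausible way to group matched URLs by host and issue a consolidated notice per provider. The record fields and the delivery hook are assumptions made for illustration, not Image Angel's actual interface.

```python
# Hypothetical sketch of the automated-takedown step described above: once a scan has
# matched a victim's image to URLs where it reappears, group the hits by host and build
# one takedown notice per host. Field names and the `send_notice` hook are assumptions
# for illustration only.
from collections import defaultdict
from dataclasses import dataclass
from urllib.parse import urlparse

@dataclass
class TakedownNotice:
    host: str              # site or hosting provider receiving the notice
    urls: list[str]        # every matched location on that host
    case_id: str           # internal reference so the victim stays anonymous

def build_notices(matched_urls: list[str], case_id: str) -> list[TakedownNotice]:
    """Group matched URLs by host so each provider receives a single consolidated notice."""
    by_host: dict[str, list[str]] = defaultdict(list)
    for url in matched_urls:
        by_host[urlparse(url).netloc].append(url)
    return [TakedownNotice(host, urls, case_id) for host, urls in by_host.items()]

# for notice in build_notices(scan_results, case_id="IA-2025-0042"):
#     send_notice(notice)  # hypothetical delivery step (email, web form, or platform API)
```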

But the platform doesn’t stop at technical solutions. Recognizing the profound psychological impact of non-consensual image sharing, Image Angel also connects victims with emotional support resources and legal guidance. “It isn’t a crime to consensually send an image to someone,” said Jess Davies, a BBC presenter and herself a survivor of intimate image abuse. “But it is a crime to distribute that without consent and I think that should always be where the blame is.”

The fight against revenge porn is complicated by new technological threats, including AI-generated deepfake images that can be even harder to trace and remove. Legal frameworks are evolving, with many countries imposing severe penalties for non-consensual image sharing. Yet, enforcement remains a challenge due to the borderless nature of the internet and the speed at which content can proliferate. Advocacy groups are pushing for stronger, more uniform laws, better cross-border enforcement, and legislation that addresses the unique dangers posed by deepfakes.

Madelaine’s journey from victim to tech founder is emblematic of a broader trend: those most affected by digital harms are increasingly stepping up to drive innovation and change. Her work offers hope—not just through the promise of more effective content removal, but by shifting the stigma away from victims and onto perpetrators. “The fact that those images could be then shared around where I live or with people I love and used to hurt them, that’s beyond, that’s not my choice, that’s not my mistake, that’s someone being an abuser,” she said.

As Image Angel continues to expand and refine its technology, and as support systems like StopNCII.org grow, the future looks a little brighter for those seeking to reclaim their dignity and safety online. It’s a testament to the power of resilience, innovation, and the unyielding demand for respect in the digital age.