Can AI be used to support male survivors of sexual harms?

We Are Survivors is embracing the technology.

For our podcast, How To Date Men, we caught up with Grahame Robertson and Daniel Griffin from We Are Survivors to talk about how the organisation's new app aims to support male survivors of sexual harms.

In the conversation, we talk about the power of AI, how the law keeps up with developments in technology, and why it may not always be obvious who the victims of sexual harms are.

Listen to the episode

We Are Survivors has recently launched a new app. It's called We Are Okay. What does the app provide to male survivors of sexual harms?

Grahame: It's specifically tailored towards the needs of male survivors of sexual harm. It has a huge amount of really practical interactive mental health information - that can be something as simple as just reading about an issue or you can take part in things like guided meditations. If you can't sleep, there are videos on the platform that will help you feel calm and get some sleep.

The app also has an AI chatbot that we've called Mick. It's been an interesting experience working with Mick - in effect, we're managing our first ever AI member of staff. We had to train Mick on how to talk the way We Are Survivors talks. We're not for a second pushing this as an alternative to human one-to-one contact. This is not about us abdicating responsibility for engaging with male survivors. It's a tool, as AI should be.

Do you think the law is keeping up or evolving quickly enough in response to things like the emergence of AI?

Daniel: The law has kept up. Greater Manchester Police was the first to bring charges against someone who used AI to create indecent images - they were sentenced to 24 years. This was back in 2024.

Hopefully that's now opened the door for where the police and the law can go. We're now coming into territory where we may not have human victims or survivors - it's always evolving.

If there are no human victims in the creation of those AI images, where does the harm lie? What are the police trying to prevent?

Daniel: They're trying to prevent indecent images of children, because whether those images are of real children or AI generated, it's still an illegal activity. There is still a victim, because AI is generated from real images - so there's still going to be a victim at some point in the process.

If someone has been abused or wants to get some help with an experience that they're struggling with, what advice or guidance would you give them?

Daniel: It's not your fault. That's the first thing. There are organisations here to support you. WeAreSurvivors.org.uk is based in Greater Manchester, and there are other organisations based across England.

It's okay not to reach out at first - when you do reach out to us, you will be believed. There is also a national 24-hour support helpline that can be contacted. Don't think for one moment that you'll ring up and be judged. You phone when you need to phone - you can put the phone down on us multiple times. It's okay.

The 24/7 support helpline is 0808 500 2222 - that can be accessed at any time. Reach out when the time is right. We're here to support you - if we can't support you locally, we will find an organisation somewhere that can. You're not alone in this.

