
Chayn Research Report · 2026

Content warning: domestic abuse, coercive control

Not Just Nudes: Image-based Abuse in Pakistan

A research report by Chayn

5 chapters · 70+ voices · Chayn Research Team

What is image-based abuse?

Chapter 1

It doesn't have to be a nude photo

It was just me. I was alone, but I was just wearing a sleeveless tank top. That was not meant for anybody else. It was just in my phone. It was my picture.

Nadia

What the 80% represents

A public voice

Ayesha Omar is a well-known Pakistani actor and public figure who shared her experience.

Chapter 2

Log kya kahein gay?

What will people say?

How women who have experienced image-based abuse are described in their communities:

ibrat ka nishan

a walking obscenity; a sign of moral ruin

Women who have experienced image-based abuse are openly discussed in families and communities as ibrat ka nishan — their suffering held up as a cautionary lesson. The emphasis is not on the harm she experienced, but on the supposed mistake she made.

Chapter 3

It's about consent, not content

Aisha · Student · Peshawar

girls whose images were posted without consent to a public Facebook page

The central argument

What the image shows

Consent

Whether she chose to share it

Platform policies ask what an image shows. Laws ask whether it is sexually explicit. Neither asks whether she gave her consent. That is the gap these two stories expose.

Fatima · At a private gathering

Before

After that night

Someone took that choice from me. It’s not even up to me to decide what is private — someone took that choice and autonomy from me.

— Survivor

Chapter 4

The experience of many

They are not bad pictures… I was not wearing a scarf, but I was wearing whole clothes… but he said he will leak my pictures.

The pictures I’ve taken, the ones where I appear ready or dressed up — those are private. I don’t want them shared without my permission, not even by friends. I don’t want my friends to post or share them. I don’t like that.

That girl just abandoned all of her studies, then she never came to the University again.

A student was filmed by accident whilst smoking on campus. When the clip went viral and her father saw it, she left her degree and moved city.

Any picture, even a selfie from one’s own mobile. When you go somewhere and someone captures something of you, even if you are an audience member, it can end up anywhere — for example on their public page. It is your private picture. Nowadays the level of privacy and consent is gone; you can be everywhere. So all video and picture are private unless you give your consent.

When the stakes are life and death

The consequences are fatal and collectively felt

Women in the study described consequences that stretched into every corner of their lives. But several stressed that the stakes could literally be a matter of life and death.

A normal picture can be a big problem. Maybe that much that it can lead to a femicide. Maybe that much that the family is exiled from the society.

Cases of image-based abuse are openly discussed in families and communities, turned into moral lessons. Such women are talked about as ibrat ka nishan — their suffering held up as evidence of what happens if a girl talks to a boy, shares something private, or steps beyond accepted limits.

Leaving everything behind

One participant described how harassment following the leaking of her pre-transition images forced her to leave her university entirely.

I was left with no choice. The precautions of this incident meant I had to leave my university and drop my degree that I had been doing for three years and sort of restart everything from scratch in an entirely different city, in a new place.

Chapter 5

Reporting abuse shouldn't be this hard

Where the law falls short

S.20 · Offences against dignity

Protects against false content only

Section 20 covers offences against dignity — but only when someone shares something false: photoshopped, superimposed, or AI-generated. Real photographs shared without consent receive no protection under this section.

Covered

False, photoshopped, or AI-generated images

Falls through

Real photographs shared without consent

S.21 · Offences against modesty

Covers only sexually explicit images

Section 21 deals with offences against modesty. It requires the image to be ‘sexually explicit’ — but no court has yet defined what makes an image sexually explicit. Ordinary images used to abuse ordinary women fall through this gap entirely.

Covered

Images formally classified as sexually explicit

Falls through

Ordinary images — no court has yet defined ‘sexually explicit’

The scale of legal failure

Source: UN Cybercrime Convention monitoring data. Australia is the only country that explicitly recognises images of a person without clothing of religious or cultural significance as intimate images.

Conclusion

The responsibility lies solely with the person who violated my consent

The path forward

The change we want to see