The widespread manipulation of sexual photos and videos using artificial intelligence (AI) tools has triggered public concern. The technology has been misused to create fake sexual images and videos instantly and at massive scale. Ironically, this sophistication has intensified objectification, with women as the primary victims of visual exploitation.
Even more alarming is the sexual exploitation of children in digital spaces, which continues to rise.
A 2024 report ranked Indonesia third globally with 1,450,403 recorded cases, placing the country among those with the highest incidence of online pornography.
Dr. Ratna Noviani, a gender studies academic at the UGM Graduate School (SPs), emphasized that advances in AI are not merely technological progress but also a new tool for perpetuating systemic gender-based violence.
“Digital spaces, often glorified as spaces of ‘freedom’, ultimately become new arenas for the reproduction of gender-based violence, including sexual violence targeting women. This not only perpetuates existing forms of violence, but also creates new ones, such as online gender-based violence that is massive, anonymous, and difficult to stop,” she explained on Friday (Jan. 9).
Dr. Noviani highlighted women’s growing unease in the digital era, as digital spaces now function as a double-edged sword.
On one hand, social media provides space for women to appear, speak, and assert their presence. There is a sense of freedom for women to build visibility and demonstrate agency.
However, at the same time, this visual presence is highly vulnerable to being used as “raw material” for digital sexual violence.
“Here we see a fundamental contradiction: women are encouraged to be present and visible in digital spaces, yet that very visibility makes their bodies and images objects to be controlled, exploited, and attacked,” she stressed.
Regarding the increasing use of AI to manipulate content, Dr. Noviani views the phenomenon as an evolution of voyeuristic culture.
She observed deeply imbalanced gender-based power relations, noting that women have long been positioned through the logic of the male gaze as objects of observation, sexual objects, and spectacles for male pleasure.
“Morphing practices do not change this male gaze logic; instead, they refine and perpetuate it in increasingly polished, realistic, and invasive digital visual forms,” she explained.
Dr. Noviani, who also heads the Master’s Program in Cultural and Media Studies, further elaborated on her finding that digital and AI technologies are not neutral.
These technologies are built from data, designs, and social imaginations that are themselves imbued with masculine bias.
Even in everyday practice, AI assistants are often gendered as feminine, given names, voices, and characteristics that signal compliance and service.
“This shows that the logic of technology itself has long reproduced women’s position as objects. Therefore, digital visual violence such as morphing is not merely an evolution of voyeurism, but a continuation of the same structural problem,” she explained.
In an effort to break this cycle, Dr. Noviani proposed solutions centred on building collective awareness among digital media users.
She noted that the public must realise that liking, commenting on, or sharing AI-manipulated content can make someone a secondary perpetrator.
She urged society to be more critical and discerning, and to refrain from spreading such false content.
“Building collective awareness means shifting our position from passive spectators to active participants in solutions. Together, we must become critical users of digital and AI technologies, fully aware that every click, like, and share carries ethical and political consequences,” she concluded.
Author: Aldi Firmansyah
Editor: Gusti Grehenson
Post-editor: Rajendra Arya
Illustration: Freepik