TECHNOLOGY

A horrifying new AI app swaps women into porn videos with a click


From the beginning, deepfakes, or AI-generated synthetic media, have primarily been used to create pornographic likenesses of women, who often find the experience psychologically devastating. The original Reddit creator who popularized the technology face-swapped female celebrities into porn videos. To this day, the research company Sensity AI estimates, between 90% and 95% of all online deepfake videos are nonconsensual porn, and around 90% of those feature women.

As the technology has advanced, easy-to-use no-code tools have also proliferated that let users “strip” the clothes off women in photos. Many of these services have since been forced offline, but the code still exists in open-source repositories and keeps resurfacing in new forms. The latest such site received more than 6.7 million visits in August, according to the researcher Genevieve Oh, who discovered it. It has yet to be taken offline.

There have been other single-photo face-swapping apps, like ZAO or ReFace, that place users into selected scenes from mainstream movies or pop videos. But as the first dedicated pornographic face-swapping app, Y takes this to a new level. It is “tailor-made” for creating nonconsensual pornography of women, says Adam Dodge, founder of EndTab, a nonprofit that educates people about technology-enabled abuse. That makes it easier for its creators to improve the technology for this specific use case, and it entices people who otherwise wouldn’t have thought about creating deepfake porn. “Anytime you specialize like that, it creates a new corner of the internet that will draw in new users,” Dodge says.

Y is incredibly easy to use. Once a user uploads a photo of a face, the site opens up a library of porn videos. The vast majority feature women, though a small handful also feature men, mostly in gay porn. A user can then select any video to generate a preview of the face-swapped result within seconds, and pay to download the full version.

The results are far from perfect. Many of the face swaps are obviously fake, with the faces shimmering and distorting as they turn at different angles. But to a casual observer, some are subtle enough to pass, and the trajectory of deepfakes has already shown how quickly they can become indistinguishable from reality. Some experts argue that the quality of the deepfake may not even matter, because the psychological toll on victims can be the same either way. And many members of the public remain unaware that such technology exists, so even low-quality face swaps can be capable of fooling people.

To this day, I’ve never been fully successful in getting any of the images taken down. Forever, that will be out there. No matter what I do.

Noelle Martin, an Australian activist

Y bills itself as a safe and responsible tool for exploring sexual fantasies. The site’s language encourages users to upload their own face. But nothing prevents them from uploading other people’s faces, and comments on online forums suggest that users are already doing exactly that.

The consequences for women and girls targeted by such activity can be devastating. At a psychological level, these videos can feel as violating as revenge porn: real intimate videos filmed or released without consent. “This kind of abuse, where people misrepresent your identity, name, and reputation and alter it in such violating ways, breaks you down to the core,” says Noelle Martin, an Australian activist who has been targeted by a deepfake porn campaign.

And the repercussions can stay with victims for life. The images and videos are difficult to remove from the internet, and new material can be created at any time. “It affects your interpersonal relations; it affects you with getting jobs. Every single job interview you ever go for, this could be brought up. Potential romantic relationships,” says Martin. “To this day, I’ve never been fully successful in getting any of the images taken down. Forever, that will be out there. No matter what I do.”

Sometimes it’s even more complicated than revenge porn. Because the content isn’t real, women can doubt whether they deserve to feel traumatized and whether they should report it, Dodge says. “If somebody is wrestling with whether they’re even really a victim, it hinders their ability to recover,” he says.

Nonconsensual deepfake porn can also have economic and career impacts. Rana Ayyub, an Indian journalist who became the victim of a deepfake porn campaign, subsequently received such intense online harassment that she had to minimize her online presence, and with it the public profile she needs to do her work. Helen Mort, a UK-based poet and broadcaster who previously shared her story with MIT Technology Review, said she felt pressure to do the same after she discovered that photos of her had been stolen from private social media accounts to create fake nudes.

The UK government-funded Revenge Porn Helpline recently received a case from a teacher who lost her job after deepfake pornographic images of her were circulated on social media and brought to her school’s attention, says its manager, Sophie Mortimer. “It’s getting worse, not better,” Dodge says. “More women are being targeted this way.”

The app’s gay porn, though limited, poses an added threat to men in countries where homosexuality is criminalized, says deepfake researcher Henry Ajder. This is the case in 71 jurisdictions globally, 11 of which punish the offense with death.

Ajder, who has spent the past few years uncovering deepfake porn apps, says he has tried to contact Y’s hosting service and push the site offline, but he is pessimistic about preventing similar tools from being created. Already, another site has popped up that seems to be attempting the same thing. He thinks banning such content from social media platforms, and perhaps even making its creation or consumption illegal, would prove a more sustainable solution. “That means these websites get treated the same way as dark web material,” he says. “Even if it’s driven underground, at least that puts it out of the eyes of everyday people.”

Y did not respond to multiple requests for comment at the press email listed on its site. The registration information associated with the domain is also blocked by a privacy service. On August 17, after MIT Technology Review made a third attempt to reach the site’s creator, it posted a notice on its homepage saying it is no longer available to new users. As of September 12, the notice was still there.


