
Tuesday, May 15, 2018

The actress Gal Gadot, star of "Wonder Woman", in a sex video? The implications are terrifying.



We're all fucked: AI now pastes anyone's face into porn videos


There is a video of the actress Gal Gadot having sex with her half-brother on the internet, but there is one detail: it is not Gadot's body. It is a collage of the actress's face on a body not unlike her own, taken from a preexisting incest-themed scene. The video was created with a machine learning algorithm, using easy-to-access materials and open-source code that anyone with a working knowledge of the technology could put together.

The material does not fool anyone who looks at it carefully. At times the facial tracking slips and the result lands squarely in the uncanny valley. At first glance, though, it works. Even more so if we consider that it is the work of a single person, a Reddit user named 'deepfakes', not a huge studio capable of re-creating a young Princess Leia in Rogue One with the help of CGI. The user, by the way, relies on open-source machine learning tools such as TensorFlow, which Google makes available for free to researchers, academics and anyone interested in machine learning. Like the Adobe tool that can make people appear to say anything, and the Face2Face algorithm that can alter a recorded video with real-time facial tracking, this new brand of fake pornography shows that we are about to live in a world where it will be ridiculously simple to create believable videos of people doing and saying things they never actually did. Including sex.


So far, deepfakes has posted pornographic videos on Reddit with the faces of Scarlett Johansson, Maisie Williams, Taylor Swift, Aubrey Plaza and Gal Gadot. I contacted the representatives and agencies of these actresses to report the fake videos and will update the story if they respond. Forged celebrity pornography, in which images are edited so that celebrities appear without clothing, is nothing new, and it has many fans. Deepfakes himself has a sizable following for his work. He is, after all, the pioneer.


According to deepfakes, who did not want to reveal his identity to avoid public scrutiny, the software builds on several open-source libraries, such as Keras with a TensorFlow backend. To compile the celebrities' faces, deepfakes says he used sources such as Google image search, stock image banks and YouTube videos. Deep learning consists of networks of interconnected nodes that run computations on the data fed into them. In this case, he trained the algorithm on pornographic videos and Gal Gadot's face; after this "training", the nodes organized themselves to complete a specific task, such as manipulating video in real time.
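The "network of nodes trained on data" described above can be illustrated with a toy sketch. This is not deepfakes' actual code (his models are much larger convolutional networks built on Keras/TensorFlow); it is a minimal single-hidden-layer autoencoder in NumPy, with made-up data standing in for face patches, showing how the nodes adjust themselves to reconstruct their training inputs:

```python
import numpy as np

# Toy illustration of a "network of interconnected nodes" learning
# from the data fed into it: a tiny autoencoder that learns to
# reconstruct 64-dimensional vectors standing in for face patches.
# All sizes and data here are invented for the sketch.

rng = np.random.default_rng(0)

def train_autoencoder(faces, hidden=16, epochs=200, lr=0.1):
    """Train encoder/decoder weights to reconstruct the input faces."""
    n, d = faces.shape
    W_enc = rng.normal(0, 0.1, (d, hidden))   # encoder weights
    W_dec = rng.normal(0, 0.1, (hidden, d))   # decoder weights
    losses = []
    for _ in range(epochs):
        code = np.tanh(faces @ W_enc)         # compress each face
        recon = code @ W_dec                  # reconstruct it
        err = recon - faces
        losses.append(float(np.mean(err ** 2)))
        # backpropagate the reconstruction error through both layers
        grad_dec = code.T @ err / n
        grad_code = err @ W_dec.T * (1 - code ** 2)
        grad_enc = faces.T @ grad_code / n
        W_dec -= lr * grad_dec
        W_enc -= lr * grad_enc
    return W_enc, W_dec, losses

faces = rng.random((32, 64))                  # 32 fake "face" vectors
W_enc, W_dec, losses = train_autoencoder(faces)
print(f"reconstruction loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The point of the sketch is only the training loop: the weights start random, and repeated passes over the data drive the reconstruction error down.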

Excerpt from the full video, hosted on SendVids, with the face of Gal Gadot on the body of a pornographic actress.

Artificial intelligence researcher Alex Champandard told me via e-mail that a good video card, accessible to the average consumer, could render this kind of effect in a few hours, and that even an ordinary processor without a video card could do the same, albeit more slowly, over a few days. "It's not that complicated anymore," Champandard said. The ease with which something like this can be done is frightening: technique aside, all that is needed is a handful of your photos, and many of us have already done our part to help build huge databases of our own faces. (Remember: 24 billion selfies were uploaded to Google's photo service between 2015 and 2016.)



In an e-mail conversation, deepfakes said he was not a professional researcher in the field, but a programmer interested in machine learning. "I just found a clever way to do face-swapping," he says of his algorithm. "With hundreds of face images, I can easily generate millions of distorted images to train the network. After that, if I feed in someone else's face, the network will think it is just another distorted image and try to make it look like the face it was trained on."
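The trick deepfakes describes can be sketched in a few lines. This is an assumption-laden simplification, not his method: faces are reduced to toy vectors, "distortion" is additive noise instead of image warping, and the "network" is a closed-form linear regression rather than a deep model. But the mechanism is the same: train a model to undo distortions of one face, then feed it a different face, which it treats as just another distortion to correct.

```python
import numpy as np

# Sketch of the distorted-image trick: generate many corrupted copies
# of one target face, fit a model that maps corrupted -> original,
# then feed in a *different* face. The model pulls its output toward
# the face it was trained on. Toy 64-dim vectors stand in for images.

rng = np.random.default_rng(1)
D = 64

target_face = rng.random(D)                    # stand-in for the trained face

# "Hundreds of distorted images": noisy copies of the target face.
distorted = target_face + rng.normal(0, 0.3, (500, D))
targets = np.tile(target_face, (500, 1))

# Fit W so that distorted @ W ~= target (ridge regression, closed form).
ridge = 1e-3 * np.eye(D)
W = np.linalg.solve(distorted.T @ distorted + ridge, distorted.T @ targets)

# Now feed in a completely different face.
other_face = rng.random(D)
swapped = other_face @ W

dist_before = np.linalg.norm(other_face - target_face)
dist_after = np.linalg.norm(swapped - target_face)
print(f"distance to trained face: {dist_before:.3f} -> {dist_after:.3f}")
```

Because the model only ever learned to output the target face, anything fed into it comes out looking more like that face than it went in, which is exactly why someone else's body ends up wearing the trained face.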



In a Reddit post, deepfakes mentioned using an algorithm similar to one developed by researchers at Nvidia, which uses deep learning to, for example, quickly turn a summer scene into a winter one. The researchers, in turn, declined to comment on this possible application. The results in the examples deepfakes has posted are far from perfect. In Gadot's video, for instance, there are moments when a box pops up around her face through which the original image shows, and her eyes and mouth do not match what the actress is saying. But if you are willing to believe, it could very well be Gadot.
In other videos, deepfakes' work is even more convincing. Pornographic actress Grace Evangeline told me via Twitter direct message that actresses in the business are used to having their work distributed for free on sites such as SendVid, where the Gadot video was posted without her permission, but that this time it was something different, something she had never seen. "Something important that always has to be present is consent," Evangeline said. "Consent in private life and also in films. Creating fake sex scenes with celebrities takes away the right to consent; it's wrong." Even for those who make a living in front of the cameras, violating personal boundaries is a problem. I showed the video to Alia Janine, a pornographic actress who retired 15 years ago. "It's disturbing," she said over the phone.




I asked deepfakes whether he had stopped to reflect on the ethical implications of his technology, whether he had considered issues such as consent, revenge porn and blackmail when creating the algorithm. His answer: "Any technology can be used with bad intentions, and it's impossible to prevent that," he said, comparing his work to the technology that re-created Paul Walker after his death for the movie Fast and Furious 7. "The main difference is how easy it is for anyone to do it. I don't think it's a bad thing for ordinary people to get into machine learning research." In ethical terms, the implications are "huge," according to Champandard. Malicious use of the technology cannot be prevented, but it can be countered in some way.


"We need a loud and clear public debate on the subject," he said. "Everyone needs to know how easy it is to create fake videos and images, to the point that in a few months it will be difficult to determine what is false and what is not. Of course, all of this has been possible for a long time, but it used to require a lot of resources and skilled professionals; today it can all be done by a single programmer with a relatively recent computer." Champandard said researchers can already start developing technology to detect fake videos and help determine what is real and what is not. Internet regulations could also be revised to govern what happens when this kind of forged material, and the abuse that comes with it, appears. "Oddly enough, that's a good thing," Champandard said.


Source: VICE Brasil
