Since Motherboard discovered deepfakes in late 2017, the media and politicians have focused on the dangers they pose as a disinformation tool. But the most devastating use of deepfakes has always been in how they're used against women: whether to experiment with the technology using images without women's consent, or to maliciously spread nonconsensual porn on the internet.

DeepNude is an evolution of that technology that is easier to use and faster to create than deepfakes. DeepNude also dispenses with the idea that this technology can be used for anything other than claiming ownership over women's bodies.

"This is absolutely terrifying," Katelyn Bowden, founder and CEO of the revenge porn activism organization Badass, told Motherboard. "Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo. This tech should not be available to the public."

This is an "invasion of sexual privacy," Danielle Citron, professor of law at the University of Maryland Carey School of Law, who recently testified to Congress about the deepfake threat, told Motherboard. "Yes, it isn't your actual vagina, but… others think that they are seeing you naked," she said. "As a deepfake victim said to me, it felt like thousands saw her naked; she felt her body wasn't her own anymore."

DeepNude launched on June 23 as a website that shows a sample of how the software works, along with downloadable Windows and Linux applications.

Motherboard downloaded the application and tested it on a Windows machine. It installed and launched like any other Windows application and didn't require technical expertise to use.

In the free version of the app, the output images are partially covered with a large watermark. In a paid version, which costs $50, the watermark is removed, but a stamp that says "FAKE" is placed in the upper-left corner. (Cropping out the "FAKE" stamp or removing it with Photoshop would be very easy.)

Motherboard tested it on more than a dozen images of women and men, in varying states of dress, from fully clothed to string bikinis, and with a variety of skin tones. The results vary dramatically, but when fed a well-lit, high-resolution image of a woman in a bikini facing the camera directly, the fake nude images are passably realistic. The algorithm accurately fills in details where clothing used to be: the angles of the breasts beneath the clothing, nipples, and shadows.

But it's not flawless. Most images, and low-resolution images especially, produced some visual artifacts. DeepNude failed entirely with some photographs that used odd angles, lighting, or clothing that seemed to throw off the neural network it uses. When we fed it an image of the cartoon character Jessica Rabbit, it distorted and destroyed the image altogether, throwing stray nipples into a blob of a figure.
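The article doesn't detail DeepNude's internals, but apps of this kind are typically built on image-to-image translation networks in the pix2pix family: a generator trained on paired before/after photos learns to map one image directly to another in a single forward pass. As a minimal sketch of that general technique, here is an untrained toy generator in PyTorch; the `TinyTranslator` architecture and the `generator.pt` checkpoint are hypothetical stand-ins, not DeepNude's actual model.

```python
# Minimal sketch of pix2pix-style image-to-image translation inference.
# Architecture and weights are illustrative placeholders only.
import torch
import torch.nn as nn

class TinyTranslator(nn.Module):
    """Toy encoder-decoder generator: maps one RGB image to another."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=4, stride=2, padding=1),    # 256x256 -> 128x128
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1),  # 128x128 -> 64x64
            nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1),  # 64x64 -> 128x128
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 3, kernel_size=4, stride=2, padding=1),    # 128x128 -> 256x256
            nn.Tanh(),  # output pixels in [-1, 1], as in pix2pix
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

if __name__ == "__main__":
    model = TinyTranslator().eval()
    # A real app would load trained generator weights here, e.g.:
    # model.load_state_dict(torch.load("generator.pt"))  # hypothetical checkpoint
    photo = torch.rand(1, 3, 256, 256) * 2 - 1  # stand-in for a normalized input image
    with torch.no_grad():
        translated = model(photo)
    print(translated.shape)  # torch.Size([1, 3, 256, 256])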
"This is absolutely terrifying," Katelyn Bowden, founder and CEO of revenge porn activism organization Badass, told Motherboard. DeepNude also dispenses with the idea that this technology can be used for anything other than claiming ownership over women’s bodies. DeepNude is an evolution of that technology that is easier to use and faster to create than deepfakes. But the most devastating use of deepfakes has always been in how they're used against women: whether to experiment with the technology using images without women's consent, or maliciously spreading nonconsensual porn on the internet. Since Motherboard discovered deepfakes in late 2017, the media and politicians focused on the dangers they pose as a disinformation tool.