Deepfake pornography targets celebrities
Deepfakes first emerged on the internet in the form of non-consensual pornography at the end of 2017. Since then, an entire deepfake porn ecosystem has developed online.
It is an almost entirely gendered phenomenon, with female celebrities among the earliest targets (victims included Ariana Grande, Daisy Ridley, Emma Watson, Katy Perry, Jennifer Lawrence, Cara Delevingne, Sophie Turner, Maisie Williams and Gal Gadot). Kristen Bell, herself a victim of deepfake porn, highlighted major concerns about consent, regardless of whether a video carries a watermark, and called for the internet to be more responsible.
Now ‘normal’ women and minors are being targeted in deepfake pornography too. Whilst such content is abundant online, we refrain from drawing attention to any specific case studies of this malicious form of synthetic content.
Because this content is created with malicious intent to harm, there are calls across the globe for criminal charges against those who create, share and publish non-consensual deepfake porn and revenge porn.
Criminalising revenge porn and cyber flashing has been recommended as one way to crack down. In the US, deepfake porn is illegal in the states of Virginia and California, whilst Scotland has made it illegal to distribute. It could also potentially be covered by an EU code of practice, under which regulators can fine technology companies up to 6% of their global turnover if they don’t crack down on deepfakes.
Companies including Twitter, Reddit and PornHub have banned deepfake porn, yet new videos continue to emerge and unflagged videos remain on both Reddit and PornHub.
DISCLAIMER: To protect victims of deepfake pornography, we won’t be including all cases. We reference only a few to highlight the damage such content can cause. We do not want to encourage people to use this technology with criminal intent. We believe consent is key.