The Spone case: what should we do about deepfakes?
Have you ever laughed at those videos that take the image of a famous person and make them do the funniest things? It can indeed be hilarious to see Queen Elizabeth dancing in an alternative Christmas message, but what if you were the direct target of these jokes? What if they were used to ruin your reputation? That is exactly what happened to three teenagers from Pennsylvania, a group of cheerleaders who became victims of cyberbullying. It all started when one of their teammates left the gym after having problems with these girls, and her mother, Raffaella Spone, decided to take matters into her own hands: she used deepfake technology to create compromising videos in which the three teens appeared nude, drinking alcohol or vaping, so that they would get in trouble with the team. Spone then purchased phone numbers on the Internet and started harassing the girls, even encouraging one of them to kill herself if she wanted the nightmare to stop. Eventually, the cheerleaders told their parents what was going on and the woman was arrested. This is just one example of the many risks of a technology that is developing faster and faster! But what is it? And how should we deal with it?

Deepfake is a technique that uses machine learning and artificial intelligence to manipulate and create visual and audio content with a high potential to deceive. Most visual deepfakes follow a procedure in which the real face of a person is exchanged with a fake image showing someone else: the first face is used as input to a deep neural network, which generates a matching face of the other subject. At the beginning, the technique was used mainly on celebrities, but nowadays social media has made it simple to find and use pictures of ordinary people, making everyone a possible target. Because of that, this technology can have a very strong impact on our society. First of all, we rely heavily on our eyes, and deepfakes have become so believable that it is actually difficult to distinguish an authentic video from an artificial one. The cheerleaders of the Spone case, in fact, were afraid to speak up at first precisely because they thought no one would listen to them, since the videos looked so realistic. Secondly, while at the beginning it was not easy to master this technique, it is now very accessible, making it possible for anyone without particular training to create such images, even with free apps. Mrs. Spone was no big hacker, after all!
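To make the procedure described above a little more concrete, here is a minimal, illustrative sketch (in Python/PyTorch) of the classic face-swap setup behind many deepfake tools: a shared encoder and two identity-specific decoders are trained as autoencoders, so that a face of person A can be encoded and then decoded as person B. The layer sizes, the tiny 64x64 face crops and the random tensors are assumptions made purely for illustration; real systems add face detection, alignment, blending and far more data and training.

```python
# Illustrative sketch of the shared-encoder / two-decoder face-swap idea.
# Sizes and names are assumptions for demonstration, not a real pipeline.
import torch
import torch.nn as nn

IMG = 64  # assumed face-crop size: 3 x 64 x 64


class Encoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),    # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),   # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),  # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )

    def forward(self, x):
        return self.net(x)


class Decoder(nn.Module):
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),   # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),    # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(),  # 32 -> 64
        )

    def forward(self, z):
        h = self.fc(z).view(-1, 128, 8, 8)
        return self.net(h)


encoder = Encoder()
decoder_a = Decoder()  # learns to reconstruct person A's faces
decoder_b = Decoder()  # learns to reconstruct person B's faces
loss_fn = nn.L1Loss()
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)


def train_step(faces_a, faces_b):
    """One reconstruction step: each decoder learns its own identity."""
    recon_a = decoder_a(encoder(faces_a))
    recon_b = decoder_b(encoder(faces_b))
    loss = loss_fn(recon_a, faces_a) + loss_fn(recon_b, faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()


# After training, the "swap": encode a face of A, decode it with B's decoder,
# producing B's face with A's pose and expression.
with torch.no_grad():
    face_of_a = torch.rand(1, 3, IMG, IMG)  # stand-in for a real face crop
    fake_b = decoder_b(encoder(face_of_a))
```

The key design choice is that the encoder is shared between the two identities, so it learns pose and expression that are common to both, while each decoder learns to render one specific face; that is what makes the cross-decoding "swap" possible.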
Although it can be employed for cybercrimes, we must not forget that, like any tool, deepfake technology can also be used in positive ways. Indeed, there are many business applications of this technique: face and body swapping could let consumers virtually try on clothes, hairstyles or cosmetics, and in the film industry it could be used to put an actor's face on a stunt double's body or to improve dubbing. Therefore, the solution is not simply to ban it: once the door of progress is opened, we cannot close it so easily. Otherwise, we would risk letting the technology develop illegally and for the worst purposes! So, on the one hand, it is necessary to ensure legal protection for people who fall victim to the dark side of deepfakes: legislators must act to regulate the use of this new technology, encouraging its positive applications and punishing its harmful uses. On the other hand, it is essential to keep improving tools able to detect deepfakes, in order to expose them early.
Nevertheless, as much as technology and legal protection might help, they will not be enough. It is becoming more and more evident that we live in a society where we often do not know what is real and what is not. Truth has always been a relative concept, but when rumours are no longer the only means of manipulation, when we can no longer trust our eyes and ears, then the only thing left is our critical sense. Fake news shows how easily our brain and visual system can be deceived, and this makes us feel terribly vulnerable. No one likes to make mistakes or to be wrong, so the idea of being surrounded by manipulated information undermines the confidence we have in each other, leaving us to deal with confusion and uncertainty. Maybe the first step in facing this new challenge is to accept that sometimes we can be fooled and sometimes we will never find out the truth. The important thing is to learn to question what we see or hear: we need people who are not content to accept whatever news they are given, in whatever form it is presented, but who are smart enough to dig a little deeper. Faced with this distorting mirror, the only weapon we have is to train our minds not to be deceived so easily.
Article by: Laura Tondolo
NOTES
1. J. Kietzmann, L.W. Lee, I.P. McCarthy, T.C. Kietzmann, Deepfakes: Trick or Treat?, in Business Horizons, 2019.
2. Ibid.