(Image credit: TheNextWeb)
According to a new study, deepfake video clips of movie remakes that don’t actually exist led roughly half of participants to falsely remember the nonexistent films. The study also found that simple text descriptions of the fake movies induced similar rates of false recollection.
Gillian Murphy of University College Cork in Ireland and colleagues at Lero, the Science Foundation Ireland Research Centre for Software, conducted the study.
Deepfake Movie Remakes
Deepfakes are videos edited or manipulated with artificial intelligence (AI). In these videos, the face or voice of a person in existing footage is swapped for someone else’s, often producing strikingly lifelike and convincing visual and auditory results.
Murphy and her team invited 436 participants to complete an online survey in order to explore the potential risks and benefits of deepfake technology.
In the survey, participants watched deepfake videos of made-up film remakes recast with new actors, such as Will Smith replacing Keanu Reeves as Neo in “The Matrix,” and Brad Pitt and Angelina Jolie starring in “The Shining.”
Interestingly, some participants even reported remembering the phony remakes as being better than the original films.
Deepfake: Can it Change Memory?
However, the false-memory rates produced by the plain text descriptions were comparable, suggesting that deepfake technology may hold no special advantage over other techniques or tools when it comes to memory manipulation.
The study also showed that most individuals were uneasy about the prospect of recasting movies with deepfake technology. Participants voiced concerns about possible violations of artistic integrity and the disruption of the shared social experience of film.
These findings have implications for the development and regulation of deepfake technology in the motion picture industry. The researchers said the study showed that deepfakes are not fundamentally more effective at altering memories than non-technical methods, adding that while deepfakes raise serious concerns for other reasons, such as non-consensual pornography and bullying, they are not uniquely powerful at distorting memory.
Deepfake technology uses machine learning techniques and deep neural networks to analyze a target person’s facial expressions and speech patterns and map them onto source footage. This approach produces videos whose content is manufactured but which look authentic.
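To make the mapping step concrete, here is a minimal sketch (in Python with NumPy) of just the final compositing stage of a typical face-swap pipeline: blending a synthesized face patch onto a target frame using a soft mask. This is a simplified illustration, not a real deepfake implementation; in practice the `generated_face` would come from a trained neural network (e.g. an autoencoder or GAN) aligned via facial landmarks, which is far beyond this snippet.

```python
import numpy as np

def blend_face(target_frame, generated_face, mask):
    """Composite a generated face patch onto a target frame.

    This models only the blending step of a face-swap pipeline.
    mask values lie in [0, 1]: 1 = use the generated face,
    0 = keep the original frame pixel.
    """
    mask = mask[..., np.newaxis]  # broadcast the mask over RGB channels
    blended = mask * generated_face + (1 - mask) * target_frame
    return blended.astype(np.uint8)

# Toy 4x4 RGB "frame" and a synthetic "face" patch of the same size
frame = np.zeros((4, 4, 3), dtype=np.uint8)     # all-black target frame
face = np.full((4, 4, 3), 200, dtype=np.uint8)  # uniform grey fake face

# Hard-edged toy mask: swap only the 2x2 centre region
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0

result = blend_face(frame, face, mask)
print(result[1, 1])  # centre pixel comes from the generated face -> [200 200 200]
print(result[0, 0])  # border pixel stays untouched -> [0 0 0]
```

Real systems use feathered (gradually fading) masks and color correction at the seam so the swapped region merges invisibly with the surrounding frame, which is what makes the results so convincing.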
Because of their potential for misuse and the ethical issues they raise, these videos have drawn considerable attention. They can be used to produce fake footage that purports to show people saying or doing things they never actually did.