
DeepFake, the technology behind the "Team E" video

The viral hit of the moment is the "Team E" video, a parody of the eighties series "The A-Team" in which the leaders of Spain's political parties take the places of the protagonists. Thanks to DeepFake technology, their faces have been placed in real scenes from the series.

The video was uploaded to the FaceToFake channel on YouTube and has gone viral on social networks and WhatsApp, not only because of its humorous approach but also because of its criticism of politicians after the repeat elections of November 10 (10-N), whose results do little to change the prospects of forming a government.

In the trailer for "Team E, with E for España" we see Pablo Casado's face placed on Hannibal's body, Pedro Sánchez in the scenes of Fénix (Face), Albert Rivera playing Murdock, and Santiago Abascal as M.A. (Mr. T), all thanks to the use of DeepFake.

The most hilarious role goes to Pablo Iglesias. Since the original A-Team has only four members, his face is placed on the body of Amanda Allen, the journalist who accompanies the soldiers of fortune on some of their adventures.

Making a quality montage like "Team E" with traditional special effects would be out of reach for a YouTube channel, but DeepFake algorithms accelerate and automate the work to a surprising degree.

Using computer vision and artificial intelligence, a DeepFake algorithm starts from photos of a person and is capable of placing their face onto another person's body in a video. It also adjusts movements, facial expressions and even mouth gestures when speaking.
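The core idea behind most face-swap DeepFakes is an autoencoder with one shared encoder and a separate decoder per person: the encoder learns pose and expression features common to both faces, and swapping decoders reconstructs one person's face with the other's expression. The sketch below illustrates only that architecture with untrained random weights and hypothetical sizes (`IMG`, `LATENT` are assumptions, not values from any real system); real tools use deep convolutional networks trained on thousands of frames.

```python
import numpy as np

rng = np.random.default_rng(0)

IMG = 64 * 64    # flattened face-crop size (assumed for illustration)
LATENT = 128     # shared latent representation size (assumed)

# One shared encoder learns features common to both faces...
W_enc = rng.standard_normal((LATENT, IMG)) * 0.01
# ...while each person gets their own decoder.
W_dec_a = rng.standard_normal((IMG, LATENT)) * 0.01
W_dec_b = rng.standard_normal((IMG, LATENT)) * 0.01

def encode(face):
    """Map a flattened face image to the shared latent space."""
    return np.tanh(W_enc @ face)

def decode(latent, W_dec):
    """Reconstruct a face from the latent code with a given decoder."""
    return W_dec @ latent

# The swap: encode a frame of person A, but reconstruct it with
# person B's decoder, yielding B's face with A's pose and expression.
face_a = rng.standard_normal(IMG)
swapped = decode(encode(face_a), W_dec_b)

print(swapped.shape)  # (4096,)
```

After training (each decoder learns to reconstruct its own person's faces from the shared codes), this decoder swap is what pastes one politician's face onto another actor's performance, frame by frame.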

The first recorded use of DeepFake was to modify pornographic videos by adding the faces of Hollywood actresses, so the technology was born marked by controversy.

It soon became clear that it would be used for dangerous purposes, such as creating fake videos to damage people's reputations. A sufficiently convincing video could be a political weapon, especially considering that fake news has influenced recent elections in several countries.

Of course, DeepFake can also be used for well-meaning purposes, and the humorous Team E video is proof of that.

Although most projects rely on the same software to create these montages, DeepFake systems today are thoroughly documented and their code is accessible to everyone.

It is therefore inevitable that they will gradually spread, that developers will design new and more refined algorithms, and that techniques will also appear to check whether a video is real or a DeepFake.

In the case of Team E there is no intention to pass the clip off as authentic, nor would its quality allow it, but the development of new techniques makes DeepFakes a looming problem.

We have already seen apps like Zao, which swaps faces automatically, so this phenomenon will soon become widespread. The Team E montage is an example of a harmless DeepFake, but we fear many will not be.
