
DNA exclusive: How AI-based deepfake video edits pose a threat


New Delhi: In DNA on Wednesday, Zee News Editor-in-Chief Sudhir Chaudhary analysed Artificial Intelligence (AI)-based deepfake videos. With the help of this technique, anyone can make a fake video in a matter of a few hours, which can then be used to smear or discredit an individual.

The most popular fake videos are of Hollywood superstar Tom Cruise, and they have gone viral on social media. These videos can confuse anyone, which is why this technology is now being discussed all over the world.

This is deepfake technology: real photos or videos are manipulated with the help of Artificial Intelligence and turned into fake photos and videos for a specific purpose. The results are so convincing that it becomes difficult to distinguish real from fake. This is why it is called the ‘deepfake technique’.

This technique is as dangerous as a ‘nuclear bomb’, because if Tom Cruise can be targeted like this, then such a video can be made of anyone else.

A deepfake video could, for instance, be made of Prime Minister Narendra Modi falsely claiming that he has resigned, and people may well believe it. The same can happen to big celebrities, ministers of any country, big industrialists and common people alike.

Such deepfake videos are often spread with the help of social media, and Facebook is at the forefront. Ironically, Facebook founder Mark Zuckerberg has himself fallen prey to a deepfake.

Some time ago, a fake video of his went viral. Not only this, the same has happened to former US President Barack Obama.

This technology can become a bigger challenge for the world than terrorism, and to understand this, let us look at some figures.

Nearly 180 crore (1.8 billion) photos are uploaded to social media worldwide every day. In other words, the number of photos uploaded to the internet in a week is roughly equal to the current population of the world.

Several crores of these pictures are selfies, and pictures of women outnumber those of men. If cyber criminals get hold of these pictures, deepfake technology makes it easy to convert them into pornographic images.

As per a 2020 study by a company that tracks fake content on the internet, around 1 lakh (100,000) photos posted on social media by women have been converted into pornographic images with the help of this deepfake technique.

Globally, 466 crore (4.66 billion) people use the internet, and 260 crore (2.6 billion) of them are connected to some software or the other. You could say that this technology is like a dangerous missile, and its target is everyone who uses the internet.
