Actress and BJP MP Hema Malini raised the issue of deepfake technology targeting famous celebrities in the Lok Sabha on 27 March. She also spoke about the impact on mental health of trolling and the misuse of AI and deepfakes.

Raising the issue during Zero Hour, Hema Malini said that artificial intelligence and deepfake technology have engulfed the world. Although the technology has many advantages, it is also being used to target famous celebrities associated with the film industry. These celebrities have worked hard to earn their name, fame and popularity, and many of them have fallen victim to this misuse.
Hema Malini said, ‘These videos go viral and have a damaging effect on the victim’s mental health. This cannot be taken lightly.’
Apart from this, she also raised the issue of brutal comments made on social media about celebrities’ personal lives. She said that celebrities’ remarks are often misrepresented.
Stars who have been targeted by deepfakes
Bollywood has many stars who have fallen victim to deepfakes. In November 2023, a deepfake video of Rashmika Mandanna went viral on social media. The deepfake was made by superimposing Rashmika’s face onto the body of an influencer named Zara Patel.
After Rashmika, a video of Kajol also went viral, in which she was seen changing clothes. A fact check revealed that it was a deepfake, created from a video of an influencer originally posted in June.
Alia Bhatt has been a victim of deepfakes twice. In one deepfake video, Alia Bhatt was shown getting ready in a black kurta; throughout the clip, she was seen doing her makeup in front of the camera.

Apart from this, actress Nora Fatehi, actors Ranveer Singh, Aamir Khan and Sonu Sood, and cricketer Sachin Tendulkar have also been victims of deepfake technology.
What is a deepfake and how is it made?
The word ‘deepfake’ was first used in 2017, when videos of several celebrities were posted on the US social news aggregator Reddit from an account using that name. These included many pornographic videos made with the faces of actresses Emma Watson, Gal Gadot and Scarlett Johansson.
Superimposing another person’s face, voice and expressions onto a real video, photo or audio clip is what has been named a deepfake. It is done so cleanly that anyone would believe it; the fake looks just like the real thing.
The technique relies on machine learning and artificial intelligence: the video and audio are generated with the help of such technology and software.
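To make the idea concrete, here is a minimal toy sketch of the "shared encoder, per-person decoder" architecture used by classic face-swap deepfake tools. Everything here is illustrative: the weights are random, the dimensions are made up, and a real system would train the networks on thousands of face images rather than use random matrices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions only: a "face" is a 64-number vector,
# compressed into a 16-number latent code.
FACE_DIM, LATENT_DIM = 64, 16

W_enc = rng.normal(size=(FACE_DIM, LATENT_DIM))    # shared encoder
W_dec_a = rng.normal(size=(LATENT_DIM, FACE_DIM))  # decoder for person A
W_dec_b = rng.normal(size=(LATENT_DIM, FACE_DIM))  # decoder for person B

def encode(face):
    # Compress a face into a small latent code capturing pose,
    # expression and lighting (in a trained system).
    return np.tanh(face @ W_enc)

def decode(latent, w_dec):
    # Reconstruct a face from the latent code with one person's decoder.
    return latent @ w_dec

# The swap: encode person A's face, then decode it with person B's
# decoder, yielding "person B wearing person A's expression and pose".
face_a = rng.normal(size=FACE_DIM)
swapped_face = decode(encode(face_a), W_dec_b)
print(swapped_face.shape)  # (64,)
```

The key point is that one shared encoder learns a person-independent description of a face, while each decoder learns to draw one specific person; swapping decoders is what produces the fake.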
AI and cyber expert Puneet Pandey explains that ready-to-use technology and packages are now available, so anyone can create a deepfake. Voice quality has also improved with current technology, and voice cloning has become especially dangerous.