AI Chatbots Like ChatGPT Could Be Used by Scammers to Seduce Victims Online and Loot Their Bank Details


Chatbots equipped with artificial intelligence could prove to be a threat to people, their privacy and their money in the coming years. This is not our claim, but the finding of a recent report on how AI chatbots could be used to scam people. AI chatbots like ChatGPT can interact with users with human-like emotions and reactions. Not only that, they are designed to learn from their interactions and adapt to each user's preferences. What if such AI tools were used by scammers? How dangerous would that be for an increasingly digital world? Richard De Vere, head of social engineering at tech solutions firm Ultima, has spoken at length on the topic.

Online frauds such as romance scams are difficult to detect in advance, because building rapport and trust with someone takes several weeks. Once the victim is hooked, the scammer begins extracting money through emotional manipulation. Now, what happens if scammers hand this work over to AI?

Speaking to The Sun on the subject, Richard De Vere said, “In the US alone … approximately 24,000 people fall victim to romance scams each year. Using AI allows scammers to automate a lot of their mundane tasks.”

De Vere says that artificial intelligence can chat with a victim about the weather, their family or how their day went in an engaging way to gain their trust. “The new generation of AI is nearly indistinguishable from humans, at least when communicating via email and messaging applications,” he said.

De Vere believes some AI chatbots can write better and more convincing messages than humans, and he cited ChatGPT as a direct example. ChatGPT is a chatbot that answers the questions you ask as if you had put them to a human.

He told the publication, “Currently, scammers can use ChatGPT to interact with their targets. This opens up another opportunity for less skilled criminals to scale up their activity.” He added, “Once the target is sufficiently groomed, trusts the AI and develops feelings for it, a real person can take over and pressure the victim into sending money.”
