If a loved one asks for a loan, know this news before handing over the money.

When something urgent comes up and we don't have money in hand, we turn to close friends or relatives for a loan. Many people lend money this way trusting that they will get it back in full. Others lend out of affection for the person asking, even knowing it may never be repaid.

Borrowing the needed amount this way is a great help to many people. However, a major threat to such borrowing has now emerged from the technology sector. Behind it is Artificial Intelligence, or AI, which is being put to both good and bad uses all over the world.

We no longer need to ask friends in person; the request can come through a phone call or a WhatsApp message. The emerging problem is that calls and messages that appear to come from friends may soon no longer be trustworthy.

With the help of AI, a person's voice can be imitated almost exactly. Reports suggest that AI voice scams are on the rise, with fraudsters cloning human voices to extort money from the victim's friends and relatives. One such scam took place recently.

A 59-year-old woman from Punjab fell victim to AI voice fraud. Late one night, she received a call from someone posing as Ananthiravan Prabhjot in Canada. "He sounded just like my son-in-law and spoke the Punjabi we speak at home perfectly."

"The call came very late at night. He said he had met with an accident and was going to jail, and asked for money to help. He also told me to keep the incident secret and not tell anyone," the woman recounted. After the call, she paid the amount he demanded online.

She realized it was a scam only when she later received a genuine call from Ananthiravan in Canada. By then the money was gone. The fraudsters here operated with precise planning, leaving no room for doubt: a story of an emergency is concocted, and the plan is executed by imitating the relative's voice using AI.

Because of the 'emergency' atmosphere the scammers create, many people overlook the small flaws in these calls. Hearing a familiar voice on the other end, they pay up, believing the caller really is a relative or friend. Such scams have been reported before, and as the technology evolves, they will only become more sophisticated.

Prasad Patibandla, Director of Operations at the Center for Research on Cyber Intelligence and Digital Forensics (CRCIDF), Delhi, has explained the pattern behind such frauds. He says AI voice emulation tools can accurately mimic a person's voice.

"By capturing people's voice samples from social media, or using data available in the public domain such as recorded sales calls, scammers imitate these voices with the help of AI. So beware of calls asking for money under the pretext of an emergency."

Before transferring any money, the authorities suggest, verify that the caller really is who they claim to be and that what they say is true. This kind of fraud may lead to a future in which no one will lend you money when you ask for a loan, because even a video call will not be enough to confirm that the caller is the real person.







