Shimla
Artificial Intelligence has made people's lives much easier, but the technology is also being misused. In recent times, a new and dangerous method of fraud has emerged, known as deepfake audio, in which cyber criminals cheat people by imitating the voices of those they trust.
Deepfake audio is a high-tech scam in which cyber fraudsters clone the voice of a loved one, relative or friend. They collect samples of the target's voice and, with the help of machine learning, produce a near-exact replica of it. They then use this fake voice to deceive the victim.
How does the fraud happen?
Recently, a case came to light in which cyber criminals targeted a school principal. The fraudster called the principal and claimed that his relative had been arrested by the police in the UAE, demanding Rs 1.5 lakh for his release. The caller's voice was that of the relative, so the principal was convinced the story was true and transferred the money to the account the fraudster provided.
Later, when he called his relative, he discovered that the relative was safe and sitting nearby. After the incident, he reported the matter to the cyber crime cell and appealed to the public to guard against such frauds.
DIG Cyber Crime Mohit Chawla said that people should always stay alert to avoid this type of fraud. If there is any doubt about a call, seek help from experts before transferring money. The Cyber Crime Cell has experts available to investigate such cases and is always ready to help.