Developments in artificial intelligence have given rise to a new form of fraud. According to a Washington Post report, criminals use AI-based voice cloning to extort money by imitating the voices of victims' loved ones: they call people or send them voice messages in which a familiar-sounding voice claims to be in a crisis and in urgent need of money.
Impostor scams rose dramatically in the United States in 2022, with more than 36,000 reports of people being swindled by criminals posing as friends or family, according to the Federal Trade Commission. More than 5,000 of those cases took place over the phone, accounting for over $11 million in losses.
The newspaper recounted the case of one victim who received a call from a man she believed was her grandson, saying he was in jail without a wallet or cellphone and needed money for bail. The victim and her husband withdrew $3,000 and were about to withdraw more when a bank manager alerted them that another customer had received a similar call, only to discover that the voice impersonating the grandson was fake. Only then did they realize they had been victims of a scam.
This AI-powered scam is more convincing, and therefore more effective for criminals, because anyone with a sample of a person's voice can use text-to-speech tools to generate fake emergency calls in a voice the victim knows. These tools analyze short audio samples taken from sources such as YouTube, Instagram, or Facebook, then rebuild the voice from characteristics unique to each individual, such as age, gender, dialect, and tone, producing speech that closely resembles the original.
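To illustrate how little technical effort this step now requires, the following is a minimal sketch using the open-source Coqui TTS library in Python; the model name follows Coqui's published usage, while the file paths and sample text are hypothetical placeholders, not details from the report.

    # A minimal sketch of voice cloning from a short audio sample,
    # assuming the Coqui TTS package is installed (pip install TTS).
    from TTS.api import TTS

    # Load a multilingual voice-cloning model (XTTS v2).
    tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

    # Clone the voice from a short reference clip and speak new text with it.
    tts.tts_to_file(
        text="This is a demonstration of voice cloning from a short sample.",
        speaker_wav="reference_clip.wav",  # hypothetical: a few seconds of the target voice
        language="en",
        file_path="cloned_output.wav",
    )

That only a few seconds of publicly posted audio are needed as input is what makes this class of scam so easy to scale, as the report's examples of source material from YouTube, Instagram, and Facebook suggest.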
The newspaper said that the victims, often elderly people living alone, felt fear and panic for their loved ones when they heard voices resembling those of family members describing a dangerous situation.
Calls like these leave little evidence for identifying the perpetrators. According to experts, the authorities charged with investigating these crimes are poorly equipped to handle them, and tracing fraudsters' calls and money across the globe is difficult.
The newspaper adds that it is difficult to impose legal penalties on the companies that provide AI voice-generation software, because there is no precedent in this area; as a result, victims have little chance of identifying those responsible and taking legal action against them.
Experts believe the problem will only become more complicated as AI technology advances, and it remains unclear how well authorities will be able to meet this growing challenge.
Authorities advise people not to act immediately on money requests that appear to come from family members, but to contact the relative through another channel to verify that the request is genuine. Payments should never be made by hard-to-trace methods such as gift cards, and any unusual request for money should be treated with caution.
At the same time, experts stressed the importance of holding the makers of these tools legally responsible when they are used in fraud, which would require changes to the current legal system to better define liability and regulate the industry.