Snapchat could face a multi-million pound fine after Britain's data regulator issued an interim notice for failing to assess the privacy risks that its My AI chatbot could pose to consumers, especially children.
The Information Commissioner's Office said it initially found that the owners of the social media app failed to properly identify and assess the risks to millions of My AI users in the UK, including those aged 13 to 17 years.
Snapchat has 21 million monthly active users in the UK and is particularly popular among young people. Market research firm Insider Intelligence estimates that 48% of users are 24 or younger. About 18% of users in the UK are between 12 and 17 years old.
Information Commissioner John Edwards said: “The initial findings of our investigation indicate that Snapchat did not adequately identify and assess privacy risks to children and other users before launching My AI.”
The Information Commissioner's Office said its findings were provisional and that Snapchat had until October 27 to make representations before a final decision was made on whether to take action. Any breach of data protection law would be set out in a final enforcement notice.
If a final enforcement notice is issued, Snapchat may have to stop processing data related to its My AI chatbot, meaning UK users would be unable to use My AI until the company carries out a proper risk assessment of the service.
The regulator's priority is to prevent potential harm and ensure My AI's compliance, although it also has the power to impose fines of up to 4% of a company's global annual turnover. Last year, Snapchat generated global revenue of $4.6 billion.
My AI is built on GPT technology, and its launch was the first example of generative AI being embedded into a major messaging platform in the UK.
Snapchat introduced My AI to subscribers of its premium Snapchat+ service in February, before rolling it out to all users in April.