Snapchat AI data protection investigation overview:
- Who: The U.K. Information Commissioner’s Office issued a preliminary enforcement notice to Snap Inc. over potential privacy risks associated with the “MyAI” chatbot in its Snapchat app.
- Why: The ICO claims Snap failed to adequately assess the data protection risks associated with the MyAI chatbot before releasing it to the public.
- Where: Snapchat is used by consumers across the UK.
The U.K. Information Commissioner’s Office (ICO) is concerned about the possible privacy risks associated with a new chatbot feature recently integrated into the Snapchat multimedia instant messaging app.
The chatbot — dubbed by Snap Inc. as “MyAI” — debuted earlier this year and allows users to have a simulated human conversation with a Snapchat “friend” that is powered by generative artificial intelligence.
The ICO revealed last week that it has issued a preliminary enforcement notice to Snap, giving the company a chance to respond to the agency’s concerns about the potential privacy risks associated with the MyAI chatbot, reports Law360.
The preliminary notice reportedly centers on Snap’s obligation under the U.K. General Data Protection Regulation to carry out a data protection impact assessment where the processing of user data is likely to result in a “high risk to their rights and freedoms.”
Snap failed to adequately assess data protection risks associated with its ‘MyAI’ chatbot, says ICO report
The ICO sent the notice after an investigation reportedly found that Snap failed to adequately assess data protection risks in the assessments the company conducted before unveiling the chatbot.
The ICO said it is especially concerned that Snapchat did not adequately assess the privacy risks its MyAI chatbot creates for child users of the app between the ages of 13 and 17, an age range the agency said makes up the majority of Snapchat’s users, reports Law360.
While Snap has reportedly not been formally accused of breaching any U.K. data protection laws, the ICO has outlined steps the company could be required to take if a final enforcement notice is adopted.
In that event, the ICO said, Snap may be required to stop processing data collected through the MyAI chatbot and could no longer offer the feature to users located in the U.K. until the company carries out an adequate risk assessment, reports Law360.
Snap previously agreed to pay $35 million in 2022 to resolve a class action lawsuit accusing the company of violating biometric privacy law with certain Snapchat features.
Are you concerned that Snapchat’s ‘MyAI’ chatbot could violate the privacy of its users? Let us know in the comments!