Introduction
We have talked in previous alerts about the increased conflict between AI applications and data protection. It is clear that AI is on the agenda for a number of data protection regulators and the latest action in Italy over the Replika chatbot is further evidence of that trend.
What happened?
Replika is an AI-powered chatbot which generates a virtual friend using text and video interfaces. Its developer is headquartered in the US, but Replika was available to users throughout the EU, including in Italy. The promoters of Replika said that the chatbot could improve users' emotional wellbeing and help them manage stress.
The Italian Data Protection Authority, the Garante, was concerned that the chatbot had no age verification element and that the sign-up process simply involved asking for a user's name, email address and gender. It was also concerned that some of the reviews on the two main app stores included comments by users flagging sexually inappropriate content.
What is Replika?
Replika calls itself “the AI companion who cares”. Its promoters say it is “always here to listen and talk. Always on your side.” Replika says that it has millions of users who want a friend “with no judgment, drama or social anxiety involved”.
Replika seems to be owned by Luka Inc., a software development company based in San Francisco. According to Crunchbase, Luka was founded in 2015 and has so far raised more than $10m to fund apps “powered by a proprietary deep learning architecture”. Luka was founded by two Russians, Eugenia Kuyda and Philip Dudchuk. Whilst this does not seem to have featured in the Italian investigation, there have been concerns that Luka has also used Replika to broadcast Russian propaganda messages. Researchers have included screenshots of chats where Replika seems to say that it collects information for the Russian authorities. Currently Replika seems to have around 10 million users, bringing Luka an estimated $1m per month in download upgrade fees.
What did the Garante decide?
The Garante decided that Replika and its developer, Luka, were in breach of the transparency requirements in GDPR. It also said that Replika processed data unlawfully, as the performance of a contract could not be a legal basis for processing given that children are not able to enter into a valid contract under Italian law.
The Garante ordered that Luka should terminate the processing of data relating to Italian users and tell the Garante of the measures it had taken within 20 days (i.e. by 22 February 2023). If it does not do so, the Garante says that it may take further action, which could include a fine of up to €20m or 4% of the company’s total worldwide annual turnover.
The Garante was particularly concerned about the effect on children. It said “utterly inappropriate replies are served to children having regard to their degree of development and self-conscience.”
Other Cases
It is fair to say that the Garante has form when it comes to regulating AI. The Garante was one of a number of DPAs who took action against Clearview AI – https://www.corderycompliance.com/clearview-ai-italy-gdpr-fine/. In addition, the Garante has previously brought proceedings relating to the use of AI in food delivery – https://www.corderycompliance.com/garante-fines-deliveroo/.
What about ChatGPT?
Many people will be following developments in Italy to see if action against ChatGPT might be next. There have been concerns about the transparency of the ChatGPT application including allegations that some of the information it has provided has been inaccurate, for example in connection with the Elon Musk acquisition of Twitter. We have talked in more detail about these issues in our new film here https://bit.ly/chatgptfilm.
More Information
There is more information about this and other data protection topics in Cordery’s GDPR Navigator subscription service. GDPR Navigator includes short films, straightforward guidance, checklists and regular conference calls to help you comply. More details are at www.bit.ly/gdprnav.
You can read the Garante’s decision here https://bit.ly/3HJaVJw.
For more information please contact Jonathan Armstrong or André Bywater who are lawyers with Cordery in London where their focus is on compliance issues.
Jonathan Armstrong, Cordery, Lexis House, 30 Farringdon Street, London, EC4A 4HH
Office: +44 (0)207 075 1784
Jonathan.armstrong@corderycompliance.com

André Bywater, Cordery, Lexis House, 30 Farringdon Street, London, EC4A 4HH
Office: +44 (0)207 075 1785
Andre.bywater@corderycompliance.com