Beware of artificial intelligence extensions: which is the only one that doesn't collect personal data?

A study by University College London revealed that the most popular artificial intelligence (AI) browser add-ons store personal information, from browsing history to medical or banking data. But there is one exception.
The researchers emphasized that these practices occur silently, with no way for users to notice them when installing the extensions. Although terms and conditions usually exist, they are often worded opaquely and do not clearly reflect the scope of the data collection.
The report adds to a series of warnings from cybersecurity and data protection specialists in Europe and the United States: artificial intelligence, in its quest to offer more "contextual intelligence," ends up needing to absorb information about users' online activities, which opens the door to opaque uses.
Browser extensions are widely used. Photo: Shutterstock
Artificial intelligence extensions for Chrome, Edge, and other browsers have grown in popularity in recent months. Tools like ChatGPT, Copilot, Merlin, and Monica promise search assistance, automatic summaries, and immediate responses.
It's worth remembering that extensions are small programs installed in your browser to add extra features: from blocking ads to translating pages or, in this case, incorporating artificial intelligence directly into your browsing experience.
To operate, they often require broad permissions to access the content displayed on your screen, and it is precisely here that the greatest privacy concerns arise.
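As an illustration (this sketch is hypothetical, not taken from the study), a Chrome extension requesting this kind of broad access declares it in its manifest file; the "<all_urls>" pattern allows its script to read every page the user visits, including email, banking, and medical portals:

```json
{
  "manifest_version": 3,
  "name": "Hypothetical AI Assistant",
  "version": "1.0",
  "host_permissions": ["<all_urls>"],
  "content_scripts": [
    {
      "matches": ["<all_urls>"],
      "js": ["reader.js"]
    }
  ]
}
```

A user who clicks "Add extension" accepts this access in a single step, which is why researchers stress that the permission prompt rarely conveys how much of their browsing the add-on can actually see.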
However, what is billed as a productivity benefit comes with an invisible cost: the loss of control over privacy. To function, many of these extensions request broad permissions from the browser and then make extensive use of those permissions, accessing more information than necessary to fulfill their primary function.
According to experts, this means that someone simply trying to summarize an academic text or translate a document could be unwittingly exposing everything they have open in their browser: from personal emails to banking sessions to medical records.
The ease with which these extensions are installed also contributes to the problem. Unlike traditional apps, browser add-ons don't always go through strict auditing processes in official stores, leaving room for abusive data handling practices to flourish.
Among the most notable cases, the Merlin extension even captured data from online forms, such as financial or health credentials. Sider and TinaMind, meanwhile, shared user queries and IP addresses with external platforms like Google Analytics, enabling cross-site ad tracking.
The investigation also revealed that some of these add-ons combined sensitive information with persistent identifiers. This means that the data could be associated with the same user profile over time, multiplying the risk of exposure in the event of a leak or sale to third parties.
ChatGPT for Chrome, Microsoft's Copilot, and Monica were flagged for recording attributes such as age, gender, income, and interests, data they then used to personalize responses across different browsing sessions. According to researchers, this practice shows how the logic of targeted advertising is beginning to infiltrate the AI ecosystem.
“These assistants offer unprecedented access to users' online behavior in areas of their lives that should remain private,” explained Anna Maria Mandalari, senior research fellow at UCL.
The specialist warned that the normalization of these practices could end up eroding public trust in AI tools, even those that comply with privacy regulations.
Perplexity, the AI-powered search engine. Photo: Reuters
The review identified only one case with no evidence of abusive data collection: the Perplexity AI extension. According to the report, this service did not display any practices involving the transmission of sensitive information or hidden user tracking.
The researchers noted that, at least in their tests, Perplexity simply provided search results without capturing more information than strictly necessary.
This doesn't mean, however, that Perplexity is immune to criticism. Some experts point out that the fact that abusive practices were not detected in this specific study doesn't guarantee that they won't appear in the future, given that tech companies' privacy policies and business models tend to change rapidly.
For Mandalari, the problem goes beyond targeted advertising: "Once the information is collected, we don't know where it will end up or if it will end up in the hands of illicit networks that will use our credentials to commit crimes."
In her opinion, the lack of transparency and technical complexity make it almost impossible for an average user to understand what happens to their data in these environments.
The study brings the debate on digital privacy and transparency back to the forefront, in a context where more and more AI tools are being integrated into everyday life at a difficult-to-measure price: our personal data.
For researchers, the solution lies in stricter regulations, independent audits, and, above all, clearly informing users about the risks of the technologies they adopt.
Clarín