
Google Chrome Reportedly Downloads 4GB AI Model Without User Consent

Last updated: May 6, 2026
  • Google Chrome quietly downloads Gemini Nano, a 4GB AI model, onto users’ devices without asking permission.

  • Privacy expert Alexander Hanff says the silent install may break privacy laws and harm the environment.

  • The download happens automatically because Chrome’s built‑in AI features are enabled by default.


Many love Chrome because it’s fast and has great features. But recent findings suggest the browser is pulling sneaky tricks on users.

According to the report, Google is stuffing a huge AI file onto people’s devices. And it doesn’t ask them first.

A File You Never Said Yes To

Alexander Hanff, a well‑known computer scientist and lawyer, caught Chrome red‑handed. He says the browser is “reaching into users’ devices to write an on‑device AI model file as large as 4GB to disk without asking.”

The file has a plain name: weights.bin. You will find it inside a folder called OptGuideOnDeviceModel. In simple terms, this file holds the brain of Gemini Nano. That is Google’s small AI model that runs directly on your device.
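Curious readers can check for the file themselves. Below is a minimal Python sketch; the candidate profile paths are assumptions based on common Chrome defaults, and the real location varies by OS, Chrome channel, and profile.

```python
import os
from pathlib import Path

# Assumed default Chrome user-data locations; the actual path varies
# by OS, Chrome channel, and profile.
CANDIDATE_ROOTS = [
    Path.home() / "AppData/Local/Google/Chrome/User Data",      # Windows
    Path.home() / "Library/Application Support/Google/Chrome",  # macOS
    Path.home() / ".config/google-chrome",                      # Linux
]

def find_model_files(roots=CANDIDATE_ROOTS, folder="OptGuideOnDeviceModel"):
    """Walk each root and report any weights.bin files with their size."""
    hits = []
    for root in roots:
        if not Path(root).exists():
            continue
        for dirpath, _dirnames, filenames in os.walk(root):
            if folder in dirpath and "weights.bin" in filenames:
                path = Path(dirpath) / "weights.bin"
                size_gb = path.stat().st_size / 1024**3
                hits.append((path, size_gb))
    return hits

if __name__ == "__main__":
    for path, size_gb in find_model_files():
        print(f"{path}  ({size_gb:.2f} GB)")
```

If the scan prints a multi‑gigabyte weights.bin, the model has already been delivered to your machine.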

Here’s where it gets weird: Chrome shows no pop‑up or checkbox. There is no setting that says “download a 4GB AI model.” Nothing asks for your consent. The download just starts.

When does it happen? Simple: whenever Chrome’s AI features are active. Recently released Chrome versions have those features enabled by default, so there is no need to turn them on; they are active from the moment Chrome is installed.

Hanff isn’t happy about this. He’s furious. He noted that Chrome never asked, never surfaced the download in any setting, and re‑downloads the file even after the user deletes it.

He believes an engineering team at Google decided that your computer is just a tool for them. “Not a personal device whose owner is the legal authority on what runs there,” he adds.

The Environmental Implication

The download runs quietly in the background. Hanff timed it from start to finish: creating the folder and writing the file took fourteen minutes and twenty‑eight seconds. You do nothing, and you might only notice months later when your hard drive runs out of space.

Removing the file is a different story. Deleting it takes multiple steps. Google never wrote any of those steps in Chrome’s help pages. The browser does not even hint that the file exists.
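Removal can be scripted, though as Hanff notes, Chrome may simply re‑download the file afterwards. A minimal sketch, assuming a hypothetical Linux‑style path (the real location depends on your OS and profile):

```python
import shutil
from pathlib import Path

# Hypothetical location; the real path depends on OS, Chrome channel,
# and profile directory.
MODEL_DIR = Path.home() / ".config/google-chrome/OptGuideOnDeviceModel"

def remove_model(path=MODEL_DIR):
    """Delete the on-device model folder; return the number of bytes freed."""
    path = Path(path)
    if not path.exists():
        return 0
    freed = sum(f.stat().st_size for f in path.rglob("*") if f.is_file())
    shutil.rmtree(path)
    return freed

if __name__ == "__main__":
    print(f"Freed {remove_model() / 1024**3:.2f} GB")
```

Quit Chrome first, since the browser may hold the file open while it is running.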

But the biggest problem might be the environment. Think about this: about two billion people use Chrome. So pushing a 4GB file to that many machines creates a massive carbon footprint.

Hanff did the math. The CO2 emissions for this one model push land somewhere between 6,000 and 60,000 tonnes, equivalent to burning millions of pounds of coal. That is the price the environment pays because one company decided, on its own, to force a 4GB binary onto two billion people who never asked for it.
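As a rough illustration of how such a range arises, here is a back‑of‑envelope sketch. The per‑GB network energy figures and grid carbon intensity below are illustrative assumptions, not Hanff’s actual inputs:

```python
USERS = 2_000_000_000     # approximate Chrome user base
MODEL_GB = 4              # size of the weights.bin download
KG_CO2_PER_KWH = 0.4      # rough global grid average

def estimated_tonnes(kwh_per_gb):
    """CO2 tonnes for pushing MODEL_GB to every user, given a network
    energy intensity in kWh per GB transferred."""
    total_gb = USERS * MODEL_GB
    return total_gb * kwh_per_gb * KG_CO2_PER_KWH / 1000  # kg -> tonnes

# Low and high network-intensity assumptions bracket a wide range.
for kwh_per_gb in (0.002, 0.02):
    print(f"{kwh_per_gb} kWh/GB -> {estimated_tonnes(kwh_per_gb):,.0f} tonnes CO2")
```

The order‑of‑magnitude spread comes almost entirely from uncertainty in the energy cost of data transfer, which is why estimates like Hanff’s run from thousands to tens of thousands of tonnes.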

This is not Hanff’s first fight. A couple of weeks ago, he accused Anthropic of secretly installing spyware through its Claude Desktop app. He said that the move broke EU privacy laws and computer misuse rules. Now he sees a similar pattern with Google, only on a much larger scale.

A Better Approach for Google

Hanff believes one thing would have fixed this issue: Google simply needed to ask its Chrome users. A single pop‑up would do: “Chrome would like to download a 4GB AI model file to your device for use with the following features. Accept now, or decide later.”

That is all users need: a choice on the table rather than an unknowing, obligatory download.

Google’s relationship with user consent is complex. On one hand, it pushes AI files without permission; on the other, it actively fights scammers and hackers in court. In a recent legal move, the company sued Chinese hackers accused of running fraudulent operations. Learn more: Google sues Chinese hackers, pushes laws in scam fight.

What now? Hanff offers a simple test of Google’s next move. If the next version of Chrome removes these unsolicited downloads and adds an upfront consent prompt, we will know that Google is committed to responsible AI and sustainability.

If that doesn’t happen, we will know that it is not. In the meantime, you may well have a 4GB “guest” on your computer that you never invited to stay.


About the Author

Memchick E


Digital Privacy Journalist

Memchick is a digital privacy journalist who investigates how technology and policy impact personal freedom. Her work explores surveillance capitalism, encryption laws, and the real-world consequences of data leaks. She is driven by a mission to demystify digital rights and empower readers with the knowledge to protect their anonymity online.
