Firefox To Ship 'Network Partitioning' As a New Anti-Tracking Defense (zdnet.com)
An anonymous reader quotes a report from ZDNet: Firefox 85, scheduled to be released next month, in January 2021, will ship with a feature named Network Partitioning as a new form of anti-tracking protection. The feature is based on "Client-Side Storage Partitioning," a new standard currently being developed by the World Wide Web Consortium's Privacy Community Group. "Network Partitioning is highly technical, but to simplify it somewhat: your browser has many ways it can save data from websites, not just via cookies," privacy researcher Zach Edwards told ZDNet in an interview this week. "These other storage mechanisms include the HTTP cache, image cache, favicon cache, font cache, CORS-preflight cache, and a variety of other caches and storage mechanisms that can be used to track people across websites." Edwards says all these data storage systems are shared among websites.
The difference is that Network Partitioning will allow Firefox to save resources like the cache, favicons, CSS files, images, and more, on a per-website basis, rather than together, in the same pool. This makes it harder for websites and third-parties like ad and web analytics companies to track users since they can't probe for the presence of other sites' data in this shared pool. The Mozilla team expects [...] performance issues for sites loaded in Firefox, but it's willing to take the hit just to improve the privacy of its users.
Great Idea (Score:2)
Stop trying to figure out how each website is tracking you and simply segregate everything.
Re: (Score:1)
this makes it harder for websites and third-parties like ad and web analytics companies to track users since they can't probe for the presence of other sites' data in this shared pool.
Why are they able to do that at all? Why is your browser allowing websites to "probe your cache"? THAT is the real problem here.
Re: Great Idea (Score:5, Informative)
Um, maybe you should first understand the basics before talking loudly.
When a page contains an image and the client requests the page but not the image, they assume the image is cached. When it is requested, they know it is not. That is all that "probing" is.
The trick is to make those includes unique to you, and not blocked by ad blockers. Which is not hard if you randomize the paths and code enough and do it in obfuscated scripts that ad blockers can't block without breaking the site. Essentially the same techniques as polymorphic viruses.
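Roughly, the server side of such a probe could look like this (a minimal sketch in Node with TypeScript; the port is arbitrary, and a real tracker would randomize and obfuscate the paths as described above):

    // Serve a long-cacheable image and log every fetch. A fetch means the
    // browser did NOT have it cached; a page view with no matching fetch
    // means it did. That is the entire "probe".
    import { createServer } from "http";

    // 1x1 transparent GIF
    const pixel = Buffer.from(
      "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7",
      "base64"
    );

    createServer((req, res) => {
      console.log(`cache miss for ${req.url} at ${new Date().toISOString()}`);
      res.writeHead(200, {
        "Content-Type": "image/gif",
        "Cache-Control": "public, max-age=31536000, immutable",
      });
      res.end(pixel);
    }).listen(8080);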
P.S.: (Score:1)
By the way, they used to probe your internal network the same way. Request https://yourrouter/ [yourrouter] internally, and if you just happened to be logged in on another tab, they'd be in too, and could install an external login and configure a MITM. Apart from many other fun things.
Re: (Score:2)
You don't even have to randomize the path for image caching. Just encode the unique identifier in the image pixels and tell the browser to cache it. Then use the Canvas API to read the pixels back into data.
I'm not sure if this is actually being done. So hopefully I didn't just invent something new for them to use.
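For what it's worth, the read-back half would only be a few lines. A sketch, assuming the tracker's image is served same-origin or with permissive CORS headers (the URL is made up):

    // Recover an identifier encoded in the pixels of a (cached) image.
    const img = new Image();
    img.crossOrigin = "anonymous"; // required for cross-origin read-back
    img.src = "https://tracker.example/id.png";
    img.onload = () => {
      const canvas = document.createElement("canvas");
      canvas.width = img.width;
      canvas.height = img.height;
      const ctx = canvas.getContext("2d")!;
      ctx.drawImage(img, 0, 0);
      // Keep alpha at 255 when encoding so the RGB channels survive
      // premultiplication; that gives 3 usable bytes per pixel.
      const data = ctx.getImageData(0, 0, img.width, img.height).data;
      const id = Array.from(data.slice(0, 12), (b) =>
        b.toString(16).padStart(2, "0")
      ).join("");
      console.log("recovered id:", id);
    };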
Re: (Score:2)
Except this would bypass any attempt at obfuscating or circumventing ETag tracking.
Re: (Score:1)
I'd like to point out that none of these methods work across sessions if you disable the on-disk web cache, which is a harmless thing to do and, provided you have a reasonably fast internet connection, does not slow down your web browsing. Caching web content to disk is unnecessary. The "network partitioning" in the next Firefox version is still a good thing, because it also segregates web sites during a session, but it will make the cache less effective, so you might as well disable the on-disk web cache completely.
Re: (Score:2)
I haven't disabled all caching because that would be wasteful of site bandwidth. And hyper-optimizing for low-latency sites can bite you.
Re: (Score:2)
Have you tried using a RAM disk as the browser cache? In Linux it's easy: just create an entry in /etc/fstab with tmpfs as the file system and state the amount of memory to be allocated as the RAM disk. Then point the browser's cache at the ramdisk.
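For example (the mount point, size, and uid/gid are placeholders for your own values):

    # /etc/fstab -- a 512 MB RAM disk for the browser cache
    tmpfs  /home/user/browser-cache  tmpfs  rw,nosuid,nodev,size=512m,uid=1000,gid=1000  0  0

In Firefox, if I recall correctly, you can then point the disk cache there via browser.cache.disk.parent_directory in about:config.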
Re: (Score:2)
Then use Canvas API to read the pixels back into data.
They actually thought of this when designing getImageData [mozilla.org] so you can't get the pixel data for an image loaded from a third party domain - well, almost.
The actual concern was that images could contain sensitive information and that allowing canvas to access third-party images could lead to cross-site scripting attacks using the canvas API. So both the source page and the destination web service are allowed to whitelist images that the canvas API can read [mozilla.org]. (That is, both have to allow it for it to be accessible.)
Re: (Score:2)
It can also check how long the image took to load: if it was quick, it is cached; if it was slow, it was not. You can even play with DNS and the HTTP server to produce different timings and test many resources, one per tracked user/ID, and even find out whether some users are "related" to others (sharing the same network/computer).
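A rough browser-side sketch of that timing check (TypeScript; the 20 ms threshold is a guess that a real tracker would calibrate per connection, and the probed URL is invented):

    // Fetch a third-party resource and guess from the elapsed time
    // whether it came from the cache (fast) or the network (slow).
    async function probablyCached(url: string): Promise<boolean> {
      const start = performance.now();
      await fetch(url, { mode: "no-cors", cache: "force-cache" });
      return performance.now() - start < 20; // milliseconds
    }

    probablyCached("https://site-a.example/logo.png").then((hit) =>
      console.log(hit ? "probably visited site A before" : "probably not")
    );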
Re: (Score:2)
Site A embeds "http://evilbullshittracker.com/tracker.gif", which is served with cache-control headers including a unique ETag, e.g. "abcd1234". Now when I go to site B, which also embeds that URL, the browser asks evilbullshittracker.com whether the ETag "abcd1234" associated with the cached image is still good. It doesn't matter what the response is, because evilbullshittracker.com now knows I visit sites A and B and can aggregate its data. It might say yes, it's good
Re:Great Idea (Score:5, Informative)
Why is your browser allowing websites to "probe your cache"?
Because it can't avoid it.
First, the most obvious: if the client has to check with the server whether the resource is stale, then it has to contact the server. That contact by itself can be used for tracking purposes.
Assuming you can force a revalidation, you can use the ETag [mozilla.org] to effectively set a "cookie" on a cached item so that you can uniquely determine what user cached a file. (The ETag is supposed to be an opaque value indicating to the server the version of the resource that was last seen by the client. So, for example, it could be an SHA1 hash of the resource. However, because it's opaque to the client, it doesn't mean anything and can be set to literally anything.)
Even without the ETag, you can still make cache hits unique per visitor, by sending unique "last modified" timestamps per user.
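To make that concrete, here is a sketch of a tracker endpoint abusing the ETag this way (Node with TypeScript; the IDs and port are invented, and this is an illustration, not any real tracker's code):

    import { createServer } from "http";
    import { randomUUID } from "crypto";

    // 1x1 transparent GIF
    const pixel = Buffer.from(
      "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7",
      "base64"
    );

    createServer((req, res) => {
      const tag = req.headers["if-none-match"];
      if (tag) {
        // The "opaque" ETag comes back on every revalidation and
        // uniquely identifies the visitor -- a cookie in disguise.
        console.log(`returning visitor: ${tag}`);
        res.writeHead(304, { ETag: tag }); // keep their copy (and their ID)
        res.end();
      } else {
        const id = `"${randomUUID()}"`; // unique per first-time visitor
        console.log(`new visitor, assigned ${id}`);
        res.writeHead(200, {
          ETag: id,
          // "no-cache" means store, but revalidate on every use --
          // exactly what the tracker wants.
          "Cache-Control": "no-cache",
          "Content-Type": "image/gif",
        });
        res.end(pixel);
      }
    }).listen(8080);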
Of course, because of these kinds of cache games, a lot of browsers simply don't revalidate resources as frequently as HTTP says they should. But if you can execute JavaScript, there are other games you can play.
If all you want to do is know if the user has ever loaded a given resource before, you can simply time the request. Happens immediately? Probably cached. Takes a bit? Probably not.
You can slightly change the CSS for each user for a cached stylesheet and then measure the layout in JavaScript, allowing you to fingerprint through the cache.
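The client side of that one is tiny; a sketch assuming the per-user stylesheet set a unique width on some probe element (the element ID is made up):

    // The cached per-user stylesheet gave #probe a unique width;
    // reading it back recovers the identifier across sites.
    const el = document.getElementById("probe")!;
    console.log("fingerprint:", getComputedStyle(el).width); // e.g. "137px"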
And there are probably more tricks out there to track someone via a cached shared resource. The only real solution is cache partitioning - which is what Firefox is doing.
Re: (Score:2)
I can think of several other "real solutions", but they all involve doing things similar to what Firefox is doing in a heavier handed/dumber way.
Re: (Score:2)
The only real solution is cache partitioning - which is what Firefox is doing.
Yep. It also has the side effect of making the buggers pay for bigger servers to handle the extra traffic. I.e., it'll cost them money to try to track people this way.
Re: (Score:3)
What are the global warming effects of all corporate fatcats simultaneously rubbing their hands (paws) at the prospect of more cash?
How does this compare to "private" browsing (Score:2)
Are resources shared between private and normal browsing? Are resources shared between different private browsing windows?
Re: (Score:2)
Are resources shared between private and normal browsing?
No. My understanding of how every browser implements private browsing is that the private session basically gets a temporary in-memory cache and storage space, so things like local storage "work" for the duration of the private session, but once the private session ends, nothing is kept. Nothing is shared between the private and normal sessions.
Are resources shared between different private browsing windows?
That would be browser-specific, and honestly, I have no clue! I'm fairly sure the initial implementation from Chrome had just the single "private session" that would be shared by all private windows.
Re: (Score:1)
There are a lot of reasons depending on which cache you're talking about. In general, however, they reduce latency and page load times by allowing sites to "share" commonly-used resources. They also simplify access to resources that are not shared, but which would create a filesystem mess of ponderous proportions were they all stored separately.
I think Firefox is going to take a serious performance hit with this. It will make running multiple tabs in Chrome seem lightweight by comparison. And it's really un
Re: (Score:3)
I think Firefox is going to take a serious performance hit with this. It will make running multiple tabs in Chrome seem lightweight by comparison.
Google said Chrome is already going to do this. [zdnet.com] Firefox is playing catch-up.
Re: Great Idea (Score:2)
The caching was essential in the early days of the internet, but today it's no longer critical, and with the segregation it will be a performance penalty for less-visited sites while your most-visited ones behave normally.
Just neutralize the user agent string too, to prevent blocking of Firefox. Otherwise we'll see splash screens saying 'unsupported browser' and site access being denied because of this.
Re: Great Idea (Score:5, Interesting)
Otherwise we will see splash screens saying 'unsupported browser' and site access denied due to this.
That, together with "this site is not available from your region" (due to GDPR), seems like a good warning sign that this is a site you'll want to avoid.
Re: Great Idea (Score:2)
My throttled mobile connection is weeping though.
Those legitimately shared resources are shared for a reason: They are large!
Have fun downloading that 5MB font file for every site on an 8 kB/s connection when your data cap runs out. Which it will now do even earlier, too.
Re: I'd like to block all fonts and caching (Score:4, Informative)
Welcome to Project Gemini [gemini.circumlunar.space]
Re: (Score:2)
My throttled mobile connection is weeping though.
Those legitimately shared resources are shared for a reason: They are large!
Have fun downloading that 5MB font file for every site on an 8 kB/s connection when your data cap runs out. Which it will now do even earlier, too.
Caching will still work, just not between sites.
Re: (Score:3)
Yeah, redownload the complete jQuery framework for every single site that uses it...
Re: (Score:2)
Not in this universe.
Re: (Score:3)
No it won't. Like Barefoot, I live my online life on a metered connection. The trend with literally everything is to ignore the needs of anyone who doesn't have a data cap in the 1TB-and-up range. We are basically treated like we don't exist.
Updates - fuck you, you'll download them when we say so! In many systems there's no option any more to pick and choose only the critical ones, or even to let me keep using the system and wait until the end of my billing cycle.
Patches - fuck you, no binary patches for things like Office
Re: (Score:2)
Maybe it can be smart enough to allow cached files from non-spying CDN networks.
Re: (Score:2)
Yeah, I just had a case of RAS syndrome [wikipedia.org].
Re: (Score:1)
Uh, that's what the Titanic tried...
Re: (Score:2)
That is the same as having one process and one profile for each site. As each site loads several resources from other sites, you would end up with more than 50 browsers to load one site. Not only a performance, CPU, and memory hit, but then you would have to make them talk to each other. This is what you are saying...
What Mozilla is trying to do is the same, but using threads and dividing into blocks, as few as possible, while still isolating sites and features, both for tracking and security... while maintaining the mem
Re: (Score:2)
Stop trying to figure out how each website is tracking you and simply segregate everything.
That would fix the issue, at least as far as that is possible when everything comes from a single IP. I think they do not actually want to fix the problem; they want to appear to be fixing it.
OK, but don't bloat my RAM / disk space! (Score:1)
I expect this to still use hardlinks/pointers internally, so as not to store everything many times over. Especially in memory.
Just keep track of who you are telling this file is already cached.
Oh, and regarding "highly technical"... What do you think this is? News for hairdressers?
Re: (Score:2)
The quoted article text is from a ZDNet article. I.e., it was not curated by the submitter, and most certainly not by BeauHD (the post is probably a dupe, but I'm too lazy to check at the moment).
Your user id suggests you haven't been here long. Don't worry, you'll catch on.
Re: (Score:3)
Is Slashdot truly technical anymore? Half the posts on every thread are from lightly educated right-wing threadshitters.
Re:OK, but don't bloat my RAM / disk space! (Score:4, Funny)
What do you think this is? News for hairdressers?
Yes! [slashdot.org]
So in short (Score:3)
Firefox now saves cached data in separate directories.
Re: (Score:1)
...and why wasn't this already a "thing"?
From a design standpoint this would have been the default implementation, so there must be some reason why it wasn't; I'd like to know why.
Re: (Score:3)
Cross-site stuff. A page uses jQuery? Most pages don't host it themselves, they just link it from Google. Now instead of grabbing a ready copy from the cache assigned to Google, Firefox will redownload it and store a copy in that site's own folder. Similarly with common fonts, themes, etc. DeviantArt has millions of artists, each with their own name.deviantart.com domain; each time you visit a new one, you redownload the entire layout, backgrounds, and all the site scripts, because you can't just pull them from the common deviantart.com cache.
Re:So in short (Score:4, Interesting)
This is because of CNAME cloaking, which makes third-party ad servers look like subdomains of the real site. It's widespread and must be blocked.
The small performance hit is worth it most of the time. You can locally cache popular frameworks using an add-on called Decentraleyes.
Re: (Score:2)
Do you think the advertisers trust the sites to proxy for them?
Re: (Score:2)
No, advertisers will hold ad cash hostage to force the sites to proxy for them.
Re: (Score:1)
Ah, that makes sense. I still don't get why this wasn't the default behavior, however. Sure, it'll take up more disk space, but I can't see how it would slow page loads (beyond the first download, of course), plus the security it'd bring would be worth it.
Perhaps it would be more useful (Score:2)
to just link your cache to /dev/null
only serious browser left (Score:2)
Firefox is the only serious browser left. Pretty much everything else is Chrome/Chromium. Safari doesn't count.
Re: (Score:2)
Safari and Chromium are still both based on KHTML. Not sure to what degree they are similar at all anymore.
containers? (Score:1)
worth it. (Score:2)
> "The Mozilla team expects [...] performance issues for sites loaded in Firefox, but it's willing to take the hit just to improve the privacy of its users."
Just as there is some performance overhead for SSL, for the security and privacy of HTTP connections...
Partitioning to improve privacy protection is most welcome.
Highly Technical? (Score:3)
From TFS:
"Network Partitioning is highly technical"
Network Partitioning will allow Firefox to save resources like the cache, favicons, CSS files, images, and more, on a per-website basis, rather than together, in the same pool
Well, which is it? Is it highly technical? Or is it "Oooh, let me not allow this web site to access anything from cache that's from other sites, even though the sub-resources are the same"?
Sounds like a good idea (Score:2)
Great idea (Score:1)
Firefox doing it right again (Score:2)
Sounds like a good idea to me. The right path once again.