
Firefox To Ship 'Network Partitioning' As a New Anti-Tracking Defense (zdnet.com) 65

An anonymous reader quotes a report from ZDNet: Firefox 85, scheduled to be released next month, in January 2021, will ship with a feature named Network Partitioning as a new form of anti-tracking protection. The feature is based on "Client-Side Storage Partitioning," a new standard currently being developed by the World Wide Web Consortium's Privacy Community Group. "Network Partitioning is highly technical, but to simplify it somewhat: your browser has many ways it can save data from websites, not just via cookies," privacy researcher Zach Edwards told ZDNet in an interview this week. "These other storage mechanisms include the HTTP cache, image cache, favicon cache, font cache, CORS-preflight cache, and a variety of other caches and storage mechanisms that can be used to track people across websites." Edwards says all these data storage systems are shared among websites.

The difference is that Network Partitioning will allow Firefox to save resources like the cache, favicons, CSS files, images, and more, on a per-website basis, rather than together, in the same pool. This makes it harder for websites and third-parties like ad and web analytics companies to track users since they can't probe for the presence of other sites' data in this shared pool. The Mozilla team expects [...] performance issues for sites loaded in Firefox, but it's willing to take the hit just to improve the privacy of its users.
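
To make "on a per-website basis" concrete, here is a hedged sketch of the idea (not Firefox's actual code; the class and key scheme are invented for illustration): the cache lookup key includes the top-level site, so one site can never observe what another site has already cached.

```typescript
// Illustrative sketch only -- not Firefox's implementation.
type CacheEntry = { body: Uint8Array; storedAt: number };

class PartitionedCache {
  private entries = new Map<string, CacheEntry>();

  // Partitioned key: (top-level site, resource URL) instead of URL alone.
  private key(topLevelSite: string, resourceUrl: string): string {
    return `${topLevelSite}|${resourceUrl}`;
  }

  get(topLevelSite: string, resourceUrl: string): CacheEntry | undefined {
    return this.entries.get(this.key(topLevelSite, resourceUrl));
  }

  put(topLevelSite: string, resourceUrl: string, entry: CacheEntry): void {
    this.entries.set(this.key(topLevelSite, resourceUrl), entry);
  }
}

// "news.example" and "shop.example" now hold separate copies of the same
// tracker pixel, so a probe on one site learns nothing about the other.
```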

  • Stop trying to figure out how each website is tracking you and simply segregate everything.

    • by Anonymous Coward

      this makes it harder for websites and third-parties like ad and web analytics companies to track users since they can't probe for the presence of other sites' data in this shared pool.

      Why are they able to do that at all? Why is your browser allowing websites to "probe your cache"? THAT is the real problem here.

      • Re: Great Idea (Score:5, Informative)

        by BAReFO0t ( 6240524 ) on Monday December 21, 2020 @11:15PM (#60855822)

        Uum, maybe you should first understand the basics before talking loudly.

        When a page contains an image and the client requests the page but not the image, they assume the image is cached. When the image is requested, they assume it is not. That is all that "probing" is.
        The trick is to make those includes unique to you and not blocked by ad blockers, which is not hard if you randomize the paths and code enough and do it in obfuscated scripts that ad blockers can't block without breaking the site. Essentially the same techniques as polymorphic viruses.

        • By the way, they used to probe your internal network the same way. Request https://yourrouter/ [yourrouter] internally, and if you just happen to be logged in in another tab, they'd be in too, and install an external login, and configure a MITM. Apart from many other fun things.

        • You don't even have to randomize the path for image caching. Just encode the unique identifier in the image pixels and tell the browser to cache it. Then use Canvas API to read the pixels back into data.

          I'm not sure if this is actually being done. So hopefully I didn't just invent something new for them to use.
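
          For what it's worth, a rough sketch of what the reading side would look like (purely illustrative; the function name and the "one byte per pixel" encoding are assumptions, and as a reply below notes, getImageData() only works if the image is same-origin or CORS-approved):

          ```typescript
          // Decode an identifier hidden in the pixels of a cacheable image.
          // Requires the image to be same-origin or served with CORS headers
          // and loaded with crossOrigin = "anonymous"; otherwise the canvas is
          // "tainted" and getImageData() throws a SecurityError.
          function decodeIdFromImage(url: string): Promise<string> {
            return new Promise<string>((resolve, reject) => {
              const img = new Image();
              img.crossOrigin = "anonymous";
              img.onload = () => {
                const canvas = document.createElement("canvas");
                canvas.width = img.width;
                canvas.height = img.height;
                const ctx = canvas.getContext("2d")!;
                ctx.drawImage(img, 0, 0);
                // Assume each pixel's red channel carries one byte of the ID.
                const pixels = ctx.getImageData(0, 0, img.width, 1).data;
                let id = "";
                for (let x = 0; x < img.width; x++) {
                  const byte = pixels[x * 4]; // red channel
                  if (byte === 0) break;      // zero byte terminates the ID
                  id += String.fromCharCode(byte);
                }
                resolve(id);
              };
              img.onerror = reject;
              img.src = url; // long cache lifetime, unique pixels per visitor
            });
          }
          ```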

          • You basically described ETags [wikipedia.org]. Not only is it already being done, it's also already being exploited to track users who've disabled cookies and JS.
            • Except this would bypass any attempt at obfuscating or circumventing ETag tracking.

            • I'd like to point out that none of these methods work across sessions if you disable the on-disk web cache, which is a harmless thing to do and, provided you have a reasonably fast internet connection, does not slow down your web browsing. Caching web content to disk is unnecessary. The "network partitioning" in the next Firefox version is still a good thing, because it also segregates web sites during a session, but it will make the cache less effective, so you might as well disable the on-disk web cache.

              • by Bengie ( 1121981 )
                On my at-home fixed-line internet, many web sites load faster with all caching disabled. Just pop up dev mode, check "disable caching," and voila: *up to* 20% faster for some sites. Even Slashdot is about 10% faster. I disabled on-disk caching to reduce SSD wear and wasted space, and most of the cached content of the main sites I load gets "raced" and the network generally wins.

                I haven't disabled all caching because that would be wasteful for site bandwidth. And hyper-optimizing for low latency sites can bite
                • Have you tried using a RAM disk as the browser cache? In Linux it's easy: just create an entry in /etc/fstab with tmpfs as the file system and state the amount of memory to be allocated to the RAM disk, then point the browser's cache to the ramdisk.
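
                  For anyone who wants to try it, an /etc/fstab entry along these lines does it (the mount point and size here are just examples):

                  ```
                  # example only -- adjust the mount point and size to taste
                  tmpfs   /home/user/.cache/browser-ramdisk   tmpfs   defaults,size=512M,mode=0700   0 0
                  ```

                  Then point the browser's disk cache at that mount point (in Firefox that is, if memory serves, the browser.cache.disk.parent_directory preference in about:config).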

                  • by Bengie ( 1121981 )
                    The browser cache is fundamentally slower than my network. That's the issue. Doesn't matter how you back the data. Firefox has an in-memory cache, optimized for being in the application's memory, and even that is slower than my internet connection. Either Firefox has some horrible cache algorithm, or the complexities of properly caching resources are such that the CPU time spent is greater than the IO costs.
          • by _xeno_ ( 155264 )

            Then use Canvas API to read the pixels back into data.

            They actually thought of this when designing getImageData [mozilla.org] so you can't get the pixel data for an image loaded from a third party domain - well, almost.

            The actual concern was that images could contain sensitive information and that allowing canvas to access third-party images could lead to cross-site scripting attacks using the canvas API. So both the source page and the destination web service are allowed to whitelist images that the canvas API can read [mozilla.org]. (That is, both have to allow it for it to be accessible.)

        • by higuita ( 129722 )

          It can also check how long it took to load the image: if it was quick, it is cached; if it was slow, it was not. You can even play with DNS and the HTTP server to produce different timings, test many resources, one per tracked user/ID, and even find out whether some users are "related" to others (share the same network/computer).
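
          A rough illustration (not any particular tracker's code; the URL and threshold below are made up) of how simple such a timing probe can be:

          ```typescript
          // Guess whether a resource is already in the cache from how long it
          // takes to fetch. Real-world probes have to cope with far more noise
          // than this; the threshold here is arbitrary.
          async function probablyCached(url: string, thresholdMs = 20): Promise<boolean> {
            const start = performance.now();
            // "no-cors" lets the request complete for cross-origin resources;
            // only the elapsed time matters, not the (opaque) response body.
            await fetch(url, { mode: "no-cors" });
            return performance.now() - start < thresholdMs; // fast => likely cached
          }

          // e.g. has this visitor plausibly been to other-site.example before?
          // probablyCached("https://other-site.example/logo.png").then(console.log);
          ```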

        • by DrXym ( 126579 )
          Actually it's more like this.

          Site A embeds "http://evilbullshittracker.com/tracker.gif", which is issued with some cache-control headers including a unique ETag, e.g. "abcd1234". Now when I go to site B, which also embeds that URL, the browser asks evilbullshittracker.com whether the ETag "abcd1234" associated with the cached image is still good. It doesn't matter what the response is, because evilbullshittracker.com now knows I visit sites A and B and can aggregate its data.
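
          To illustrate (a minimal sketch, not any real tracker's code; the endpoint, port, and logging are assumptions), the server side of that ETag trick could look roughly like this in Node:

          ```typescript
          // First visit: hand out a random ETag with the tracking pixel.
          // Later visits from ANY embedding site: the browser's If-None-Match
          // header echoes that ETag back, re-identifying the visitor without cookies.
          import { createServer } from "node:http";
          import { randomUUID } from "node:crypto";

          const TRANSPARENT_GIF = Buffer.from(
            "R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7", "base64");

          createServer((req, res) => {
            const seenBefore = req.headers["if-none-match"] as string | undefined;
            const visitorId = seenBefore ?? `"${randomUUID()}"`;

            // Logging (visitor, referring site) pairs is what makes this tracking.
            console.log(visitorId, req.headers["referer"]);

            res.writeHead(seenBefore ? 304 : 200, {
              "Content-Type": "image/gif",
              "ETag": visitorId,
              "Cache-Control": "no-cache", // i.e. revalidate on every use
            });
            if (seenBefore) res.end(); else res.end(TRANSPARENT_GIF);
          }).listen(8080);
          ```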

      • Re:Great Idea (Score:5, Informative)

        by _xeno_ ( 155264 ) on Monday December 21, 2020 @11:25PM (#60855846) Homepage Journal

        Why is your browser allowing websites to "probe your cache"?

        Because it can't avoid it.

        First, the most obvious: if the client has to check with the server whether the resource is stale, then it has to contact the server. That contact alone can be used for tracking purposes.

        Assuming you can force a revalidation, you can use the ETag [mozilla.org] to effectively set a "cookie" on a cached item so that you can uniquely determine what user cached a file. (The ETag is supposed to be an opaque value indicating to the server the version of the resource that was last seen by the client. So, for example, it could be an SHA1 hash of the resource. However, because it's opaque to the client, it doesn't mean anything and can be set to literally anything.)

        Even without the ETag, you can still make cache hits unique per visitor, by sending unique "last modified" timestamps per user.

        Of course, because of these kinds of cache games, a lot of browsers simply don't revalidate resources as frequently as HTTP says they should. But if you can execute JavaScript, there are other games you can play.

        If all you want to do is know if the user has ever loaded a given resource before, you can simply time the request. Happens immediately? Probably cached. Takes a bit? Probably not.

        You can slightly change the CSS for each user for a cached stylesheet and then measure the layout in JavaScript, allowing you to fingerprint through the cache.

        And there are probably more tricks out there to track someone via a cached shared resource. The only real solution is cache partitioning - which is what Firefox is doing.
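
        As a sketch of that CSS trick (illustrative only; the #probe selector and the idea of encoding the ID as a width are assumptions on my part): suppose the long-lived, cached stylesheet served to this visitor contains a per-user rule such as "#probe { width: 137px; }". Any page that reuses the cached stylesheet can read the value back by measuring layout:

        ```typescript
        // Measure the width the cached, per-visitor stylesheet assigns to #probe.
        function readCssFingerprint(): number {
          const probe = document.createElement("div");
          probe.id = "probe";                // selector targeted by the cached CSS
          probe.style.position = "absolute"; // keep it out of the normal flow
          probe.style.visibility = "hidden";
          document.body.appendChild(probe);
          const width = probe.getBoundingClientRect().width;
          probe.remove();
          return width; // e.g. 137 => the visitor the stylesheet was generated for
        }
        ```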

        • I can think of several other "real solutions", but they all involve doing things similar to what Firefox is doing in a heavier handed/dumber way.

        • The only real solution is cache partitioning - which is what Firefox is doing.

          Yep. It also has the side effect of making the buggers pay for bigger servers to handle the extra traffic, i.e. it'll cost them money to try to track people this way.

          • Don't forget the users! They have to pay for a bigger data allowance when the content can't be cached!

            What are the global warming effects of all corporate fatcats simultaneously rubbing their hands (paws) at the prospect of more cash?
        • Are resources shared between private and normal browsing? Are resources shared between different private browsing windows?

          • by _xeno_ ( 155264 )

            Are resources shared between private and normal browsing?

            No. My understanding of how every browser implements private browsing is that the private session basically gets a temporary in-memory cache and storage space, so things like local storage "work" for the duration of the private session, but once the private session ends, nothing is kept. Nothing is shared between the private and normal sessions.

            Are resources shared between different private browsing windows?

            That would be browser specific, and honestly, I have no clue! I'm fairly sure the initial implementation from Chrome had just the single "private session" that would

      • There are a lot of reasons depending on which cache you're talking about. In general, however, they reduce latency and page load times by allowing sites to "share" commonly-used resources. They also simplify access to resources that are not shared, but which would create a filesystem mess of ponderous proportions were they all stored separately.

        I think Firefox is going to take a serious performance hit with this. It will make running multiple tabs in Chrome seem lightweight by comparison.

      • Caching was essential in the early days of the internet, but today it's no longer critical. With the segregation there will be a performance penalty for less-visited sites, while your most-visited sites will behave normally.
        Just neutralize the user agent string too, in order to prevent blocking of Firefox. Otherwise we will see splash screens saying 'unsupported browser' and site access denied due to this.

        • Re: Great Idea (Score:5, Interesting)

          by dromgodis ( 4533247 ) on Tuesday December 22, 2020 @02:59AM (#60856066)

          Otherwise we will see splash screens saying 'unsupported browser' and site access denied due to this.

          That, together with "this site is not available from your region" (due to GDPR), seems like a good warning sign that this is a site you'll want to avoid.

    • My throttled mobile connection is weeping though.

      Those legitimately shared resources are shared for a reason: they are large!
      Have fun downloading that 5MB font file for every site on an 8kB/s connection when your data cap runs out. Which it will now do earlier, too.

      • My throttled mobile connection is weeping though.

        Those legitimately shared resources are shared for a reason: they are large!
        Have fun downloading that 5MB font file for every site on an 8kB/s connection when your data cap runs out. Which it will now do earlier, too.

        Caching will still work, just not between sites.

        • Yeah, redownload the complete jQuery framework for every single site that uses it...

          • So the outcome might be... sites stop serving up a load of ****ing code they don't need!
            • Not in this universe.

            • by DarkOx ( 621550 )

              No it won't. Like Barefoot, I live my online life on a metered connection. The trend with literally everything is basically to ignore the needs of anyone who does not have a data cap in the 1TB-and-up range. We are basically treated like we don't exist.

              Updates - fuck you, you'll download them when we say so! Many systems no longer offer the option to pick and choose only the critical ones, or to let me keep using the system and wait until the end of my billing cycle.

              Patches - fuck you, no binary patches for things like Office

              • Your browser could REQUEST the images regardless of whether they are cached or not. Just because you request them doesn't mean you have to USE them in the page (or even continue the TCP connection for them).
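
                Pages can already ask for exactly that through the Fetch API's cache modes; a small illustration (the URL is made up):

                ```typescript
                // cache: "reload" bypasses the HTTP cache for this request but still
                // stores the fresh response; "no-store" neither reads nor writes it.
                async function fetchIgnoringCache(url: string): Promise<Response> {
                  return fetch(url, { cache: "reload" });
                }

                // fetchIgnoringCache("https://example.com/tracker.gif");
                ```
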
          • Maybe it can be smart enough to allow cached files from non-spying CDN networks.

      • Whether you're a data centre or a mobile ISP, it's win-win!
    • by Tablizer ( 95088 )

      Uh, that's what the Titanic tried...

    • by higuita ( 129722 )

      That is the same as having one process and one profile for each site. As each site loads several resources from other sites, you would end up with more than 50 browsers to load one site. Not only is that a performance, CPU, and memory hit, but then you have to make them talk to each other. That is what you are suggesting...
      What Mozilla is trying to do is the same, but using threads and dividing into blocks, as few as possible, while still isolating sites and features, both for tracking and for security...

    • by gweihir ( 88907 )

      Stop trying to figure out how each website is tracking you and simply segregate everything.

      That would fix the issue, at least as far as that is possible when everything comes from a single IP. I think they do not actually want to fix the problem; they want to appear to be fixing it.

  • I expect this to still use hardlinks/pointers internally, to not store everything many times. Especially in memory.
    Just keep track of who you are telling this file is already cached.

    Oh, and regarding "highly technical"... What do you think this is? News for hairdressers?

  • by Rosco P. Coltrane ( 209368 ) on Monday December 21, 2020 @11:44PM (#60855870)

    Firefox now saves cached data in separate directories.

    • ...and why wasn't this already a "thing"?

      From a design standpoint this would have been the default implementation, so there must be some reason why it wasn't; I'd like to know why.

      • Cross-site stuff. A page uses jQuery? Most pages don't host it themselves; they just link it from Google. Now, instead of grabbing a ready copy from the cache assigned to Google, Firefox will redownload it and store a copy in that site's own folder. Similarly for common fonts, themes, etc. DeviantArt has millions of artists, each with their own name.deviantart.com domain; each time you visit a new one, you redownload the entire layout, backgrounds, and all the site scripts, because you can't just pull them from the common deviantart.com cache.

        • Re:So in short (Score:4, Interesting)

          by AmiMoJo ( 196126 ) on Tuesday December 22, 2020 @09:06AM (#60856506) Homepage Journal

          This is because of cname cloaking that makes 3rd party ad servers look like subdomains of the real site. It's widespread and must be blocked.

          The small performance hit is worth it most of the time. You can locally cache popular frameworks using an add-on called Decentraleyes.

        • Ah, that makes sense. I still don't get why this wasn't the default behavior, however. Sure, it'll take up more disk space, but I can't see how it would slow page loads (beyond the first download, of course), plus the security it'd bring would be worth it.

  • to just link your cache at /dev/null

  • Firefox is the only serious browser left. Pretty much everything else is Chrome/Chromium. Safari doesn't count.

    • Safari and Chromium are still both based on KHTML. Not sure to what degree they are similar at all anymore.

  • How does this compare to containers? Is it like a container per site?
  • > "The Mozilla team expects [...] performance issues for sites loaded in Firefox, but it's willing to take the hit just to improve the privacy of its users."

    Just as there is some performance overhead from SSL for the security and privacy of an HTTP connection.

    Partitioning to improve privacy protection is most welcome.

  • by lsllll ( 830002 ) on Tuesday December 22, 2020 @03:11AM (#60856090)

    From TFS:

    "Network Partitioning is highly technical"

    Network Partitioning will allow Firefox to save resources like the cache, favicons, CSS files, images, and more, on a per-website basis, rather than together, in the same pool

    Well, which is it? Is it highly technical? Or is it "Oooh, let me not allow this web site to access anything from cache that's from other sites, even though the sub-resources are the same"?

  • Storage impact should be minimal (it can use reference counting with a hash lookup), but it's likely to use more network traffic, because even if there is a copy of some file in the cache for one site, the browser will have to fetch it again for the other in order to know whether it's the same file.
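
    A hedged sketch of that storage idea (not how Firefox actually lays out its cache; the names are invented): store each unique blob once, addressed by a content hash, and let the per-site partitions hold reference-counted pointers into it.

    ```typescript
    // Partitioned cache keys without duplicated cache storage.
    import { createHash } from "node:crypto";

    class DedupedStore {
      private blobs = new Map<string, { data: Uint8Array; refCount: number }>();
      private index = new Map<string, string>(); // "partition|url" -> content hash

      put(partition: string, url: string, data: Uint8Array): void {
        const hash = createHash("sha256").update(data).digest("hex");
        const blob = this.blobs.get(hash);
        if (blob) {
          blob.refCount++;               // identical bytes are stored only once
        } else {
          this.blobs.set(hash, { data, refCount: 1 });
        }
        this.index.set(`${partition}|${url}`, hash);
      }

      get(partition: string, url: string): Uint8Array | undefined {
        const hash = this.index.get(`${partition}|${url}`);
        return hash ? this.blobs.get(hash)?.data : undefined;
      }
    }

    // The network cost remains, though: each partition still has to download the
    // file once before the browser can discover the bytes were already on disk.
    ```
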
  • This is another great idea and one more stone on the long path to privacy. Having a fast computer, I will still shut down my browser when navigating from one website to another.
  • Sounds like a good idea to me. The right path once again.
