Firefox Follows Chrome and Prepares To Block Insecure Downloads (therecord.media) 79

Mozilla developers are putting the finishing touches on a new feature that will block insecure file downloads in Firefox. From a report: Called mixed content download blocking, the feature works by blocking file downloads initiated from an encrypted HTTPS page but which actually take place over an unencrypted HTTP channel. The idea behind this feature is to prevent Firefox users from being misled by the URL bar into thinking they're downloading a file securely via HTTPS when, in reality, the file could be tampered with by third parties while in transit.
  • by jellomizer ( 103300 ) on Monday August 23, 2021 @11:07AM (#61720741)

    Old XKCD [xkcd.com]
    We have an issue where creating a "trusted" download is getting more difficult to set up and use, while also requiring more expensive and expansive architecture, to the point where the functionality is nearly impractical.

    Yes, I get why Firefox and Chrome do it. However, we need a way to get this stuff from an untrusted site, because sometimes we just need something from a person we know.

    • This just means you can't use http on your already running https site. Also Let's Encrypt is free and takes all of 3 minutes to configure. Seriously. There's no reason not to use https.

      • Bah, I just want Telnet with the ZModem Protocol. XModem is just too old.

        • The PuTTY guy resists rolling in the zmodem patches [greenend.org.uk], despite the patches being available in LePutty and ExtraPutty.

          • He's not resisting; he's saying it's not worth the effort it'd take him, but if someone else wants to submit patches he'll look at it.

            I get the same thing: if I rushed out to respond to every random feature request I'd never get any actual work done. Even if someone submitted patches I'd have to decide whether I want to add thousands of lines of unvetted code for a feature that 0.001% of users will ever use.

        • by Revek ( 133289 )
          Zmodem? Xmodem? Meh. If I'm going old school I'm rockin kermit.
          • I once had to reinstall a Cisco router's OS via XMODEM over the console serial cable. That took forever even though I think the image was only 2MB. This was before even the 1600s. I think it was a 2501. You know, the ones with AUI for Ethernet and an HD serial port to connect to a V.35 CSU/DSU.
      • You try to show a friend of mine how to set up Let's Encrypt in 3 minutes, and I'll buy you a beer.

        But I want to watch. The entertainment value of that failed attempt of yours would be worth way more than the beer.

        • The certbot CLI tool made for Let's Encrypt detects the server you are running and basically automates the process. If you're using Ubuntu it honestly takes less than 3 minutes.

          https://certbot.eff.org/ [eff.org]
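
          To put rough numbers on it, the whole thing on an Ubuntu box running nginx boils down to the commands below. This is only a sketch, not certbot's official docs: the domain and email are placeholders, it assumes the certbot nginx plugin is installed and runs as root, and it's wrapped in Python via subprocess even though you'd normally just type the two commands in a shell.

            # Rough sketch: request and install a Let's Encrypt certificate with certbot.
            # Assumes Ubuntu with nginx and the certbot nginx plugin installed
            # (apt install certbot python3-certbot-nginx); needs root.
            import subprocess

            domain = "example.com"       # placeholder: your real hostname
            email = "admin@example.com"  # placeholder: contact for expiry notices

            # certbot proves control of the domain, fetches the certificate,
            # and edits the nginx config to use it.
            subprocess.run(
                ["certbot", "--nginx", "-d", domain,
                 "--non-interactive", "--agree-tos", "-m", email],
                check=True,
            )

            # Renewal is normally handled by the systemd timer certbot installs;
            # a dry run confirms it will work.
            subprocess.run(["certbot", "renew", "--dry-run"], check=True)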

          • by sjames ( 1099 )

            Great, I have an old switch with an expired self-signed cert. I can't wait to update it so I don't have to use an old browser to talk to it!

            • by vux984 ( 928602 )

              Why do you need an old browser? I still just click through the warning, advanced, accept the risk, and continue.

              Now if the old device is some masterwork of broken IE-specific HTML code with ActiveX plugins, then yeah... time to fire up the old Windows XP VM because it's not worth trying to make it work on anything else, but that's not certificate related.

              • by sjames ( 1099 )

                Every version, the browser gives you fewer options to click through the risk. If a page uses JavaScript that calls XMLHttpRequest, you may not even get a chance to click through; it'll just fail.

                You might want to archive a browser that works on your embedded hardware now against the future when it disappears "for your own good".

                • by vux984 ( 928602 )

                  Sure that's generally good advice if you need to manage old gear, but it's more to do with general compatibility issues than anything to do with certificates.

                  I do see the argument that there should be a special browser for IT admins and developers with a few more overrides and whatnot for security issues, to make dealing with development issues and old equipment simpler, but then that browser shouldn't be too readily available to the general public nor should it be used on the public internet... mayb

                  • by sjames ( 1099 )

                    You started to have the right idea, then minced it with restricted access (how should one get such access?) and restricted IPs, because the browser maker knows best and children shouldn't be allowed to stray from home (or something like that).

                    How about normally being secured and unless the user is ACTUALLY a child, letting them make the adult decision to bypass the protections as needed? Feel free to display a warning featuring the skull and crossbones, Mr. Yuk, and other discouraging iconography. If you must, d

                    • by vux984 ( 928602 )

                      "How about normally being secured and unless the user is ACTUALLY a child, letting them make the adult decision to bypass the protections as needed?"

                      I have no issue with having a secure tool with no bypasses, and making available a separate insecure tool with specific bypasses enabled but restricted to LAN ranges, with its own logo and brand name. There is no good reason to use that online, and differentiating the two products by use case is sensible.

                      And if you really really want the insecure version to use on

                    • by sjames ( 1099 )

                      I still object to the LAN restriction. I am not only all growed up, I am an experienced professional. I don't need my hand held. I'm not going to be bypassing anything on some random site, but if the best I can do to avoid a 4 hour drive (each way) is to very temporarily nail up an inbound NAT, I would like for the browser to obey my commands and just do it so I can fix the damned thing and get back to bed.

                    • by vux984 ( 928602 )

                      "I still object to the LAN restriction. I am not only all growed up, I am an experienced professional."

                      There are millions of old laser printers and whatever else floating around, that average joes need to 'manage'. I view this tool as being as much for them as anything else.

                      "but if the best I can do to avoid a 4 hour drive (each way) is to very temporarily nail up an inbound NAT,"

                      Set up an outbound NAT on your side to alias a LAN IP address to a remote address, or use a VPN, or an HTTP proxy server... yeah

                    • by sjames ( 1099 )

                      Or I could hopefully have a tool that doesn't assume I'm a child.

                    • by vux984 ( 928602 )

                      If the average person didn't act like a child we wouldn't be having this conversation.

              • by jythie ( 914043 )
                Annoyingly enough, modern browsers have been playing with taking away the ability to click through. On OS X I generally have to switch over to Safari to access my NAS because Chrome took the option away.
          • by Luckyo ( 1726890 )

            I see that you haven't tried to teach these "automated processes" to an average user. I'm in on the beer offer with the parent, but I too want to watch the hilarity that will ensue as you fail.

          • Does it automatically detect the fully-qualified domain name of the server? Because a lot of servers on home LANs don't have one, and CAs don't sign certificates for mDNS names that end in .local.

        • Getting your first certificate set up is probably a 3-10 minute job; downloading and waiting for the package installer to finish is the bulk of the time. Getting autorenew to work takes more time if your web server has any unusual configuration. Signing in and renewing every 30 days by hand is a minute's worth of work if you aren't willing or able to autorenew.

      • I would say that if you add reading the documentation, it takes at least an hour.
      • by sjames ( 1099 )

        There are still great reasons not to use HTTPS. For example, embedded devices that may not have a fixed IP or a DNS name.

        • by jythie ( 914043 )
          *nod* I really wish they had separated the two functions better. There are all sorts of situations where you might want the encrypted connection that https provides but not care about the certificate chain. You would think with the rise of IoT devices this would have become a big deal by now.
          • by tepples ( 727027 )

            If you care about the encryption and do not care quite as much about the certificate chain, the usual solution is to run a private CA, issue a certificate, sneakernet the certificate to the server, and configure your devices to trust the private CA.
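
            Not claiming that is anyone's exact workflow, but a minimal sketch of the CA-plus-leaf part, using Python's cryptography package, looks something like the code below. The hostname, lifetimes, and file names are made up, and getting ca.pem into each client's trust store and the key/cert onto the device is still a manual step.

              # Minimal private-CA sketch using the "cryptography" package
              # (pip install cryptography). Names, lifetimes, and file names are
              # placeholders; key storage and trust-store installation are up to you.
              import datetime
              from pathlib import Path
              from cryptography import x509
              from cryptography.x509.oid import NameOID
              from cryptography.hazmat.primitives import hashes, serialization
              from cryptography.hazmat.primitives.asymmetric import rsa

              now = datetime.datetime.utcnow()

              # 1. Self-signed CA certificate.
              ca_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
              ca_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Home Lab CA")])
              ca_cert = (x509.CertificateBuilder()
                  .subject_name(ca_name).issuer_name(ca_name)
                  .public_key(ca_key.public_key())
                  .serial_number(x509.random_serial_number())
                  .not_valid_before(now)
                  .not_valid_after(now + datetime.timedelta(days=3650))
                  .add_extension(x509.BasicConstraints(ca=True, path_length=None), critical=True)
                  .sign(ca_key, hashes.SHA256()))

              # 2. Leaf certificate for the device, signed by the CA.
              dev_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
              dev_name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "nas.lan")])
              dev_cert = (x509.CertificateBuilder()
                  .subject_name(dev_name).issuer_name(ca_name)
                  .public_key(dev_key.public_key())
                  .serial_number(x509.random_serial_number())
                  .not_valid_before(now)
                  .not_valid_after(now + datetime.timedelta(days=825))
                  .add_extension(x509.SubjectAlternativeName([x509.DNSName("nas.lan")]), critical=False)
                  .sign(ca_key, hashes.SHA256()))

              # 3. ca.pem goes to every client; device.pem/device.key go to the device.
              Path("ca.pem").write_bytes(ca_cert.public_bytes(serialization.Encoding.PEM))
              Path("device.pem").write_bytes(dev_cert.public_bytes(serialization.Encoding.PEM))
              Path("device.key").write_bytes(dev_key.private_bytes(
                  serialization.Encoding.PEM, serialization.PrivateFormat.TraditionalOpenSSL,
                  serialization.NoEncryption()))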

            • by jythie ( 914043 )
              Yeah, there are all sorts of workarounds like that which require a bunch of specialized knowledge, time, and resources. It would have been nice if they had just bothered to include a 'yes, I just want to do a key exchange with this server so the channel is encrypted' option like they did with SSH. Could you imagine how much of a pain SSH would be if it followed the same rules as HTTPS and required that kind of hack just to connect to the box down the hall?
      • by joaommp ( 685612 )

        Actually, there are lots of reasons. Not all material needs to be encrypted, because not all material is privacy sensitive. Which means the only thing left to deal with is tampering.

        A better alternative than blocking it altogether - because that mostly renders transparent proxies useless - is to force links to HTTP content in HTTPS transferred sites to be accompanied by a new attribute in the anchor tag with a file signature (or a link to an also HTTPS-only file containing the signatures) that could be che

    • by gweihir ( 88907 )

      However, we need a way to get this stuff from an untrusted site

      That way has existed for a long, long time. Just add and verify a PGP/GnuPG signature. You need to get the public key via a trusted channel and verify it via its signatures or its fingerprint, or, at the very least, get it over a different channel (like a VPN, the Tor browser, or a different internet connection). If you have lower security needs, it may also be enough if the signatures check out for several downloads spread over a longer time.
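
      For what it's worth, the check after download is the trivial part. A minimal sketch, assuming GnuPG is installed and the signer's public key has already been imported and verified out of band (file names are placeholders):

        # Verify a detached PGP/GnuPG signature on a downloaded file.
        # Assumes gpg is installed and the signer's key is already in the keyring
        # and has been verified out of band. File names are placeholders.
        import subprocess

        result = subprocess.run(
            ["gpg", "--verify", "download.tar.gz.sig", "download.tar.gz"],
            capture_output=True, text=True,
        )
        if result.returncode == 0:
            print("Good signature:\n" + result.stderr)  # gpg reports details on stderr
        else:
            print("BAD or unknown signature, do not use the file:\n" + result.stderr)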

      • This update makes sense to me. If the website is HTTPS, the file transfers should be HTTPS too. In this day and age there is no reason why websites aren't fully HTTPS, except for misconfigured webservers or cheap companies trying to scrimp on server resources.
      • 99% of the population won't have the slightest clue what anything you wrote means. Might as well tell them to reverse the polarity of their tachyon emitter array.
        • Might as well tell them to reverse the polarity of their tachyon emitter array.

          That's an expensive way to sign a file, but at least they'll understand it.

    • by Z00L00K ( 682162 )

      Today we can't tell whether a site is trusted or untrusted anyway, regardless of HTTP or HTTPS, because we will never be able to know. There are too many CAs out there, and how can we know whether one of them is compromised in some intricate way?

    • by AmiMoJo ( 196126 )

      Mozilla actually built a service to make sending files easy. It was called Firefox Send, and they killed it last year because nobody was using it. Turns out everyone already has a Google Drive or OneDrive or Dropbox account, and you can download from them without an account, which seems to be the issue the XKCD is about.

    • We have an issue where creating a "trusted" download is getting more difficult to setup and use, also requiring more expensive and expansive architecture.

      No we don't. This is saying you already have the infrastructure and all capabilities, so offer the download the same way. It's saying stop being lazy at the expense of security.

  • by gweihir ( 88907 ) on Monday August 23, 2021 @11:23AM (#61720805)

    No 3rd party will invest much into tampering with files in transit over HTTP. It is just too much effort. Instead, they will redirect to a download server of their own (with an apparently valid certificate) or tamper with the file before it is downloaded. Hence all this does is create a false sense of security.

    The gold standard is, and likely remains, to have a PGP/GnuPG signature on the file and to check that after download.

    • This is what you're looking for

      https://developer.mozilla.org/... [mozilla.org]

      • by gweihir ( 88907 )

        I was not looking for it, but it nicely confirms my assessment. Any integrity verification aimed at detecting tampering that does not use cryptographic signatures is bogus.

        • Not sure why you're obsessed with PGP in this instance. If someone can modify the hash used in the tag they can also generate their own PGP key and sign it themselves. The end result is the same.

          • by gweihir ( 88907 )

            Not sure why you're obsessed with PGP in this instance. If someone can modify the hash used in the tag they can also generate their own PGP key and sign it themselves. The end result is the same.

            You do not seem to understand how cryptographic signatures work. Of course that public key needs to be verified in some way for the signature to have value. A hash cannot be verified.

            • by tepples ( 727027 )

              A hash value that comes from a verified document is as verified as the document it comes from.

              • by gweihir ( 88907 )

                A hash value that comes from a verified document is as verified as the document it comes from.

                It does. But there is no easy way to communicate a verified document, unless there is a cryptographic signature on it and you verify the public key. Verifying public keys is usually easy. Verifying hash values is not.

                Hence the whole idea to do this with a hash only is pretty much bogus.

                • I understand the subresource integrity (SRI) flow as providing the URL and hash of the downloadable file over HTTPS and providing the file itself over clear HTTP. Because the hash comes from a document served over HTTPS, it is a verified document.
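
                  Browsers currently enforce SRI only on script and stylesheet tags, so for a plain file download the check would have to happen on the client side; the arithmetic itself is just this (URL and expected value are placeholders):

                    # Sketch of an SRI-style check: the "sha384-<base64 digest>" value comes
                    # from a page served over HTTPS, the file arrives over plain HTTP, and
                    # the client recomputes and compares. URL and value are placeholders.
                    import base64
                    import hashlib
                    import urllib.request

                    expected = "sha384-REPLACE_WITH_VALUE_FROM_THE_HTTPS_PAGE"

                    with urllib.request.urlopen("http://example.com/file.tar.gz") as resp:
                        data = resp.read()

                    actual = "sha384-" + base64.b64encode(hashlib.sha384(data).digest()).decode()
                    if actual != expected:
                        raise SystemExit("Digest mismatch: file may have been tampered with in transit.")
                    print("Digest matches the value from the HTTPS page.")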

    • But these days you can get multi-million dollar payouts from one breach and, with crypto, get away scot-free (most ACH transfers can be clawed back or traced thanks to decades of laws). That kind of money makes absurd things worthwhile, especially as word gets around about the easier attack vectors.
      • by gweihir ( 88907 )

        The only way to reasonably do this is very close to the client or server. In that case it is often already possible to compromise the client or server directly; no need to tamper with any data in flight to compromise any binary transfers after that.

    • by AmiMoJo ( 196126 )

      Remember that Google has download scanning built into Chrome already. It uses a list of known malware but also a list of known malware domains and URLs. Google collects that data both from its own web crawler and from Chrome users who have malware protection enabled submitting URLs automatically.

      So for Google requiring certificates is very helpful, because it prevents sites impersonating others. It also increases the cost of doing it, and makes it easier to identify groups of malware domains that were regis

      • by gweihir ( 88907 )

        Requiring certificates does not really prevent impersonation. CAs get compromised all the time or get tricked into signing certificates for the wrong people. As an identity-verification means, certificates are not very useful.

        • by AmiMoJo ( 196126 )

          That's why Google stopped using certificates for identity verification.

          That has nothing to do with this issue though. Getting a certificate for a domain requires a few basic things and some time. Once identified as malware, it's easy for the CA to cancel batches of certificates linked to an email address, or to an IP address that created them all. Nothing bulletproof, but it all adds to the hassle that is required to distribute malware.

    • Comment removed based on user account deletion
  • by MrL0G1C ( 867445 ) on Monday August 23, 2021 @11:28AM (#61720841) Journal

    when, in reality, the file could be tampered with by third parties while in transit.

    Does this ever happen? Wi-Fi is encrypted, although admittedly the older protocols are no longer safe. But anyhow, since when does anyone ever do man-in-the-middle attacks, other than the gov't secret services, and this won't stop them anyway.

    This 'feature' is nannying and should be easily overridden. If you don't trust the network then you can always encrypt files before you send them.

    Idiots will still download trojan pron.zip.exe files regardless of whether you encrypt the network protocol.

    • Re:Trust (Score:4, Interesting)

      by _0x0nyadesu ( 7184652 ) on Monday August 23, 2021 @11:43AM (#61720923)

      This is actually way worse than someone opening a trojan, because you can hack people who do everything right and are seemingly unaware that their DNS has been hijacked.

      If someone overrode the local DNS for, say, Google's domain to point to some local IP instead of the real Google server, they could introduce arbitrary JavaScript into a real script and simply forward all requests to the real server, then start stealing session cookies, among other things, by appending some of their own JavaScript to the official code.

      The costs to accomplish this are low as well. You only need to tap into an Ethernet jack and spoof some other MAC address on the network that you happen to see nearby. You could even take over DHCP duties by answering broadcasts first and make machines think you're also the real DNS server.

      I'm sure there are tons of engineers who think like you working at big companies who don't want to HTTPS their local websites, but you do so at great risk if you have any public-facing terminals... like, say, at an airport or a grocery store chain.

      • by ftobin ( 48814 )

        DNS hijacking isn't really an issue with Firefox, since it does DoH.

        • Translation: ... since it already is hijacked.

          I got my own DNS server, thank you very much for ruining it, Mozilla!

  • by PPH ( 736903 )

    I already know how to verify checksums with a signed certificate. If I want to fetch something using HTTP or FTP, I can just use my dedicated up/download client and bypass Chrome/Firefox altogether.

  • by Anonymous Coward

    "Directly accessed HTTP download links (copy-pasted in the Firefox address bar) will not be blocked."

  • by oldgraybeard ( 2939809 ) on Monday August 23, 2021 @12:55PM (#61721379)
    "Firefox Follows Chrome"
    • Yeah, fuck Chrome. All other browsers should stand their ground. Heck, go the opposite direction. Remove TLS 1.3 support. Disable warnings over HTTPS; we can't be seen doing anything security related, that would be too much "Chrome".

      • No, remove HTML5 support! And tabs!

        As I've said for years: the browser is just a worse virtual machine at this point.
        Just run a virtual machine with paged-out cached snapshots of several operating systems, that when you enter a URL, are cloned in milliseconds with copy-on-write, get the file downloaded from the URL mounted as a drive, and the executable inside autostarted. And have a *two-level* task bar. So not a separate tab bar somewhere else, but a task bar that has an always-visible subwindow bar on top o

        • OK, to be frank: last time I posted this idea, I did two evenings of development afterwards, and half of the above already works. I only have to migrate to a proper hypervisor mode because otherwise QEMU's tools are annoying and badly documented.
          All that's missing now is the URL bar. (Easy: a borderless terminal window with a shell script and fixed positions set in the window manager will do.) And that two-level tab bar. (The graphics side of that is already done. Only the controller/input is missing.)
          And

  • This is perfectly fine as long as it can be disabled. However, Firefox is getting very close to being totally unusable with the last update, which really and truly fucks up the UI beyond recognition and renders it completely unusable (and cannot be disabled).

    Mozilla Firefox is on life support and will disappear into the annals of history shortly.

    --
    The only good politician is a dead politician

    • No, "opt-out" is, legally, never perfectly fine. *Opt-IN* would be perfectly fine.

      Just like you *first* ask your friend before you decide to lock up his back door in his own home. Might be a good idea, but you're not his dominatrix, now are you?

  • If you are standing up against being "tampered with by third parties while in transit," read more about Mozilla's decision [mozilla.org] and Cloudflare problems [fuwafuwa.moe].
  • The fallacy behind the entire thing is that Mozilla just arrogantly assumes it knows what is to be trusted, and knows better than I do, too.

    Typical of such companies. They don't want to help you. They want to *dominate* you!

    It's called a God complex.

  • It just breaks shit without any benefit.
  • First they remove support for FTP [slashdot.org].

    Now they remove support for HTTP.

    They are making it impossible to directly send/receive files between parties without needing to utilize a 3rd party source. This introduces as much vulnerability as it solves but also maximizes inconvenience and difficulty while minimizing privacy.

    But that may be the point. Browsers have long supported the trend away from direct internet access and towards dependency on walled gardens. And to think we once thought AOL confusing the li
