1. 15
  1.  

  2. 14

    Google can’t track you on FTP, and also AMP is not needed there =)

    1. 0

      pretty much anyone can track you on FTP, it’s an unencrypted protocol.

      1. 5

        Using cookies? I don’t think so. Unencrypted means your ISP can track you, yes.

        1. 3

          I don’t think I can, since I (1) don’t work at your ISP or the server’s and (2) I’m not in the neighbourhood. Feel free to prove me wrong. ;)

      2. 4

        I’m happy to see FTP die. But aren’t some websites still providing download links over FTP? I think it was just a year ago that I noticed I was downloading an ISO file from an FTP server…

        1. 9

          There’s nothing wrong with downloading an ISO from an FTP server. You can verify the integrity of a download (as you should) independently of the mechanism (as many package managers do).
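The independent check described above is a few lines of code. A minimal sketch in Python (the file contents and the “published” checksum are made up here so the example is self-contained; in practice the checksum must come from a channel you trust, not from the same server that served the file):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of the given bytes."""
    return hashlib.sha256(data).hexdigest()

def verify_download(data: bytes, published_hex: str) -> bool:
    """Compare a download against a checksum obtained out-of-band,
    e.g. from the project's HTTPS site or a signed release note."""
    return sha256_of(data) == published_hex.lower()

# In-memory stand-in for a downloaded ISO, to keep the sketch runnable.
iso_bytes = b"pretend this is an ISO image"
good = sha256_of(iso_bytes)  # what the project would publish

assert verify_download(iso_bytes, good)
assert not verify_download(iso_bytes + b"tampered", good)
```

This is exactly why the transfer mechanism doesn’t matter for integrity: the digest check happens on the bytes you ended up with, regardless of whether they arrived over FTP, HTTP, or HTTPS.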

          1. 4

            I agree! The same goes for downloading files from plain HTTP, as long as you verify the download you know the file is okay.

            The reason I don’t like FTP has to do with the mode of operation; port 21 as control channel and then a high port for actual data transfer. Also the fact that there is no standard for directory listings (I think DOS-style listings are the most common?).
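The listing problem is visible in client code: with no standard LIST format, clients have to sniff which dialect a server speaks. A stdlib-only sketch of that guessing game (the sample lines are invented for illustration; real servers vary even more, which is why RFC 3659 later added the machine-readable MLSD command):

```python
import re

# Two common LIST dialects (sample lines are made up):
unix_line = "-rw-r--r--   1 ftp  ftp   733882368 Mar 14 2020 distro.iso"
dos_line = "03-14-20  09:22AM            733882368 distro.iso"

def parse_listing_line(line: str):
    """Best-effort parse of one LIST line; returns (name, size) or None."""
    # Unix ls(1) style: perms, links, owner, group, size, 3 date fields, name
    m = re.match(
        r"^[-dl][rwxst-]{9}\s+\d+\s+\S+\s+\S+\s+(\d+)\s+\S+\s+\S+\s+\S+\s+(.+)$",
        line,
    )
    if m:
        return m.group(2), int(m.group(1))
    # DOS/IIS style: date, time, size (or <DIR>), name
    m = re.match(r"^\d{2}-\d{2}-\d{2}\s+\S+\s+(\d+|<DIR>)\s+(.+)$", line)
    if m and m.group(1) != "<DIR>":
        return m.group(2), int(m.group(1))
    return None
```

Real FTP clients carry several more heuristics like these, one per server family — precisely the kind of guesswork a standardized listing format would have avoided.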

            1. 2

              The reason there’s no standard for directory listings is possibly more to do with the lack of convention on filesystem representation as it took off. Not everything uses the same delimiter, and not everything with a filesystem has files behind it (e.g. Z-Series).

              I absolutely think that in the modern world we should use modern tools, but FTP’s a lot like ed(1): it’s on everything and works pretty much anywhere as a fallback.

              1. 1

                If you compare FTP to ed(1), I’d compare HTTP and SSH to vi(1). Those are also available virtually anywhere.

                1. 1

                  According to a tweet by Steven D. Brewer, it seems that at least modern Ubuntu rescue disks ship only nano, not ed(1) or vi(1)/vim(1).

                  1. 1

                    Rescue disks are a special case: space is at a premium.

                    My VPS running some Ubuntu version does return output from man ed. (I’m not foolish enough to try running ed itself; I quite like having a usable terminal.)

              2. 1

                Yes, FTP is a vestige of a time when there was no NAT. It was good until the 90s and has been terrible ever since.

              3. 1

                Most people downloading files over FTP using Chrome don’t even know what a hash is, let alone how to verify one.

                1. 1

                  That’s not really an argument for disabling FTP support. That’s more of an argument for implementing some form of file hash verification standard tbh.

                2. 1

                  There is everything wrong with downloading an ISO over FTP.

                  Yeah, you can verify the integrity independently. But it goes against all security best practice to expect that users will do something extra to get security.

                  Security should happen automatically whenever possible. Not saying that HTTPS is the perfect way to guarantee secure downloads. But at the very least a) it works without requiring the user to do anything special and b) it protects against trivial man in the middle attacks.

                  1. 1

                    > But it goes against all security best practice to expect that users will do something extra to get security.

                    Please don’t use the term “best practice”; it’s a weasel term that makes me feel ill. I can get behind the idea that expecting users to independently verify integrity is downright terrible UX. But it’s not an unrealistic expectation that the user is aware of an integrity failure, and it’s also not unrealistic that the user has to act specifically to gain some demonstrable level of security (in this case, integrity).

                    To go further, examples that expect users to do something extra to get security (for some values of security) include:

                    1. PGP
                    2. SSH
                    3. 2FA

                    > Security should happen automatically whenever possible.

                    And indeed, it does. Even over FTP

                    > Not saying that HTTPS is the perfect way to guarantee secure downloads

                    That’s good because HTTPS doesn’t guarantee secure downloads at all. That’s not what HTTPS is designed for.

                    You’ve confused TLS (a transport security mechanism) with an application protocol built on top of TLS (HTTPS), and what it does with the act of verifying a download (which it doesn’t do). The integrity check in TLS exists for the connection, not the file. It’s a subtle but important difference. If the file itself is compromised (e.g. somewhere along the chain of trust, or through just being a malicious file), then TLS won’t help you. When integrity is important, that integrity check needs to occur on the thing requiring integrity.

                3. 7

                  You got it backwards.

                  Yeah, some sites still offer FTP downloads, even for software, aka code that you’re going to execute. So it’s a good thing to create some pressure for them to switch to a more secure download method.

                  1. 9

                    Secure against what? Let’s consider the possibilities.

                    Compromised server. Transport protocol security is irrelevant in that case. Most (all?) known compromised download incidents are of this type.

                    Domain hijacking. In that case nothing prevents the attacker from also generating a cert that matches the domain; the user would have to verify the cert visually and know what the correct cert is supposed to be, so in practice that attack is undetectable.

                    MitM attack that directs you to a wrong server. If it’s possible in your network or you are using a malicious ISP, you are already in trouble.

                    I would rather see Chrome stop sending your requests to Google when it thinks a hostname isn’t real. The immense effort required to support FTP drains all their resources and keeps them from making this simple improvement, I guess.

                    1. 1

                      > MitM attack that directs you to a wrong server. If it’s possible in your network or you are using a malicious ISP, you are already in trouble.

                      How so? (Assuming you mostly use services that have basic security, aka HTTPS.)

                      What you call “malicious ISP” can also be called “open wifi” and it’s a very common way for people to get online.

                      1. 1

                        The ISP must be sufficiently malicious to know exactly what you are going to download and to set up a fake server with modified but plausible-looking versions of the files you want. An attacker with a laptop on an open wifi network doesn’t have the resources to do that.

                        Package managers already have signature verification built in, so the attack is limited to manual downloads. Even with the resources to set up fake servers for a wide range of projects, an attacker could wait a long time for the attack to succeed.
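The flow package managers use can be modeled in a few lines: the repository publishes a manifest of per-file checksums, and the manifest itself is signed, so tampering with either the package or the manifest is detected. This is a hypothetical sketch, not any real package manager’s code; real tools (apt, dnf, pacman) use asymmetric signatures such as GPG, and the HMAC with a pre-shared key here is a stdlib-only stand-in for that step:

```python
import hashlib
import hmac

# Stand-in for a real signing keypair (loudly hypothetical).
KEY = b"pre-shared signing key, standing in for a GPG keypair"

def sign_manifest(manifest: str) -> str:
    """'Sign' the manifest. Real repos use asymmetric signatures."""
    return hmac.new(KEY, manifest.encode(), hashlib.sha256).hexdigest()

def verify_package(pkg: bytes, name: str, manifest: str, sig: str) -> bool:
    # 1. Verify the manifest's signature first...
    if not hmac.compare_digest(sign_manifest(manifest), sig):
        return False
    # 2. ...then check the package against its manifest entry.
    entries = dict(line.split() for line in manifest.splitlines())
    return entries.get(name) == hashlib.sha256(pkg).hexdigest()

# A tiny one-package "repository":
pkg = b"fake package contents"
manifest = f"demo.pkg {hashlib.sha256(pkg).hexdigest()}"
sig = sign_manifest(manifest)

assert verify_package(pkg, "demo.pkg", manifest, sig)
assert not verify_package(b"tampered", "demo.pkg", manifest, sig)
```

Because only the small manifest needs a signature check, the per-package verification stays cheap, and a mirror (or a fake FTP server) that alters a package can’t produce a matching signed manifest.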