1. 62

  2. 30

    It implements its own scrolling behavior on iOS

    This should honestly be a capital crime. I mean, stop it. My OS handles scrolling. All of my apps scroll the same way. Don’t go fuck it up because you have opinions™ about how scrolling should work.

    1. 11

      Apparently that is not the entire story: https://news.ycombinator.com/item?id=14384938

      Doesn’t change the fact that AMP should not be used though.

      1. 8

        Technically, it seems it’s the result of putting content into an overflow: hidden div and some other whatnot. But the result is similar. It’s a web page not constructed like other web pages, and thus it behaves unlike other pages. Tapping the top of the screen to scroll up doesn’t work when the entire page is crammed into a single element one screen high, because technically you’re already at the top of the page.
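
        Here is a rough way to see the pattern from a browser console, as a sketch only: it assumes the page keeps the document itself from overflowing and hands scrolling to a roughly viewport-sized wrapper, and the element search and thresholds are illustrative, not anything AMP-specific.

        ```typescript
        // Does the document scroll, or does a viewport-sized wrapper own the
        // overflow (the pattern that breaks iOS "tap the status bar to scroll to top")?
        function findScrollWrapper(): HTMLElement | null {
          const doc = document.scrollingElement ?? document.documentElement;
          // If the document itself overflows, scrolling is the normal kind.
          if (doc.scrollHeight > window.innerHeight + 1) return null;
          // Otherwise look for a roughly viewport-sized child that scrolls instead.
          for (const el of Array.from(document.body.querySelectorAll<HTMLElement>("*"))) {
            const style = getComputedStyle(el);
            const scrolls = style.overflowY === "auto" || style.overflowY === "scroll";
            const viewportSized = Math.abs(el.clientHeight - window.innerHeight) < 2;
            if (scrolls && viewportSized && el.scrollHeight > el.clientHeight) return el;
          }
          return null;
        }

        console.log(findScrollWrapper() ? "a wrapper element handles scrolling" : "the document scrolls normally");
        ```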

      2. 4

        Perhaps a bit ironic, but I feel the same way about most of the modern JavaScript around the web, as well as the mandatory https.

        For example, it’s not possible to view Lobsters on slightly outdated browsers / platforms, because it now requires not just https, but also TLSv1.2 specifically. Why must I have TLSv1.2 in order to view tech news?

        Kill https before it kills the web.

        1. 10

          Not supporting TLSv1.2 entails being more than “slightly” outdated. All major browsers have supported it for more than three years now; and while this is an unfortunate condemnation of the current state of the web, using a three-year-old browser release is tantamount to negligence. They are enormously complex edifices of software with serious vulnerabilities discovered all the time, whose sole purpose is to consume external, untrusted content.
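
          If you want to check what a given host will actually negotiate, here is a minimal sketch using Node’s built-in tls module; lobste.rs is only an example host, and a modern OpenSSL build may refuse the older protocol versions on the client side regardless.

          ```typescript
          // Probe which TLS protocol versions a server will (or won't) negotiate.
          // Assumes Node.js >= 12; the host below is just an example.
          import * as tls from "tls";

          const HOST = "lobste.rs";

          function probe(version: tls.SecureVersion): Promise<string> {
            return new Promise((resolve) => {
              const socket = tls.connect(
                { host: HOST, port: 443, servername: HOST, minVersion: version, maxVersion: version },
                () => {
                  resolve(`${version}: accepted (negotiated ${socket.getProtocol()})`);
                  socket.end();
                }
              );
              socket.on("error", (err) => resolve(`${version}: rejected (${err.message})`));
            });
          }

          (async () => {
            for (const v of ["TLSv1", "TLSv1.1", "TLSv1.2"] as tls.SecureVersion[]) {
              console.log(await probe(v));
            }
          })();
          ```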

          1. 6

            I wouldn’t use a 3-year-old version of a full-fledged graphical web browser like Firefox or Chrome, but it’s not especially dangerous to be using a 3-year-old lynx. I believe the last exploitable vulnerability in lynx was in 2010 (there’s been one CVE assigned since then, but it’s around password handling, not something remote-exploitable). And lobste.rs renders fine in lynx.

            Now if you require https, then we’re talking about lynx+openssl, which has much more frequent vulnerabilities.

            1. 7

              Is anyone looking? Does lynx do the kind of testing that would catch vulnerabilities? Given the record of e.g. curl, I would assume lynx has exploitable vulnerabilities.

            2. 2

              “More than three years”? Are you kidding me?!

              Isn’t it ironic that out of the bare minimum stack of software required to view Lobsters, the TLS stack is most certainly the only part that requires patching on a monthly basis?

              And that, apart from TLS, the site would certainly be backwards compatible with browsers from something like 16 years ago, if not more?

              1. [Comment from banned user removed]

                1. 3

                  It’s actually the opposite. Since Lobsters requires TLSv1.2, and most of the rest of the internet does not, it’s Lobsters that has a prerequisite of having a modern browser.

                  1. [Comment from banned user removed]

            3. [Comment from banned user removed]

              1. 5

                Everyone’s got something to hide, but it doesn’t mean that public resources should be encrypted.

                There is little benefit from https on many types of web-sites (from most small-scale personal sites to the larger public record ones), especially compared to the many drawbacks that https does bring.

                1. [Comment from banned user removed]

                  1. 5

                    Why should public resources be encrypted?

                    I’m generally on board with better security, but needless complexity also makes us more dependent on untrusted institutions. HTTPS is complex and does not prevent surveillance of browsing habits.

                    1. 10

                      Because if you remove encryption from everything that doesn’t necessarily require it, using encryption becomes suspicious.

                      1. 1

                        Well end-to-end encryption of private communications already is suspicious, because too many people think HTTPS is enough.

                        Instead of using Tor or PGP, we focus on requiring HTTPS for public resources, which neither protects private information nor makes actually helpful tools less suspicious.

                      2. 2

                        As Carkez0r says, to make really secret!!!!!! encrypted traffic less suspicious.

                        And so that your government can’t censor or rewrite Wikipedia by man-in-the-middling you (or Google/Facebook/Twitter, or other websites).
                        And so that script kiddies or companies can’t spy on your local network traffic or coffee shop/whatever traffic as easily.
                        And this.

                        1. 1

                          And so that your government can’t censor or rewrite Wikipedia by man-in-the-middling you (or Google/Facebook/Twitter, or other websites).

                          They can still do that if they are able to compromise certificate authorities, so we can’t assume HTTPS will protect us from that.

                          And so that script kiddies or companies can’t spy on your local network traffic or coffee shop/whatever traffic as easily.

                          Is it really harder to see which websites you’re visiting if you are using HTTPS? Your connection to the website is still visible, isn’t it?

                          And this.

                          ???

                        2. [Comment from banned user removed]

                          1. 1

                            If I’m accessing a public resource, that part of my internet traffic which is hidden by HTTPS is precisely the information that’s already public. That’s my point.

                            The fact that I accessed that site is not a public resource, but that’s not hidden by HTTPS anyways. If we want that protection we should use Tor, which obviates the need for HTTPS (and is slowed down by HTTPS).

                      3. 1

                        Maybe you should revisit that last email: https://cr.yp.to/

                      4. 3

                        How does https help hide anything in this case? It transmits the hostname in plaintext, so anyone snooping on my connection knows I’m browsing lobsters, and can even track when and how often. And then the posts themselves are public, too.

                        I can see some advantages for https, but mainly in two circumstances: 1) transmitting private information, like connecting to a webmail client, where it keeps a snooper from getting the contents of the transmission, or 2) connecting to a very large, general-purpose site like Wikipedia, where leaking the bare fact that you’re loading Wikipedia is relatively low-information, and https successfully keeps a snooper from figuring out which articles you’re reading (and probably even editing, though they might be able to deanonymize editors if they log and correlate enough timestamps). But there are plenty of sites, like this one, where https fails to hide the only actually sensitive information: that I’m reading the site at all.

                        1. 9

                          How does https help hide anything in this case?

                          You’re posting on Lobsters as a logged-in user. Every request you make to the site sends your cookie. Do you want it sent in plaintext where anybody on the same open WiFi with you, and anybody who can see your packets en route, can steal it and impersonate you here? Do you think the site would be well-served by user accounts being hijackable?

                          Any site that has logins or cookies that it trusts at all needs HTTPS on every single request, and it needs the browser to know that it should never make a non-HTTPS request, even if redirected to do so by another web site (hence HSTS).
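
                          For the server side of that, a minimal sketch with Node’s built-in https module (the key/cert paths and the cookie value are placeholders, not anything this site actually uses):

                          ```typescript
                          // HSTS plus a cookie the browser will refuse to send over plaintext HTTP.
                          import * as https from "https";
                          import * as fs from "fs";

                          const server = https.createServer(
                            { key: fs.readFileSync("key.pem"), cert: fs.readFileSync("cert.pem") },
                            (req, res) => {
                              // HSTS: never make a plain-HTTP request to this host for the next year,
                              // even if some other site redirects the browser to one.
                              res.setHeader("Strict-Transport-Security", "max-age=31536000; includeSubDomains");
                              // Secure = never sent over plaintext; HttpOnly = invisible to page JavaScript.
                              res.setHeader("Set-Cookie", "session=placeholder; Secure; HttpOnly; SameSite=Lax");
                              res.end("ok\n");
                            }
                          );

                          server.listen(443);
                          ```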

                          1. 1

                            Do you want it sent in plaintext where anybody on the same open WiFi with you, and anybody who can see your packets en route, can steal it and impersonate you here?

                            I don’t agree with the assertion that this site should be viewable without TLS, but this is not a very good argument against it. Having someone who goes to the trouble of attacking the coffee shop WiFi and poisoning the DNS server or whatever be able to impersonate me would probably be pretty amusing.

                            No one is depending on the content I post to this site for anything critical.

                          2. 4

                            But there are plenty of sites, like this one, where https fails to hide the only actually sensitive information: that I’m reading the site at all.

                            I think it’s unfair to say that’s the only sensitive information. Encryption has hidden your username, the topic you’re discussing, what you’ve contributed to the discussion… In this instance, we’re discussing something innocuous so you probably don’t regard this as sensitive info, but what if the discussion were something subversive and you wanted to say something that could land you in hot water?

                            1. 4

                              Protection against ad injection ranks up there for me. I like to know I’m looking at the web page as it was mostly* intended to be viewed.

                              *: third party ads can pretty much do whatever, but that was the site’s decision, not an interloper’s.

                              1. [Comment from banned user removed]

                                1. [Comment removed by author]

                                  1. [Comment from banned user removed]

                                2. 1

                                  Huh? There’s as much variety in what’s on here as what’s on wikipedia. Probably more, since we occasionally have explicitly political discussion. There are certainly threads I would and wouldn’t want to be seen taking part in.

                                  1. 4

                                    There are certainly threads I would and wouldn’t want to be seen taking part in.

                                    If that’s true, I wouldn’t post here at all, since https doesn’t add much additional protection. Which threads your username participated in are completely public, and because https leaks domain information and this site has fairly low traffic (no more than a handful of active threads and a dozen or so active posters in any hour), it’s not hard at all for someone who can intercept your web traffic to figure out which lobste.rs user you are, and therefore all threads you’ve ever participated in.

                                    It does provide some other benefits (someone else mentioned hijacking login cookies), but I don’t trust it at all for privacy protection, and think it’s dangerous to spread the myth that it solves that problem.

                                    1. 2

                                      Not hard perhaps, but much harder than it is over HTTP. Most security measures come down to increasing cost to attackers rather than making attacks impossible, at least in practice.

                                      1. 2

                                        On the other hand the idea that HTTPS solves the problems revealed by e.g. Snowden is a myth that way too many people believe, and promoting HTTPS to solve said problems takes energy away from actual solutions.

                                        1. 2

                                          Again, it increases the cost. To achieve the same level of surveillance that’s virtually free in a non-HTTPS world, the NSA et al has to do a certain amount of custom work (just in terms of understanding that site’s model of user accounts and so on) for each medium-sized host it wants to do the correlation thing on.

                                3. [Comment removed by author]

                                  1. 1

                                    I can’t help thinking web developers put themselves in this position by basing their whole technology stack on insecure, stateless HTTP, and then hacking a big, complicated application development environment on top of it.

                                    The web technology stack sucks, and now end users suffer with stupid issues like these.

                                  2. 1

                                    I’m fine with that policy in general, but it is kind of sad that I can’t write a web browser or even a scraper from scratch any more with just sockets; I now need an SSL library too.
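
                                    To illustrate the difference in Node terms (example.com is just a placeholder host): plain HTTP still works with a raw socket and a hand-written request, while HTTPS needs the TLS library in between.

                                    ```typescript
                                    // "Just sockets" scraping vs. the same request over TLS.
                                    import * as net from "net";
                                    import * as tls from "tls";

                                    const REQUEST = "GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n";

                                    // Plain HTTP: a raw TCP socket is enough.
                                    const plain = net.connect(80, "example.com", () => plain.write(REQUEST));
                                    plain.on("data", (chunk) => process.stdout.write(chunk));

                                    // HTTPS: the socket now has to come from the TLS library.
                                    const secure = tls.connect(443, "example.com", { servername: "example.com" }, () =>
                                      secure.write(REQUEST)
                                    );
                                    secure.on("data", (chunk) => process.stdout.write(chunk));
                                    ```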

                                  3. [Comment from banned user removed]

                                    1. 20

                                      AMP and gmail have almost nothing in common as products. It’s possible to like or dislike each independently.

                                      1. 10

                                        For example, instead of saying “AMP sucks!” one could say “AMP sucks! HTTP/2 can improve your mobile speed”.

                                        So first, the real speed issue with most sites that embrace AMP is the content of the site, not the version of HTTP they use to transfer it.

                                        Secondly, Gruber specifically mentions that it’s possible to make fast-loading pages without succumbing to AMP, and references http://idlewords.com/amp_static.html.

                                        1. [Comment from banned user removed]

                                          1. 3

                                            Is there a study somewhere with statistics showing that

                                            Do you really need a study to show you that loading 5 MB of JavaScript, third-party tracking scripts, and ad content will make your webpage slow?
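
                                            As a back-of-envelope number (the link speeds are assumptions, not measurements):

                                            ```typescript
                                            // Transfer time alone for 5 MB of script, before any parsing or execution.
                                            const payloadBits = 5 * 1024 * 1024 * 8;
                                            const linksBps: Record<string, number> = { "slow 3G": 0.4e6, "3G": 1.6e6, "4G": 9e6 };

                                            for (const [name, bps] of Object.entries(linksBps)) {
                                              console.log(`${name}: ~${(payloadBits / bps).toFixed(0)} s of download time`);
                                            }
                                            ```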

                                            1. [Comment from banned user removed]

                                              1. 4

                                                Huh. Since when is JavaScript delivered to the browser “back-end” by any definition?

                                                Also, the transmission of the data is still a big issue regardless of the processing power to run it. Not everyone has a 50 Mbit connection.

                                                1. [Comment from banned user removed]

                                                  1. 2

                                                    JavaScript is arguably as much a part of “the content” as text and graphics.

                                        2. 1

                                          I thought the main draw was “improves your search ranking”, which is very hard to argue against/beat. So I thought that was what led to the fast uptake and would inevitably get most news/spam sites onto it, blah blah blah, the end :( For many companies, saving bandwidth probably sweetens the deal; they’re ignorant of the loss of control and, worse, the loss of actual visits.

                                        3. 1

                                          Scott was flat out wrong about one key thing: you don’t need to host your AMP with Google; that’s just an option available to publishers if they want to save on hosting and bandwidth costs. You can totally host your own AMP content and many publishers (e.g. Time, Bustle, etc.) do just that.

                                          1. 0

                                            so AMP is like putting a mini rocket on an elephant