1. 24

  2. 52

    Upvoted not because I think this is a good idea, but because I’m curious to get others’ opinions on it.

    This seems like a terrible, terrible idea. It’s yet another way of soft-forcing Google’s hegemony on the Web. Specifically:

    Badging is intended to identify when sites are authored in a way that makes them slow generally

    I’m pretty sure this actually means “badging is intended to identify when sites are authored in a way that makes them slow on Chrome…”

    And this isn’t like flagging a site that has an expired certificate or something. That is a legitimate security concern. This is making a value judgement on the content of the site itself, and making it seem like there’s something fundamentally wrong with the site if Google doesn’t like it.


    1. 21

      I’m with you.

      when sites are authored in a way that makes them slow on Chrome

      And further, “badging is intended to identify when sites are authored without using AMP” - or whatever else Google tries to force people into using.

      Seems like yet another way for Google to pretend to care whilst pushing their own agenda.

      1. 9

        They refer to two tools: one a Chrome extension (Lighthouse) that I didn’t bother to install, the other a website (PageSpeed Insights). I went for the latter to test a page that has no AMP or other “Google friendly” fluff and is otherwise quite lightweight (https://www.coreboot.org), and got a few recommendations on what to improve: compress some assets, set up caching policies, and defer some javascript.

        If that’s all they want, that seems fair to me.

        (Disclosure: I work on Chrome OS firmware, but I have no insight into what the browser-level folks are doing)
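
        For the curious, all three recommendations are server-side tweaks. A minimal sketch in Python (the handler name, page contents, and the one-day cache lifetime are my own choices, not anything PageSpeed prescribes):

```python
import gzip
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical page; the `defer` attribute keeps the script from
# blocking HTML parsing ("defer some javascript").
PAGE = (b"<!doctype html><html><head>"
        b'<script src="/app.js" defer></script>'
        b"</head><body>hello</body></html>")

def compress(body: bytes) -> bytes:
    """Gzip a response body ("compress some assets")."""
    return gzip.compress(body)

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = compress(PAGE)
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Encoding", "gzip")
        # "Set up caching policies": let clients reuse this for a day.
        self.send_header("Cache-Control", "public, max-age=86400")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To serve: HTTPServer(("localhost", 8000), Handler).serve_forever()
```

        In practice you’d let the web server or CDN handle the compression and cache headers rather than hand-rolling them, but the headers involved are the same.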

        1. 4

          Yeah, within 2 seconds of loading this link I was worried they were just pushing AMP. If they really are just pushing best practices then I’m cautiously optimistic about this change, and the fact that they didn’t mention AMP and instead linked to those tools gives me hope… but it’s Google, so who knows.

          1. 11

            I’m definitely still wary. I personally feel that Google sticking badges on websites they approve of is never going to end well, regardless of how scientific it may seem at the beginning.

            I really feel like there are major parallels to be drawn between Google and the rules of Animal Farm.

          2. 3

            But that’s not all they want. Google and Chrome are now positioning themselves to visually tell users whether or not a site is “slow” (according to whatever metrics Google wants, which can of course change over time). As with most Google things, it will probably look reasonable on the surface, but long term just result in Google having even more control over websites and what they can and can’t do.

            I would agree with you if it weren’t for Google’s long history of questionable decisions and abuses of their position as the (effective) gatekeeper of the web.

          3. 2

            whatever else Google tries to force people into using.


            1. 3

              Not to mention QUIC and SPDY being the basis of the “next” versions of HTTP (HTTP/3 and HTTP/2, respectively), which will no doubt be rabidly pushed for.

              1. 2

                Are there technical problems with HTTP/{2,3} or are you just worried because Google created them?

                1. 1

                  Specifically the fact that Google created them. As tinfoil-rambling as it sounds, they already operate a rather extensive spying/targeted-advertising network, a possibly-manipulable gateway to information, the most popular video service, the most popular browser, and some rather popular programming languages (Go, Dart); they power a large portion of Web services (Chrome’s V8 in Node.js; V8 plus Blink in Electron), with many alternative browsers being Chromium/Blink-based; then there’s the AMP kerfuffle, ReCAPTCHA, and maybe soon the protocols defining how their vision of the Web works.

                  They keep encompassing more and more parts of the Web, both technical and nontechnical, and people keep guzzling that down like the newest Slurm flavor. That’s what worries me the most.

                  1. 3

                    people keep guzzling that down like the newest Slurm

                    HTTP/2 and HTTP/3 are the result of a multi-party standardization process. They’re not just SPDY with a new label.

            2. 8

              Generally I’m against prejudging companies like this, but Google has earned it, and then some.

              Some sites that I use have Recaptcha. I use Firefox, and I can only pass the captcha if I log into Gmail first. Honestly, what kind of Orwellian horseshit is that?

              1. 4

                So, I can at least comprehend this one, even if I hate it.

                The whole point of recaptcha is to make it hard to pass unless you can prove you’re a real person. Being logged in to a google account which is actively used for ‘real things’ (and does not attempt too many captchas) is a really hard-to-forge signal.

              2. 1

                What’s an example of a site that loads fast in Chrome but slowly in Firefox?

                1. 8

                  Well, given what’s happening here…any site Google decides.

                  To be less pithy: this could be used as an embrace-extend-extinguish cycle. Sure, right now it’s all just general best-practices but what if later it’s “well, this site isn’t as fast as it could be because it’s not using Google’s extensions that only work in Chrome that would make it 0.49% faster, so we’ll flag it.”

                  I’m not saying Google is definitely going to do that, but…I don’t like them making this sort of determination for me.

                  It gives soft pressure to conform to Google’s “standards” whatever they may be. No website is going to want to have a big popup before it loads saying “This site is slow!” so they’ll do whatever it takes to have that not happen, including neglecting support for non-Chrome browsers.

                  1. 4

                    I don’t know if it’s still the case, but at one point YouTube deployed a site redesign that was based on a draft standard that Chrome implemented and Firefox did not (Firefox only supported the final, approved standard). As a result, the page loaded quickly in Chrome, but on Firefox it downloaded an additional JS polyfill, making the page noticeably slower to render.

                    1. 1

                      How about Slack video calls? Those are a “loads never” in my book. (Still annoyed about that.)

                  2. 20

                    Before I clicked, I thought to myself, “what is Google going to sneak in this time?”

                    Our long-term goal is to define badging for high-quality experiences, which may include signals beyond just speed.

                    Ah, there it is. Badges for the Google-approved, and warning badges for the rest.

                    1. 7

                      This certainly goes too far – there should be a concept of content neutrality, or experience neutrality: browsers are a conduit and, more importantly, the interface we use to interact with a growing number of services. There should be ethical guidelines around designing browser features, as they offer a growing surface for abuse.

                      1. 7

                        I see what you’re saying, and I wish we still used the old term “User Agent”, since they are meant to be agents for the user, not for the browser vendors.

                        1. 2

                          Ha, that’s a great way to put it!

                          1. 1

                            It is telling that the User-Agent string is basically a giant amalgamation of brands now :)

                      2. 14

                        Users can tell if a site is slow by using it. If the user experience is insufficient to notify the user, then the site is fast enough.

                        Even assuming that the initial authors of this proposal were well meaning, it’s a bad idea. As soon as the badging exists, some project manager is going to try to latch on to it as a way of promoting their internal projects, and there’s going to be a ton of pressure to abuse it, regardless of how well meaning the initial idea is.

                        This is not a game I want browser vendors to get into.

                        1. 5

                          Users can tell if a site is slow by using it.

                          Yep, my thought exactly. But that just adds to the idea that Google doesn’t really want to play Captain Obvious for Chrome users, but rather wants to control Web discovery even more.

                        2. 3

                          This is a bad idea in itself, but let’s be clear: a lot of these ecosystem-impacting changes are questionable because Google has a massive conflict of interest between Chrome and their business.

                          The ad-blocking changes, the forced Chrome login, this, the attempt to kill the URL, and others are just facets of that underlying conflict of interest.

                          1. 2

                            The linked PageSpeed Insights tool is pretty comprehensive, though I’d like to see a heavier penalty for total page size. I do wonder whether it would be a bigger win for users, and a much simpler implementation, if they just badged any site that attempts to use web notifications.

                            1. 1

                              Funnily enough, it doesn’t seem to load without JavaScript. I guess that makes it faster?

                              1. 1

                                That doesn’t appear to be true.

                              2. 1

                                Firefox is pretty good again, for the record. Chrome isn’t as “mandatory” as it used to be.

                                1. 1

                                   The slowest things on the web are typically ads, trackers, and other third-party trash. Let’s assume Google is not being evil (obviously they are, but let’s pretend). Is shaming users of those third-party things really going to make a difference? Wouldn’t it be better to go after the big wasters (like Google and Facebook) directly? The article talks about giving users “transparency” - will it put the blame where it belongs, or just label the top-level site?

                                  Just for lulz, I went to the pagespeed insights thing and punched in google.com

                                  Score: 91

                                   The Chrome User Experience Report does not have sufficient real-world speed data for this page.

                                   All pages served from this origin have a Moderate speed compared to other pages in the Chrome User Experience Report over the last 30 days.

                                   Reduce the impact of third-party code: third-party code blocked the main thread for 270 ms.

                                   lol. All of my websites got a score of 100! Probably helps that I don’t use Google APIs.
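
                                   If anyone wants to script that instead of using the web UI, PageSpeed Insights also has a JSON API (the v5 runPagespeed endpoint; the response path below matches what I saw, but treat the exact shape as an assumption):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Public PageSpeed Insights API endpoint (v5). Anonymous use is
# rate-limited; heavier use needs an API key.
PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def performance_score(result: dict) -> int:
    """Pull the 0-100 performance score out of a PSI response.

    Lighthouse reports the category score as a 0.0-1.0 float.
    """
    score = result["lighthouseResult"]["categories"]["performance"]["score"]
    return round(score * 100)

def check(url: str) -> int:
    """Run PSI against `url` and return its performance score."""
    with urlopen(f"{PSI}?{urlencode({'url': url})}") as resp:
        return performance_score(json.load(resp))
```

                                   Handy if you want to track your own scores over time rather than trusting whatever badge Chrome decides to show.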

                                  1. 1

                                     One wonders how they intend to handle the implied ping to GOOG servers on each page load to look up its performance.

                                    1. 1

                                      There might be a way to do it similarly to how Safe Browsing v4 is implemented: a device-local cache combined with a lookup service for exceptions. https://blog.trailofbits.com/2019/10/30/how-safe-browsing-fails-to-protect-user-privacy/ is the best description I’ve found (even though it’s critical of the technique).
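
                                       A rough sketch of that v4-style pattern (pure illustration with made-up names; real Safe Browsing canonicalizes URLs and expands them into several host/path combinations before hashing, which I skip here):

```python
import hashlib

def url_prefix(url: str, n: int = 4) -> bytes:
    """First n bytes of the SHA-256 of a URL (real Safe Browsing
    hashes canonicalized URL expressions, not the raw URL)."""
    return hashlib.sha256(url.encode()).digest()[:n]

class LocalPrefixList:
    """Device-local set of hash prefixes. Most lookups miss and never
    leave the machine; only a prefix hit triggers a full-hash request
    to the server - which still can't tell exactly which URL matched,
    since many URLs share the same 4-byte prefix."""

    def __init__(self, listed_urls):
        self.prefixes = {url_prefix(u) for u in listed_urls}

    def needs_server_lookup(self, url: str) -> bool:
        # No prefix match means definitely not listed: no network hit.
        return url_prefix(url) in self.prefixes
```

                                       The same local-cache-plus-exception-lookup shape could plausibly carry per-site speed verdicts without a per-page ping, though nothing in the announcement says they would do it that way.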