1.
  1.  

  2.

    Safari and Firefox would never have implemented WebP if broken Chrome-only websites hadn’t forced them to. In both cases the motivation was purely compatibility.

    WebP isn’t as good as advertised: it is stuck with the old VP8 codec. The codec family became successful only after the substantial upgrade to VP9, which WebP never got. So WebP doesn’t have HDR, it is ill-suited for wide gamut because it is limited to 8-bit color depth, and it is stuck with half-resolution color (mandatory 4:2:0 chroma subsampling).

    I wish non-Google browsers had held out a bit longer, so we could have skipped WebP entirely and jumped straight to JPEG XL or AVIF. The newer codecs compress far better than WebP and don’t have its quality limitations.

    1.

      It didn’t do everything, but it was still reasonably good at what it did. If nobody had ever gotten on board with WebP, the egress expense to my company from five years of shipping JPEG instead of WebP would have been… quite noticeable.

      1.

        It would have been 15% larger if you’d used MozJPEG at comparable quality.

        In all lossy codecs, file size grows exponentially with quality, so it’s easy to mistake data loss caused by conversion for a gain in compression. The same loss, which is responsible for the majority of “wow, it’s 3 times smaller!” results, can be achieved by recompressing at a lower quality setting without changing the format.
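
        As a rough illustration (my sketch, not the parent’s): re-encode the same JPEG at a few quality settings through macOS’s Image I/O framework and watch the size curve. The input path and quality points are hypothetical.

        ```swift
        import Foundation
        import ImageIO

        // Hypothetical input; any large JPEG will do.
        let input = URL(fileURLWithPath: "/tmp/photo.jpg") as CFURL
        guard let source = CGImageSourceCreateWithURL(input, nil),
              let image = CGImageSourceCreateImageAtIndex(source, 0, nil) else {
            fatalError("could not decode input")
        }

        // Size grows steeply near the top of the quality range, so a small
        // quality drop shrinks the file a lot without switching formats.
        for quality in [0.5, 0.7, 0.8, 0.9, 0.95] {
            let data = NSMutableData()
            guard let dest = CGImageDestinationCreateWithData(
                data as CFMutableData, "public.jpeg" as CFString, 1, nil) else { continue }
            let options = [kCGImageDestinationLossyCompressionQuality: quality] as CFDictionary
            CGImageDestinationAddImage(dest, image, options)
            if CGImageDestinationFinalize(dest) {
                print("quality \(quality): \(data.length) bytes")
            }
        }
        ```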

        1.

          We’re seeing JPEG anywhere between 15% and 30% larger, depending on the content, at settings that make us happy (and yes, we use mozjpeg). Even if you take the low end of 15%, the difference adds up to a noticeable amount when you serve >1 Gbit/s of images.
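
          To put a rough (hypothetical) number on that, here’s the back-of-envelope arithmetic; the $0.05/GB egress rate is my assumption, not a real quote:

          ```swift
          // Sustained 1 Gbit/s of image traffic, at the low-end 15% saving.
          let gbPerMonth = 1.0 / 8.0 * 86_400.0 * 30.0  // Gbit/s -> GB/month = 324,000
          let savedGB    = gbPerMonth * 0.15            // 48,600 GB/month saved
          let dollars    = savedGB * 0.05               // assumed $0.05/GB -> $2,430/month
          print("\(Int(savedGB)) GB/month saved ≈ $\(Int(dollars))/month")
          ```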

        2.

          It also has widespread hardware acceleration, which is especially important for low-end devices.

          1.

            Browsers don’t use hardware acceleration for WebP even on devices that have hardware for accelerating VP8, because that hardware is designed for decoding a single long-lived stream rather than many individual images in parallel. The setup costs and the singleton bottleneck make it a performance loss for browsers.

      2.

        While the content of this article is interesting, it reads like product placement for a podcast that is basically produced by Google (or maybe it’s just trying to optimise the PageRank of coywolf.com).

        This is a lot of text just to say “well, basically, Safari delegates image/video decoding to the OS, and macOS and iOS don’t support AVIF, while WebKit has supported it for a long time now.”

        1.

          They also have ampproject scripts on the main page here.

          1.

            Well, I’m sorry. I don’t have any connection with those companies.

            1.

              I never claimed you did. I think the link is interesting; I’m just sad about how it is written.

              1.

                I understand. It is indeed sad that most informative articles these days seem to have some commercial background.

          2.

            So, in short: the browser uses operating system libraries to do image decoding, like almost every other application on macOS.
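
            For the curious, this is roughly what “using the OS libraries” looks like from an app’s point of view: a sketch (hypothetical file path) that decodes through Image I/O, the shared framework Safari sits on top of. If the OS has no decoder for a format, the call simply fails:

            ```swift
            import Foundation
            import ImageIO

            // Decode via the OS-provided Image I/O stack, as most Mac apps do.
            let url = URL(fileURLWithPath: "/tmp/example.avif") as CFURL  // hypothetical
            if let source = CGImageSourceCreateWithURL(url, nil),
               let image = CGImageSourceCreateImageAtIndex(source, 0, nil) {
                print("decoded \(image.width)x\(image.height)")
            } else {
                print("no OS decoder for this format on this release")
            }
            ```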

            1.

              A weird choice by Apple, I think, to handle images differently. I wonder what this means for the future, as new formats appear, and whether Safari will really start lagging behind.

              1.

                It’s not that weird; it makes it easier for them to implement the codecs in one dylib that’s shared by all applications (saving RAM), and that dylib can use whatever hardware-specific machinery each device has to encode and decode the bits without exposing those implementation details to the world.

                1.

                  Indeed. Image codecs are an attack surface (a few jailbreaks came courtesy of the TIFF decoder), so it’s better to have fewer, better-tested copies.

                  1.

                    And applications can use it, too. In TenFourFox we used the OS X AltiVec-accelerated built-in JPEG decoder to get faster JPEGs “for free.”

                  2.

                    From a user perspective I think it would be weirder if they didn’t do this: “oh, you can view this image of type X in Safari but not in Preview.app, because the decoder is statically linked into the former; but Preview.app can render type Y quickly because it leverages the core OS codec dylib, while Safari doesn’t include a decoder for that one, or only has some ultra-slow, battery-eating software decoder someone contributed to WebKit”.

                    Doing it in one shared set of libraries for everything means that support is consistent, it’s easier to audit the attack surface across the board (which Apple already struggles with, so I’d hardly encourage them to increase that surface), and optimizations only need to happen in exactly one place to leverage GPU features or custom IC blocks on their mobile SoCs. For a mobile browser you really want as few pure-software decoders as you can get away with, for battery-life reasons (more so for video than stills, but formats like HEIF are starting to be reasonably heavyweight to decode without hardware support on image-heavy pages).

                  3.

                    For some reason, the owner of the website has banned my IP (it is static, and I’ve never visited this website before?). Could anybody write a TL;DR in the comments? Thanks!

                    1.

                      TL;DR: WebKit implements an AVIF decoder, which is available in GtkWebKitView. But Safari doesn’t use it; it uses WebKit just for the rendering and relies on the OS’s decoders. Since macOS and iOS don’t implement an AVIF decoder, Safari doesn’t support it.
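
                      You can even see this from code: Image I/O will list every format the OS itself can decode. A sketch (the “public.avif” identifier is the one newer OS releases register; treat it as an assumption):

                      ```swift
                      import Foundation
                      import ImageIO

                      // Every image type the OS decoder stack supports. On releases
                      // without an AVIF decoder, "public.avif" is missing, which is
                      // why Safari can't show AVIF despite WebKit having a decoder.
                      let types = CGImageSourceCopyTypeIdentifiers() as NSArray as? [String] ?? []
                      print(types.contains("public.avif") ? "AVIF: supported" : "AVIF: not supported")
                      ```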