It’s very interesting that Apple’s thumbing its nose at Google by including in Safari the JPEG XL format which Google removed from Chromium.
Firefox might be convinced to add it now that WebKit has, as their position up till now was “neutral”.
Firefox’s more detailed position has been along the lines of “we’ve added it behind a flag in nightly but we’re not rushing to roll it out because it’s not clear that it’s a winner and it kinda sucks that no memory-safe decoder exists yet”. Hm, I wonder what Apple is doing for the decoder – probably just libjxl sandboxed like other codecs?
[Comment removed by author]
It was removed from Chromium, where it sat behind a flag with a flagged-feature deadline that was extended a couple of times without any activity in between. So where’s the ecosystem enthusiasm?
Had Chromium simply enabled the feature, people would be screaming bloody murder about how Google unilaterally defines what enters “the web platform”, and how it was only done to complicate matters for new implementations by piling feature upon feature with the sole purpose of building a moat.
JPEG XL was a collaboration between a lot of different parties to be something without royalties, efficient, open source, and able to losslessly recompress the beloved JPEG. In asking for this flag to be added back you had team members from GIMP and Krita fighting alongside Adobe and Instagram; .jxl is closer to universally liked than hated or lukewarm. Compare that to WebP, which was Google’s creation and was foisted upon the net while having serious flaws vs. JPEG.

Adding that flag back does nothing to bring JXL to the web. Implementing JXL, e.g. with polyfills, would have, but that didn’t happen. So in the end, “put the flag back” is nothing but “do busywork for no benefit at all.”
Polyfills would be massively inefficient for decoding images compared to a baked-in C library, especially if there were no effort to do any of the decoding on the GPU. Because of the inefficiency of shipping a decoder and doing that work in the page, no one would ever use it; it’s cheaper, faster, and simpler to use image formats with native support. AFAIK the libjxl library is pretty stable and fairly optimal; all that was needed was to test it out, without a flag, in the likes of beta/canary for native support. Firefox should be doing the same with beta/dev.
They would prove ecosystem interest in a way some random “I’d totally use it, any way, just you wait - and now bring back the feature hidden behind a flag!” comment on a vaguely known website never could, though.
The idea of polyfills is to provide features that browsers don’t support natively yet, in the hope that the polyfill will become obsolete in due time.
Loading extra megabytes of polyfills is an unacceptable cost; it would only be worth it if JXL were ridiculously more efficient than anything else. It’s also really hard, maybe impossible, to fully support JXL properly with polyfills due to the limitations of web browser APIs.
When page speed affects your bottom line and SEO, and drawing a lot on canvas adds its own sluggishness, it becomes a balancing act over these pesky outside factors and not just the merits of the technology. A lot of big companies have chimed in with their own tests (Shopify, et al.) saying it would benefit them where WebP/AVIF have missed the mark (like in photography), but they haven’t shipped because the polyfill adds too much slowdown.
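To make the polyfill cost above concrete, here is a rough sketch of what a client-side JXL shim would have to do per image. It is purely illustrative: decodeJxlToRgba stands in for a hypothetical WASM build of a decoder, the data-jxl-src attribute is an invented convention, and the probe data URI would need a real tiny .jxl sample.

```typescript
// Sketch only: decodeJxlToRgba stands in for a hypothetical WASM build of a
// JXL decoder; no such published API is assumed to exist.
declare function decodeJxlToRgba(
  bytes: Uint8Array
): Promise<{ width: number; height: number; pixels: Uint8ClampedArray }>;

// Feature-detect native JXL support by asking the browser to decode a tiny
// image. The data URI is a placeholder, not a real JXL payload.
async function supportsJxl(): Promise<boolean> {
  const probe = new Image();
  probe.src = "data:image/jxl;base64,PLACEHOLDER";
  try {
    await probe.decode(); // resolves only if the browser can decode JXL
    return true;
  } catch {
    return false;
  }
}

// Without native support, every image has to be fetched, decoded in WASM on
// the CPU, repainted onto a canvas, and swapped in as a blob URL.
async function polyfillJxl(): Promise<void> {
  if (await supportsJxl()) return;
  const images = document.querySelectorAll<HTMLImageElement>("img[data-jxl-src]");
  for (const img of images) {
    const response = await fetch(img.dataset.jxlSrc!);
    const bytes = new Uint8Array(await response.arrayBuffer());
    const { width, height, pixels } = await decodeJxlToRgba(bytes);
    const canvas = document.createElement("canvas");
    canvas.width = width;
    canvas.height = height;
    canvas.getContext("2d")!.putImageData(new ImageData(pixels, width, height), 0, 0);
    canvas.toBlob((blob) => {
      if (blob) img.src = URL.createObjectURL(blob);
    });
  }
}
```

All of that work (fetch, WASM decode, canvas repaint) happens on the CPU for every image, and it only covers <img> elements; CSS backgrounds and favicons stay out of reach, which is the API-limitation point above.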
Personally, I have been adding it to <picture> elements (no polyfill) to try to show support. My browser, Librewolf, can decode it.

I wonder if that’s just for the web, or will they eventually move away from HEIC? Getting rid of HEIC would be a huge win for royalty-free codecs.
I don’t think they’re going to go away from HEIC. It is the “raw” format of the iPhone camera now. The new Safari also added it as a supported image format.
Well, DNG is the raw format of the iPhone camera ;-) HEIC is arguably the “native” format, though, yes.
I think quite a few cameras are using it now too.
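For reference, the “adding it to <picture> elements (no polyfill)” approach mentioned a few comments up looks roughly like the markup below; the file names are illustrative, and browsers that can’t decode JXL just fall through to the next source. It is wrapped in TypeScript only to keep these sketches in one language.

```typescript
// Illustrative markup: list the .jxl source first so browsers that support it
// pick it up, while everything else falls through to WebP and finally JPEG.
const pictureMarkup = `
  <picture>
    <source srcset="photo.jxl" type="image/jxl">
    <source srcset="photo.webp" type="image/webp">
    <img src="photo.jpg" alt="Same photo, JPEG fallback">
  </picture>`;

document.body.insertAdjacentHTML("beforeend", pictureMarkup);
```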
It’s weird, but kinda smart: what if the OS sandboxed Electron for you?
By that I mean: rather than spend all the memory overhead of running multiple (effectively) browser processes, we just admit a browser-process subsystem as a first-class citizen.
I know my inner curmudgeon feels like this is an outrage, but it’s already what I do for myself, keeping most of these as tabs (and Slack et al shame me for it every time, because I refuse to “install the app”)
Sure, they have their proprietary browsers with their quirks, but would it be much worse than what exists?
Yes, I know it’s not a new idea (webOS, anyone?) but maybe with some actual backing this could be real.
Yes, I’m also aware that satire eventually becomes reality (https://www.destroyallsoftware.com/talks/the-birth-and-death-of-javascript)
There are already a number of Mac apps that will create tiny little wrapper apps presenting a single website. The wrappers are tiny because they use the built-in WebKit framework instead of inlining an entire Chrome browser.
I think this is a great idea and I hope the Safari implementation fixes some of the problems — e.g. I hope it ensures that when you click a link to such a domain that it opens in the app, not in Safari.
like FirefoxOS? :-)
Succeeded by KaiOS (partially closed) & Capyloon. RIP FxOS for being ahead of its time; imagine an alternate timeline where grammar school students are required to purchase Fx Books for school instead of Google’s offering.
Wouldn’t that just be ChromeOS?
It’s adding a feature of ChromeOS to an OS that already has its own mature native-app ecosystem.
Features I’m excited to see:
Very curious to see if “web apps” can replace many of the electron apps I have lying around.
I used Safari for a period and really loved it; by far the fastest and best UI for the end user. Unfortunately I found the dev tools lacking, though I may just not have adjusted to the learning curve. Thinking about giving it another go.
The popover attribute will enable a whole new generation of blogs that whine about how they don’t use any JavaScript, so why should you.
Finally, excited to see web app support after years. I wish Lobsters were at least a minimal PWA with a web manifest file so it was friendlier on Android.
Get involved on GitHub. It worked for dark mode :)
Cool.
Not cool.
If only we had a distributed VCS that wasn’t tied to a single website, where you could have your own instance of a repo and somehow “push” or “pull” commits to other repos. Someday…
You can probably just email pushcx your patches.
You could also get involved to encourage moving the project someplace else.
Is there an ongoing effort?
Be the change you want to see.
It’s been exhausting as is, I don’t know if I have the mental effort for this case too right now. 😔