I can’t really get behind just ignoring headers because some engineer feels like they aren’t useful anymore.
He doesn’t just “feel like” it; he has a justified technical position, and I don’t see any counterarguments to any of his points.
The repeated use of “deprecation” without obvious links to the RFCs superseding those deprecations doesn’t help. Further, the entire point of the article is pretty clearly to help advertise Fastly (which presumably wants to go after some of Cloudflare’s market).
Like, it’s an interesting read, but I’m a bit concerned about people putting their services behind providers that sanctimoniously decide to break with RFCs because it might get them more business.
From the bit at the end it doesn’t sound like they’re doing anything to the headers by default. These are headers they recommend stripping, and there’s an example at the end of how to strip individual headers if you want to, but a site owner would have to actually do that for it to have any effect.
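For context, the opt-in mechanism they’re describing is Fastly’s custom VCL, where removing a response header is a one-line `unset`. A minimal sketch (the header name here is just an example, not something the article singles out):

```vcl
sub vcl_deliver {
  # Drop an individual response header before it reaches the client.
  unset resp.http.X-Powered-By;
}
```

Nothing happens unless the site owner actually deploys something like this, which is why “they’re breaking with RFCs” seems overstated.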
Yeah, I don’t really see the problem here.
Nobody’s forced to look at headers they’re not interested in, and the extras don’t hurt anything, except for using a bit of bandwidth.
This is a nice summary, thanks for sharing it. Combined with this tweet:
…I’m inclined to wonder how much time/bandwidth would be saved at larger sites if people cleaned these up, although I suspect that “size of HTTP headers” is not the worst bottleneck for most people.
I suspect the impact is minimal. It’s a few hundred bytes at worst, and most sites are probably affected far more by third-party adtech or unoptimized images.
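On the “few hundred bytes” point, here’s a rough back-of-the-envelope sketch. The header names and values below are made-up examples of commonly-stripped headers, not measurements from any real site, and the traffic figure is hypothetical:

```python
# Estimate the uncompressed HTTP/1.1 wire size of some commonly-stripped
# response headers. Values are illustrative, not from a real deployment.
headers = {
    "Server": "Apache/2.4.41 (Ubuntu)",
    "X-Powered-By": "PHP/7.4.3",
    "Via": "1.1 varnish",
    "X-Cache": "HIT",
    "X-Cache-Hits": "1",
    "P3P": 'CP="This is not a P3P policy"',
}

def wire_size(hdrs):
    # Each header goes on the wire as "Name: value\r\n"
    # (name + ": " separator + value + CRLF).
    return sum(len(name) + 2 + len(value) + 2 for name, value in hdrs.items())

per_response = wire_size(headers)
print(f"{per_response} bytes per response")

# Hypothetical scale: 100 million responses per day.
print(f"{per_response * 100_000_000 / 1e9:.1f} GB/day")
```

Even at that (made-up) volume it works out to a low double-digit number of GB per day, and HTTP/2’s header compression (HPACK) shrinks repeated headers further, so a single oversized image can easily dwarf it.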
Somewhat related, but even small changes to the request/response can have large impact on the bandwidth consumed.
From StatHat: “This change should remove about 17 terabytes of useless data from the internet pipes each month.”
Optimized images alone would most likely save a lot more, since there’s a lot more to save. A recent Google blog post loaded an 8 MB GIF to show a few-second animation in a 250x250 thumbnail. Two minutes in ffmpeg reduced that to about 800 KB.
Imagine if people did this on sites with more traffic than some random Google product announcement blog…
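For anyone curious, the conversion is a one-liner. An illustrative command (filenames are placeholders; the exact flags depend on the source clip):

```shell
# Convert an animated GIF to a much smaller H.264 MP4 sized for a thumbnail.
# scale=250:-2 keeps the aspect ratio with an even height (required by yuv420p);
# +faststart moves the index to the front so playback starts immediately.
ffmpeg -i animation.gif -vf "scale=250:-2" -pix_fmt yuv420p -movflags +faststart thumb.mp4
```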
We need some form of “strict” mode which turns these all onto sane settings by default.
XKCD #927 😜
I don’t think this is really relevant - it wouldn’t be a competing standard, or a standard at all really, just a baseline to start from that could vary from server to server.
This was posted on May 10, so why is it tagged historical?
I think they’re referring to the fact that some of the headers listed have historical explanations for their development/use/misuse.