How fast is fast and how fast is fast enough? Terrible internet causes me acute pain all the time while traveling, but I have no idea what numbers to shoot for when developing.
A good starting point is being faster than your competitors. Depending on your industry/market that may set the bar very low or extremely high though, so YMMV.
Generally, for sites my team works on, I set targets of at minimum Speed Index < 5000 on Slow 3G and < 3000 on Cable/DSL – using https://www.webpagetest.org/ to verify, and taking the median of 5+ runs.
If you want to test a site on Slow 3G on a slow Android phone to see what a painful experience would look like, you can use https://www.webpagetest.org/easy
You can also use the Audits panel in Google Chrome’s developer tools; it integrates Lighthouse (https://developers.google.com/web/tools/lighthouse/), which includes metrics like “Time to Interactive” and “First Contentful Paint” that are useful for determining what the experience will be for your users.
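If you run Lighthouse from its CLI instead (`lighthouse <url> --output=json`), the report is plain JSON, so you can pull those metrics out programmatically and track them over time. A minimal sketch in Python – the helper name is mine, and it assumes the standard report layout where each audit exposes a millisecond result under `numericValue`:

```python
import json


def extract_metrics(report_json: str) -> dict:
    """Pull a couple of user-centric metrics out of a Lighthouse JSON report.

    Assumes the standard report layout: top-level "audits" object, with each
    audit's result in milliseconds under "numericValue".
    """
    audits = json.loads(report_json)["audits"]
    return {
        "first-contentful-paint": audits["first-contentful-paint"]["numericValue"],
        "interactive": audits["interactive"]["numericValue"],
    }


# Example with a stubbed-down report (real reports carry far more fields):
sample = json.dumps({
    "audits": {
        "first-contentful-paint": {"numericValue": 1840.5},
        "interactive": {"numericValue": 5120.0},
    }
})
print(extract_metrics(sample))
```

From there it is easy to fail a CI build when, say, Time to Interactive creeps past your budget.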
Good god, this still seems absolutely insane to me.
Would you blindly run a binary served automatically from any of these vendors on your production web server? No? Then don’t inject their code into your webpage! (or at least, not without a checksum)
This is an important point – when adding a 3rd party to your site, it’s critical to evaluate the quality of their service, because a bad or compromised script can severely harm your site.
Ultimately the control does lie with the site owner though – once you’ve identified problems, you can remove a tag, generally with minimal effort.
You can remove the tag, but said tag may have already done irreparable damage: what if it steals user sessions, or clicks certain buttons (such as delete) while a user is logged in? It could destroy user data.
You’re absolutely right - as part of reviewing, you should be checking that the tag is from a legitimate, trustworthy company.
My personal opinion is that it’s highly unlikely a commercial company would do something so malicious as to delete data on purpose.
Of course it could happen by accident, but other than avoiding all 3rd party scripts, which is inherently impractical for most significantly-sized sites, there is nothing you can do to avoid that.
The best method I’ve seen is to use scripts from vendors who won’t change them suddenly; then you can make use of Subresource Integrity.
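For reference, the `integrity` value in an SRI-protected `<script>` tag is just a base64-encoded hash of the file’s bytes, prefixed with the algorithm name. A quick sketch of generating one in Python (the helper name is mine, not from any library):

```python
import base64
import hashlib


def sri_hash(content: bytes, algo: str = "sha384") -> str:
    """Build a Subresource Integrity value (e.g. "sha384-...") for a file's bytes."""
    digest = hashlib.new(algo, content).digest()
    return f"{algo}-{base64.b64encode(digest).decode('ascii')}"


# Paste the result into the tag, e.g.:
#   <script src="https://vendor.example/widget.js"
#           integrity="sha384-..." crossorigin="anonymous"></script>
print(sri_hash(b'console.log("hello");'))
```

If the vendor ever ships different bytes, the hash no longer matches and the browser refuses to execute the script – which is exactly why this only works with vendors who version their files instead of updating them in place.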
I’m not too worried about companies acting maliciously; they already make their money by gathering user data. I’m more worried about someone attacking the server and causing malicious scripts to be sent.
Aside from my usual work, I’m writing a blog post about how to handle multi-lingual ecommerce sites (catering for switching both languages and currencies) without causing yourself HTTP caching or SEO headaches.
The focus is mainly on the high-level implementation; the example builds such a solution in Rails, but it should be lightweight enough to replicate in other languages and frameworks.