1. 34
  1. 10

    These improvements are not likely to even be noticeable to curl or libcurl users. I still consider these optimizations worthwhile because why not do things as fast as you can if you are going to do them anyway. As long as there’s no particular penalty involved.

    If only everyone had this opinion…

    1. 20

      They do! It’s called dependencies. For example, this base64 encoder has been in development for over 6 years and has extensive benchmarks and fuzzing. It’s actually scary how much effort can go into such a simple task, but there you have it, free to use!

      1. 5

        I meant the attitude of optimizing even small parts of a project generally, rather than just having a fast base64 algorithm.

        1. 9

          Not everyone has the luxury of being paid to work on a project like Daniel does. Time is a limited resource. Why spend it on something that is mostly fine?

          1. 3

            Faster code can have large benefits for many users. Imagine if Cloudflare used curl internally and handled a lot of base64-encoded data for every request they processed - not a particularly unreasonable assumption to make. A change like this could reduce their power consumption globally. It’s a contrived example, but I’m sure there are many cases where small optimisations can have a measurable impact on the world - HTTP/2 or 3 could probably be considered good examples of this, where the reduction in resources needed by intermediate routers etc. might have a clear impact on the internet’s energy usage as a whole.

            I also think the code that Daniel has ended up with here is clearer and easier to maintain, which benefits anyone needing to audit it.

            1. 7

              Well then Cloudflare should pay someone to make it faster and contribute it back. I don’t think any OSS developer owes big for-profit organizations anything.

    2. 6

      The scale of this change is fascinating. At 29x faster encoding and 4.5x faster decoding for something done maybe once per HTTP request, the performance difference is imperceptible to the individual user. Still, across humanity, given the ubiquity of curl, the universe is spending appreciably fewer CPU cycles and wall clock time doing base64 operations. I wonder if it could be measured in years per day.
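
      For anyone curious what the hot loop in question looks like, the classic table-driven encoder packs three input bytes into a 24-bit word and emits four characters via lookups. This is a minimal Python sketch of the general technique, not curl’s actual C implementation:

      ```python
      # Standard base64 alphabet (RFC 4648)
      BASE64_TABLE = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"

      def b64_encode(data: bytes) -> str:
          out = []
          i = 0
          # Main loop: three bytes -> one 24-bit word -> four table lookups.
          while i + 3 <= len(data):
              n = (data[i] << 16) | (data[i + 1] << 8) | data[i + 2]
              out.append(BASE64_TABLE[(n >> 18) & 63])
              out.append(BASE64_TABLE[(n >> 12) & 63])
              out.append(BASE64_TABLE[(n >> 6) & 63])
              out.append(BASE64_TABLE[n & 63])
              i += 3
          # Tail: one or two leftover bytes get '=' padding.
          rem = len(data) - i
          if rem == 1:
              n = data[i] << 16
              out.append(BASE64_TABLE[(n >> 18) & 63])
              out.append(BASE64_TABLE[(n >> 12) & 63])
              out.append("==")
          elif rem == 2:
              n = (data[i] << 16) | (data[i + 1] << 8)
              out.append(BASE64_TABLE[(n >> 18) & 63])
              out.append(BASE64_TABLE[(n >> 12) & 63])
              out.append(BASE64_TABLE[(n >> 6) & 63])
              out.append("=")
          return "".join(out)
      ```

      The point of the table is that each output character is a single indexed read rather than a search or computation per character, which is where this kind of speedup tends to come from.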

      1. 3

          Is it even used once per request? I can’t think of a common use of base64 in HTTP

        1. 5

          Off the top of my head, it’s just used for Basic auth. After another check, it seems the Content-MD5 header is also base64-encoded. Not exactly common (and it looks like curl doesn’t typically send a Content-MD5 on POST or PUT requests).
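
          To make the Basic auth case concrete: the header is just `base64("user:password")` per RFC 7617. A quick sketch in Python (the credentials here are made up for illustration):

          ```python
          import base64

          def basic_auth_header(user: str, password: str) -> str:
              # HTTP Basic auth (RFC 7617): base64-encode "user:password".
              token = base64.b64encode(f"{user}:{password}".encode()).decode()
              return f"Authorization: Basic {token}"

          print(basic_auth_header("alice", "secret"))
          # -> Authorization: Basic YWxpY2U6c2VjcmV0
          ```

          So the encoding runs at most once per request, and only on requests that carry credentials.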

      2. 3

        It would be nice to see benchmarks on this improvement.