> Since you presumably cache things because they cost CPU, database reads, or money, doesn’t it make sense to lock while caching?

Caching is almost entirely about reducing latency, and a lock does not help with that; it probably hurts. Things sometimes break, and when they do, a lock can hold up every read of that cached value for the full lock timeout (32 seconds by default). On top of that, it serializes all access to a particular cached value, which defeats the point of caching.
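The non-blocking alternative implied here can be sketched in plain Ruby. This is illustrative only, not the Rails cache API: when an entry expires, serve the stale value immediately and let at most one background thread recompute, so readers never wait on a lock for the recompute. The class and method names are made up for the example.

```ruby
require "monitor"

# Minimal sketch (plain Ruby, hypothetical class): serve stale values
# instead of blocking readers behind a lock while one thread recomputes.
class StaleWhileRevalidateCache
  Entry = Struct.new(:value, :expires_at)

  def initialize
    @store      = {}
    @refreshing = {}
    @lock       = Monitor.new # guards bookkeeping only, never the recompute
  end

  def fetch(key, ttl: 60)
    entry = @store[key]

    if entry.nil?
      # Cold miss: nothing to serve yet, so compute inline this one time.
      value = yield
      @lock.synchronize { @store[key] = Entry.new(value, Time.now + ttl) }
      return value
    end

    if Time.now > entry.expires_at
      i_refresh = @lock.synchronize do
        if @refreshing[key]
          false # someone else is already recomputing this key
        else
          @refreshing[key] = true
        end
      end
      if i_refresh
        Thread.new do
          value = yield # the expensive recompute happens off the read path
          @lock.synchronize do
            @store[key] = Entry.new(value, Time.now + ttl)
            @refreshing.delete(key)
          end
        end
      end
    end

    entry.value # possibly stale, but returned without waiting
  end
end
```

With this shape, a broken or slow backend only delays the single refresher thread; every other reader still gets an answer immediately. Rails offers a related stampede-avoidance knob (`race_condition_ttl` on cache `fetch`), but the structure above is a sketch, not that implementation.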

On a tangential note, I’ve never seen a language or framework as obsessed with caching as Ruby and RoR. Caching is certainly required at some point, but it seems one cannot deploy a Ruby app without a Redis these days. I just don’t see this with other languages, like Erlang or even Java. What bothers me is that Ruby and RoR and friends are often sold as easy-to-use tools, yet they come with this operational cost of caching. The community seems to have accepted this and is fine with it, but for someone like me it is a cost I don’t want to pay unless I really have to.