I wonder why Facebook couldn’t throttle the maximum throughput of requests to the same server. It seems like the civil thing to do. I would imagine there are not that many really big sites that enough people link to for image fetching to generate 1 Gbps of genuine traffic, and those could be put on some sort of whitelist.

Of course, I’m just throwing ideas at the wall here.
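For what it’s worth, the per-server throttling idea could be as simple as a token bucket keyed by destination host: each host gets a byte budget that refills at a fixed rate, and a fetch blocks until enough budget is available. This is just a minimal sketch of the idea — the class name, rates, and `fetch_image` helper are all made up for illustration, not anything Facebook actually does.

```python
import time
from collections import defaultdict

class HostThrottle:
    """Token bucket limiting bytes/sec sent to one destination host."""

    def __init__(self, rate_bytes_per_sec, burst_bytes):
        self.rate = rate_bytes_per_sec
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def acquire(self, nbytes):
        """Block until nbytes of budget is available, then spend it."""
        while True:
            now = time.monotonic()
            # Refill the bucket for the time elapsed, capped at capacity.
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= nbytes:
                self.tokens -= nbytes
                return
            # Sleep just long enough for the deficit to refill.
            time.sleep((nbytes - self.tokens) / self.rate)

# One throttle per origin host; a whitelisted big site could simply
# get a higher rate (or no throttle at all). 512 KB/s is arbitrary.
DEFAULT_RATE = 512 * 1024
throttles = defaultdict(lambda: HostThrottle(DEFAULT_RATE, 1024 * 1024))

def fetch_image(host, nbytes):
    """Hypothetical fetch wrapper: wait for budget, then request."""
    throttles[host].acquire(nbytes)
    # ... perform the actual HTTP request here ...
```

The nice property of a token bucket is that small sites still get bursts up to `burst_bytes` served instantly, while sustained crawling of one host converges to the configured rate.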