1. 12
  1.  

  2. 1

    Neat! I hadn’t heard of HAMTs before, even though I now see they’re used as the built-in hash-map implementation in a number of programming languages.

    1. 1

      Wow… I’m actually surprised at how slow a lot of things are. Interesting.

      1. 4

        Big-O time complexity really doesn’t paint a full picture of how data structures and algorithms perform in production software.

        Details like cache lines, data locality, branch prediction and boxed terms make for very surprising benchmarks, where implementations with worse Big-O complexity run faster than implementations with better Big-O complexity.

        The HAMT-based maps in Erlang/Elixir were definitely welcomed when they were introduced, but plenty of software was written in Erlang and run in production for years, without O(lg n) or O(1) random access into sets of KV pairs.
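        To make the HAMT idea concrete, here is a minimal, mutable sketch of a Hash Array Mapped Trie in Python. It is illustrative only: the names (`Node`, `BITS`, `insert`, `lookup`) are my own, real implementations (including Erlang’s) are persistent rather than in-place, and handling of full 64-bit hash collisions is omitted for brevity.

```python
BITS = 5               # consume 5 hash bits per level -> 32-way branching
MASK = (1 << BITS) - 1

class Node:
    """Internal node: a 32-bit bitmap plus a dense list of children."""
    __slots__ = ("bitmap", "children")
    def __init__(self):
        self.bitmap = 0
        self.children = []   # entries are (key, value) leaves or Nodes

def _index(bitmap, bit):
    # Position of `bit`'s child in the dense array: popcount of lower bits.
    return bin(bitmap & (bit - 1)).count("1")

def insert(node, key, value, shift=0):
    frag = (hash(key) >> shift) & MASK
    bit = 1 << frag
    i = _index(node.bitmap, bit)
    if not (node.bitmap & bit):            # empty slot: store a leaf
        node.bitmap |= bit
        node.children.insert(i, (key, value))
        return
    child = node.children[i]
    if isinstance(child, Node):            # occupied by a subtree: descend
        insert(child, key, value, shift + BITS)
    elif child[0] == key:                  # same key: replace the value
        node.children[i] = (key, value)
    else:                                  # partial hash collision:
        sub = Node()                       # push both leaves down one level
        insert(sub, child[0], child[1], shift + BITS)
        insert(sub, key, value, shift + BITS)
        node.children[i] = sub

def lookup(node, key, shift=0):
    frag = (hash(key) >> shift) & MASK
    bit = 1 << frag
    if not (node.bitmap & bit):
        return None
    child = node.children[_index(node.bitmap, bit)]
    if isinstance(child, Node):
        return lookup(child, key, shift + BITS)
    return child[1] if child[0] == key else None
```

        The depth is bounded by the hash width divided by `BITS`, which is where the O(lg n)-with-a-small-constant lookup comes from; the bitmap-plus-dense-array trick keeps sparse nodes compact.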

        1. 2

          Which of the data structures do you think are “slow”?

          List performance is as expected for singly-linked lists, tuples as for fixed-sized arrays, etc.
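          For readers less familiar with why that is, here is a toy cons-cell list in Python mirroring the cost model of a singly-linked list like Erlang’s (`cons`/`nth` are illustrative names, not any real API): prepending is O(1) because the tail is shared, while reaching the n-th element means walking n cells.

```python
nil = None

def cons(head, tail):
    return (head, tail)        # a two-field cell, like [H | T]

def nth(lst, n):
    while n > 0:               # must walk n cells to reach the element: O(n)
        lst = lst[1]
        n -= 1
    return lst[0]

xs = cons(1, cons(2, cons(3, nil)))
ys = cons(0, xs)               # O(1): one new cell; xs is shared, not copied
```

          A tuple is the opposite trade-off: `t[i]` is O(1) array indexing, but “updating” one element means copying the whole fixed-size block.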

          1. 1

            It’s like you shouldn’t use ‘slow’ software to write performance-sensitive things like globally distributed chat software… Or something.

            1. 1

              I should be clear, I love Elixir. I’m not disparaging it. I was surprised by the time complexity of some things, is all – and I’m aware it doesn’t really translate to much IRL.

            2. 1

              It is not unusual for a sorting algorithm to switch between different algorithms depending on properties of the input… Python does it.

              In practice, algorithms with super-linear complexity can perform very well on small inputs.
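              A toy hybrid sort illustrates the idea (in the spirit of Timsort or introsort, but not the actual CPython algorithm; the cutoff value and names here are made up): below a small threshold, insertion sort’s O(n²) worst case is cheaper in practice than merge sort’s recursion and allocation overhead.

```python
CUTOFF = 16   # illustrative; real libraries tune this empirically

def insertion_sort(a, lo, hi):
    """Sort a[lo:hi] in place; quadratic, but very low overhead."""
    for i in range(lo + 1, hi):
        x = a[i]
        j = i - 1
        while j >= lo and a[j] > x:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = x

def hybrid_sort(a, lo=0, hi=None):
    """Merge sort that hands small subranges to insertion sort."""
    if hi is None:
        hi = len(a)
    if hi - lo <= CUTOFF:          # small input: switch algorithms
        insertion_sort(a, lo, hi)
        return
    mid = (lo + hi) // 2
    hybrid_sort(a, lo, mid)
    hybrid_sort(a, mid, hi)
    merged = []                    # merge the two sorted halves
    i, j = lo, mid
    while i < mid and j < hi:
        if a[i] <= a[j]:
            merged.append(a[i]); i += 1
        else:
            merged.append(a[j]); j += 1
    merged.extend(a[i:mid])
    merged.extend(a[j:hi])
    a[lo:hi] = merged
```

              The asymptotic bound is still O(n log n), but the constant factors on small slices come from the branch-friendly, cache-friendly inner loop of insertion sort rather than from recursion.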