1. 3

  2. 4

    O(log₃₂(n)) ~= O(1)

    That is not a legitimate statement in big-O notation! O describes the asymptotic bound on an operation, and log₃₂(n) as n goes to infinity is certainly not bounded by any constant.

    T(log₃₂(n)) ~= T(C) for “reasonable” values of n. Sure. For n near the size of modern memory, log₃₂(n) is under 10 and grows slowly indeed. That is correct, and a valuable performance observation to make. But it is a statement that must be made with the T notation. Abusing big-O notation to express it is flatly incorrect, because the claim is about the exact time on a bounded domain, not the asymptotic algorithmic time.
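
    The arithmetic behind that observation can be checked in a few lines of Python (a sketch; the values of n are illustrative):

    ```python
    import math

    # log base 32 of n for "reasonable" n: even at n = 2^48
    # (roughly the addressable memory of current machines),
    # the value stays under 10 -- but it still grows without bound.
    for bits in (20, 30, 40, 48, 64):
        n = 2 ** bits
        print(f"n = 2^{bits}: log32(n) = {math.log(n, 32):.1f}")
    ```

    For n = 2^48 this prints a depth of 9.6, which is why the constant-time approximation feels right in practice while remaining wrong as an asymptotic claim.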

    /rant

    1. 2

      Semi-rant/observation: Big-O notation is, these days, (ab)used for all kinds of runtime ‘efficiency’ statements. Not that I agree, but at least it’s a start that “normal developers” (i.e. those without formal CS/math training) are thinking in these kinds of terms.

      edit:

      I was disappointed with the actual content of the article; I was expecting something more… concrete.

    2. 1

      I don’t see anything of value here; perhaps it would be better with the video.