Threads for kamranahmedse

  1. 2

    Record a video on Transport Protocols (TCP vs UDP) for my YouTube channel https://youtube.com/theroadmap and continue working on the redesign of https://roadmap.sh

    1. 1

      I love the presentation for your DevOps roadmap. Do you have any plans to add a roadmap for people who want to get into systems programming?

    1. 2

      I am working on a productivity/tasks/notes tracking application. I have tried everything out there but none of them are versatile enough. I am going to scratch my own itch and will see if I am able to scratch it for others too.

      1. 4

        Not to be the guy who suggests the solution I’ve found to be a panacea for all the world’s ills, but have you checked out Org-Mode in Emacs? It’s incredibly flexible, in both the good and bad ways.

        1. 1

          Have you tried just using git issues? The cli

          1. 1

            Do you mean https://github.com/dspinellis/git-issue or https://github.com/stephencelis/ghi? CLI workflows tend to work well for me, but I never thought to look up CLI issue managers.

            1. 1

              Yes. I just push issues to a remote server (in my apt). Then I use jrnl for notes. I can find stuff by tag. I’ll even reference jrnl tags in my git-issues. I use it for everything, not just coding projects.

              Edit: I use dspinellis’s. I also version my jrnls and push them to a remote server.

        1. 8

          Nice visuals!

          A couple of remarks though:

          1. It is a bit weird to have notations such as O(3) = 3. Big-O notation is about asymptotic behavior, so it kind of only makes sense with a symbolic n.
          2. You do not mention at all the fact that these bounds are only up to a constant, and only for n greater than a given threshold, which for example means that matrix-multiplication algorithms with the best big-O are not the most efficient ones in practice. I guess this is out of a desire to keep it simple, but I’m afraid it could be misleading to some readers.
          3. Apart from the fact that your log is in fact n/10, as iswrong said, you also have a typo in your symbolic notation: log_n should be log_10(n).
          4. Speaking of complexity for problems whose input is an integer can be ambiguous. Strictly speaking, the “size” of the input is the log2 of the integer n, so for example, if you loop n times to print something, some may nitpick and say that you have an exponential complexity (see the small sketch after this list).
          5. Your plots of 2^n and log(n) seem a bit off; are they actual plots or just hand drawn? Log in particular seems very flat, but maybe it’s just the way the plot is scaled.
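
          To make point 4 concrete, here is a tiny Python sketch (purely illustrative, not taken from the image or the article): the loop performs n iterations, which is linear in the value of n but exponential in the input’s bit length, roughly log2(n).

          ```python
          def print_n_times(n: int) -> int:
              """Loop n times and return how many iterations ran."""
              count = 0
              for _ in range(n):
                  count += 1
              return count

          for n in (10, 1_000, 1_000_000):
              bits = n.bit_length()          # input "size" in bits, roughly log2(n)
              iterations = print_n_times(n)  # linear in the value n ...
              # ... but exponential in the bit length: iterations is about 2^bits
              print(f"n={n:>9}  bits={bits:>2}  iterations={iterations:>9}")
          ```

          Such algorithms are sometimes called pseudo-polynomial: polynomial in the numeric value of the input, but exponential in the size of its encoding.
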
          1. 1

            Thanks, Arthur, for the feedback. I mostly kept it as simple as possible, both to fit it in an image and to keep it easy to understand. I agree with all the points that you have mentioned and will address them in the associated article. Thank you!

          1. 9

            If you’re the author, don’t you have a non-Twitter link?

            1. 3

              Not at the moment - I just made it and tweeted about it. However, I plan on putting it under “X in One Picture” series on https://roadmap.sh

              1. 2

                I posted it elsewhere, for posterity, or something:

                https://i.postimg.cc/k9m1CkpF/bigO.jpg

                1. 2

                  I also wondered whether tweets are endorsed here…

                  1. 4

                    I think most of the backlash against Twitter is that it’s a poor format for reading long-form writing, the content is usually light or unsubstantiated (rumors), or it’s just a redirect to a better primary source.

                    For the most part, Twitter links do pretty well on Lobste.rs: https://lobste.rs/domain/twitter.com

                1. 3

                  The logarithmic complexity examples on the right are wrong. If T(10) = 1, T(20) = 2, T(30) = 3, T(40) = 4, etc., the running time grows linearly with the input size, since if the input size doubles, the running time doubles. In logarithmic time (e.g. log base 10 for simplicity), you’d expect T(10) = 1, T(100) = 2, T(1000) = 3, etc.

                  More generally (though I think this is a difference between big-O in theory and how it is used in practice): big-O provides an upper bound on the growth rate of a function. E.g. an algorithm that runs in linear time is also O(n^2). Of course, proving an O(n) upper bound (if possible) is more useful than proving an O(n^2) upper bound.
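
                  A quick illustrative sketch (my own toy example, not the author’s): counting basic steps for a linear scan versus repeated halving, as in binary search, shows the difference described above - multiplying n by 10 multiplies the linear count by 10 but only adds a few steps to the logarithmic one.

                  ```python
                  def linear_steps(n: int) -> int:
                      """Steps for a scan that inspects every element: grows like n."""
                      steps = 0
                      for _ in range(n):
                          steps += 1
                      return steps

                  def halving_steps(n: int) -> int:
                      """Halvings until the range is empty, as in binary search: grows like log2(n)."""
                      steps = 0
                      while n > 0:
                          n //= 2
                          steps += 1
                      return steps

                  for n in (10, 100, 1_000, 10_000):
                      print(f"n={n:>6}  linear={linear_steps(n):>6}  logarithmic~{halving_steps(n):>3}")
                  ```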

                  1. 1

                    Ah, good catch. I will fix it, thank you!

                  1. 1

                    I used to love doing UML and Z notation early in my career and at university, and I still use UML for my personal work or when I have to provide formal/concrete design documents. But I have never seen them formally used or enforced anywhere I have worked - probably because at startups you have to iterate fast. Pen and paper or whiteboards with rough sketches are mostly the way to go.