1. 5

    Pointing out the same type of error multiple times can be helpful. At least, I am not offended by a reviewer doing that. I treat it as a checklist, confirming that I haven’t forgotten anything. You can learn to give AND receive feedback in a constructive way.

    Of course, humans are different and my opinion isn’t better than the author’s. That said, I don’t think that labeling such practices as “toxic” encourages an open & calm discussion.

    (I am not a native speaker but toxic seems like a really strong negative label.)

    1. 3

      I’ve tried to use Darktable (and RawTherapee) a few times without success. Both have a tremendous number of features, but compared to Lightroom they lack a simple UI. I wish there were an option for a single panel with a reasonable set of settings. The number of features is too much for a casual photographer like me. BTW, I loved how the old Google Picasa worked - Darktable/RawTherapee could take some inspiration from that tool.

      1. 2

        Oh, I can fully recommend giving it one afternoon with some video tutorials. After that, you should be comfortable with the basic functionality. If I remember correctly, you can customize the interface to show only the panels that you like.

        I switched to Lightroom a while ago, mostly because my most powerful machine was a Mac with Mac OS. I had trouble finding things in Lightroom for a while and thought that Darktable was organized more logically.

        1. 1

          I’ve used Darktable for many years now and I find the UI to be really great. It is precisely what you need without being too complicated. It is geared towards professional use. Spend a couple of hours with Darktable and you’ll be right at home. It is easy to configure a set of features and only use those.

        1. 16

          I have fought against this myself. It’s hard. What I found really helpful was to decouple my self-worth from my job. Programming skill has nothing to do with people’s inherent value; almost nothing in programming is a moral decision (except being willing to say “this is a job I will not do”). Personally, I have found it easier to find this in philosophical, theological, and moral texts than in self-help books; tastes vary.

          If you are interested, I recommend these books:

          • The Miracle of Mindfulness by Thich Nhat Hanh
          • Xunzi (translated into English by Hutton)
          • Meditations by Marcus Aurelius
          • The Confessions by St. Augustine
          • The Seven-Storey Mountain by Thomas Merton

          Several of them are “religious” (Mindfulness is Zen Buddhist and Augustine and Merton are both Catholic), but there is a common thread of self-critique and examination that runs through them that I found really valuable.

          1. 8

            What I found really helpful was to decouple my self-worth from my job

            I will go one further, and say what really works for me (and what I’m constantly having to practice) is to decouple my self-worth from my own intelligence or talent, and to go on to admit that whatever I happen to be suited to, there’s very little credit I can take for it. As a programmer, I am entirely reliant on prostheses: documentation, yes, but also unit tests, type systems, and mathematics—these are all useful to me, and everyone else, specifically because they help fill in the gaps where my reasoning ability (supposedly the thing I am proud of as a programmer) is deficient.

            Nevertheless my mind does rush to judgment constantly; it’s been fine-tuned to always find a way to set myself apart from whoever I’m looking at. At the end of the day, when it comes to this profession, pretty much every single one of us would benefit from approaching it with a huge degree of humility.

            1. 11

              Nevertheless my mind does rush to judgment constantly

              Programming encourages this, because code has to be right. Pointing out mistakes is something we frequently have to do as a result.

              Most of ‘the real world’ doesn’t need the same kind of correctness. Businesses run on approximation and best efforts.

              This is (to my mind) the most significant ‘Déformation professionnelle’ of the programmer. In the rest of society, maintaining a relationship is (often) more important than pointing out a mistake.

              1. 4

                …every single one of us would benefit from approaching it with a huge degree of humility.

                This is the key point, I think. Humility helps you see where you’ve made a mistake, where you can improve, and where you might be entering an area of weakness. It also helps you relate to your coworkers and colleagues. If you can come to recognize things that don’t really matter very much as opinions, rather than as matters of Objective Truth, then a lot of friction is removed.

                1. 4

                  Humility for me is really hard when met by arrogance. Arrogance provokes arrogance in myself.

                  1. 2

                    That is very true. It’s easy to get offended and act arrogant/negative in return. This is probably my primary failure mode! Still, it’s just something to recognize and work on.

              2. 2

                Thanks for your comment, especially the small sentence “it’s hard.”

                I struggle with the concept of decoupling my self-worth from work. Sometimes I think that this is the right path. Then I cannot see how something on which I have spent so much deliberate time & energy should be irrelevant to defining my self. I am currently digging into meditation; maybe your recommendations provide further guidance. Thanks.

                1. 4

                  Consider these ideas.

                  1. Imagine that there is an economic down-turn and you are not able to keep your job. You are forced to work, to make ends meet, as a cook in a restaurant.
                  2. Imagine that you were struck by a car while crossing the street. You have a head injury and, while you are able to walk and talk, are never able to work as a programmer again.

                  In either case, should your self-worth be damaged? I would say no. How we treat our personal obligations is what defines us as people, not our work. If you are meeting your personal obligations as best you can in the circumstances—treating the people around you well, taking care of your children, generally making the world better—then you’re doing fine.

                  I really like Xunzi for this, because he sets out his goals plainly.

                  The gentleman is the opposite of the petty man. If the gentleman is great-hearted [confident] then he reveres Heaven and follows the Way [i.e. follows social rituals and educates others]. If he is small-hearted [shy] then he cautiously adheres to yi [moral standards] and regulates himself. If he is smart, then with enlightened comprehension he acts according to the proper categories of things. If he is unlearned, then with scrupulous honesty he follows the proper model. If he is heeded, then he is reverent and reserved. If he is disregarded, then he is respectful and controlled. If he is happy then he is harmonious and well-ordered. If he is troubled, then he is calm and well-ordered. If he is successful, then he is refined and enlightened. If he is unsuccessful, then he is restrained and circumspect.

                  In short, there is a way to be a “gentleman” (or “sage”) in every circumstance. Of course, nobody is perfect, so it’s better to be seen as a “process” than an end point.

              1. 27

                I first learned programming when I was 13. My dad showed me visual basic and I was hooked. I got really excited until some experts told me VB wasn’t “real” programming and I had to learn C++ if I wanted to be a “real” programmer. So I learned C++, which was such an awful experience it made me hate programming until I learned PHP eight years later. Sure, PHP is a bad language, but I could finally explore ideas in CS instead of fighting with memory management to do even the most basic things.

                I’ve had friends who started learning C for the same reasons: not because they actually wanted to know C, but because toxic experts told them they had to in order to count as “real” programmers. I’ve argued with people who said you’re a bad programmer if you use Python 3 because it has a higher startup time. I’ve interviewed at places that used “Do you know the tortoise-hare algorithm?” as a first-order filter. We’re awfully good at finding ways of separating ourselves from the unwashed masses.

                For some reason it’s always what you already know that’s used as a gatekeeper. Nobody ever seems to think that you need to know assembly in order to be a “real” programmer.

                Of course none of this applies if the gatekeeping knowledge is actually relevant to the task at hand. It’s totally fine to filter on “do you know cycle detection algorithms” if you’re regularly encountering cycles.

                1. 16

                  I remember feeling pretty hurt when I discovered Dijkstra’s quote:

                  It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

                  given that BASIC had been my first language. Of course in context I realize he was being glib, but as a kid I was genuinely worried that this inadvertent choice had “mentally mutilated” me.

                  I like Dijkstra’s writing but I do think there’s a macho real-programmers tradition that is inspired by that kind of thing.

                  1. 12

                    The thing that helped me with Dijkstra is realizing that he thought just about everyone was an idiot. Not saying I’d have enjoyed interacting with him, but it’s easier not to take anything personally after realizing that.

                    For example you mention the “macho real-programmers tradition”, but he thought lots (most?) of them were idiots too, since he came from more of the academic formal/theoretical/algorithmic approach to programming and thought of a lot of people in the “real industry programmers” kind of tradition as undereducated rubes.

                  2. 4

                    Just like you, I started programming very young: BASIC, then VB, then C++. Unlike you, I was fortunate that there were no experts around me, and I kept programming. I often look back and wonder if I was one of the asshole experts stultifying people’s desire to learn. If I was, I deeply apologize to anyone affected. Now I just realize I know jack shit.

                    1. 3

                      It is very hard for me to be non-judgemental because I am so attached to programming. My girlfriend learned PHP, and with it the typical 90s PHP problem patterns. I couldn’t stop myself from telling her how bad I think that is. It didn’t improve the situation at all; I knew that in advance, and I still couldn’t hold back.

                      Silly humans.

                    2. 3

                      There are languages that offer the ease of PHP or VB without the major design flaws.

                      1. 12

                        I think the key might be friendly guidance instead of gatekeeping and snobbery. Like, “hey, you know all this pain you’ve been having with keeping this code from breaking every time you make a change? I find some features in this language I really enjoy help a lot with this.” (although as a kid, the speed of basically every other language was the big motivation to move away from BASIC)

                        1. 7

                          I’ve had the good fortune to occasionally advise teenagers on their first steps in learning programming. I’ve found that they run into people in programming-language communities who are jerks to them almost immediately. I’m sure some of those people realize they’re talking to a child and are jerks anyway, and some do not realize.

                          I agree with you: What works is friendly guidance in the form of offering my own experience rather than criticizing somebody else’s. I also find that I need to give a lot of context on the fact that there are people who act in very toxic ways out there, and that it’s probably best to understand that as their problem rather than yours.

                          1. 3

                            This probably says more about me than anything else, but I ran into some people like that when I was ~14 and first getting on FidoNet, and for some reason it actually worked for me rather than pushing me away from programming? They were opinionated people who thought low-level, fairly old-school C was basically the only legitimate programming, and they told me to do X and Y if I wanted to be a “real programmer”. I think the fact that they seemed confident that they knew what they were doing, plus (this part’s fairly important) that they told me specifically what they thought I should be doing, with pointers to free resources, helped me cut through the decision paralysis of feeling that I couldn’t do anything because I didn’t know what I was “supposed” to do.

                    1. 7

                      I know this happens with every good word (see: synergy, hate, hack, …), but can we please stop labeling everyone whose attitude we don’t like “toxic”? It’s gone too far. There was a time when “toxic” meant “poisoning those around them”, which is close to the traditional meaning, but now it’s just any supercilious (that was the word you wanted) person being, err, supercilious (or pedantic and a bit rude) in blog comments. As usual, in the grab for hyperbole and eyeballs we’ve lost some more nuance in our language.

                      1. 11

                        I actually agree from a different angle. People aren’t toxic. Things that people do are toxic.

                        The reframing is important because so many people are both abused and abusers, in different contexts, and at different times. And applying a term like that to a person as a whole rhetorically denies the possibility they could ever change.

                        I don’t agree that using “toxic” to describe people’s attitudes is wrong in principle though. But yes there’s a line, and it’s only for serious stuff. It’s important not to dilute it.

                        1. 5

                          If the author had described what he disliked about pyon’s comment without using the word toxic, it might have been easier for pyon to change his/her behavior. I sympathize with the author, but in a sense he makes the same mistake by conflating a really good point with negative personal attributions.

                          1. 1

                            Point taken, yes.

                      1. 1

                        All I could think of while reading this was “wow, Java is or used to be really problematic.”

                        1. 3

                          Can you be more specific?

                          At least the optimization itself would be applicable to many programming languages: Setting a good initial size for a container.
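
                          As a rough illustration (a sketch in Rust rather than Java, with hypothetical names and sizes, not taken from the article): reserving the expected capacity up front means the container never reallocates while it is filled, which is the whole point of choosing a good initial size.

                            // Hypothetical example: pre-size the Vec so pushes never trigger a
                            // reallocation (the same idea as passing an initial capacity to
                            // Java's ArrayList or HashMap constructors).
                            fn collect_squares(n: usize) -> Vec<u64> {
                                let mut out = Vec::with_capacity(n); // one allocation up front
                                for i in 0..n as u64 {
                                    out.push(i * i); // stays within the reserved capacity
                                }
                                out
                            }

                            fn main() {
                                let v = collect_squares(1_000);
                                assert_eq!(v.len(), 1_000);
                            }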

                          1. 0

                            Amen. Makes me happy millennials killed it (along with its bastard companion XML).

                            1. 15

                              Java is alive and well. I have no idea how you come to the conclusion it was killed.

                              1. 0

                                Java is dead in the sense C++ is dead. Once dominant, now one of the languages used by an increasingly old guard. Of course there are still projects in Java, and even likely some people coding applets for old times’ sake.

                                But you can ignore Java at this point without handicapping your career.

                                1. 6

                                  I am working for start-ups in the Bay Area and I can tell you that Java is very much alive and well, and used for new things every day. Nobody writes GUI apps in it anymore, but on the back end it is widely popular.

                                  1. 3

                                    People do tons of new projects in C++ too. Still, nothing like its heyday in the mid-90s.

                                  2. 3

                                    But you can ignore Java at this point without handicapping your career.

                                    I agree with you, but I can’t think of a language that’s not true of. There are a lot of language ecosystems that don’t overlap much if at all - Java, Ruby, Python, .NET, Rust, Erlang…

                                    1. 3

                                      I think that if you don’t have some understanding of the level of reasoning that C works at, that can be a bit of a handicap, at least from a performance standpoint. Though that’s less a language thing than it is about being able to reason about bytes, pointers, and allocations when needed.

                                      1. 0

                                        That wasn’t true, say, 15 years ago. Back then, if you wanted to have professional mobility outside certain niches, you had to know Java.

                                        1. 2

                                          I’m going to respectfully disagree. 15 years ago, you had Java, and you had LAMP (where the “P” could be Perl, PHP, or Python), and you had the MS stack, and you still had a great deal of non-MS C. After all that, you had all the other stuff.

                                          Yes, Java may have been the biggest of those, but relegating “the MS stack” to “certain niches” perhaps forgets how dominant Windows was at the time. Yes, OSX was present, but it had just come out, and hadn’t made any significant inroads with developers yet. Yes, Linux was present, but “this is the year of Linux on the desktop” has been a decades-long running gag for a reason.

                                          1. 1

                                            MS stack was in practice still C++/MFC at the time, and past its heyday. The dotcom boom dethroned desktop, Windows and C++ and brought Java to prominence. By 2000, everyone and their dog were counting enterprise beans: C++ was still massive on Monster, but Java had a huge lead.

                                            Then Microsoft jumped ship to .NET and C++ has not recovered ever since. In the mid-90s you were so much more likely to land a job doing C++ vs plain C; now it’s the opposite.

                                            My karma shows I hurt a lot of feelings with my point, but sorry, guys, Java is in visible decline.

                                            1. 1

                                              Oh, my feelings weren’t hurt, and I don’t disagree that Java is in decline. I merely disagree with the assertion that, 15 years ago, you had to know Java or relegate yourself to niche work. I was in the industry at the time. My recollection is that the dotcom boom brought perl and php to prominence, rather than java.

                                              Remember that java’s big promise at the time was “run anywhere”. Yes, there were applets, and technically servlets, but the former were used mostly for toys, and the latter were barely used at all for a few years. Java was used to write desktop applications as much as anywhere else. And, you probably recall, it wasn’t very good at desktop applications.

                                              I worked in a “dotcom boom” company that used both perl and java (for different projects). It was part of a larger company that used primarily C++ (to write a custom webserver), and ColdFusion. The java work was almost universally considered a failed project due to performance and maintenance problems (it eventually recovered, but it took a long time). The perl side ended up getting more and more of the projects moving forward, particularly the ones with aggressive deadlines.

                                              Now, it may be that, by 15 years ago, perl was already in decline. And, yes, java took some of that market share. But python and ruby took more of it. A couple years later, Django and Rails both appeared, and new adoption of perl dropped drastically.

                                              Meanwhile, java soldiered along and became more and more prominent. But it was never able to shake those dynamic languages out of the web, and it was never able to make real inroads onto the desktop. It never became the lingua franca that it wanted to be.

                                              And now it’s in decline. A decline that’s been slowed by the appearance of other JVM languages, notably scala (and, to a lesser degree, clojure).

                                  3. 6

                                    Incidents of Java in my life have only increased as my career has progressed; I’m quite certain Java is far from dead, and we’re all the worse for it. I’ve even worked for “hip” millennial companies that have decided they needed to switch to Java.

                                    1. 5

                                      Java is still alive and kicking. We have a language that has proven itself to be good enough, with a rich ecosystem and different vendors having implemented their own JVMs. We’re all the worse for that because…?

                                1. 2

                                  First time that I have read about the plan to support build stages in Docker. I am very happy that they are finally making it easy to create small Docker images without unnecessary remnants from building your app.

                                  1. 6

                                    I like Rust a lot, and I’m familiar with more experimental approaches like effect system monads, but it really does seem like monad transformers, despite their shortcomings, are a practical and fairly elegant solution to these exact problems.

                                    If your code might fail (like opening a non-existent file), just throw an ExceptT in your transformer stack. If you’re interfacing with code that returns a Result (or Either) type, just write a function that lifts m (Either e a) to T m a where T is your transformer. Boom, you’ve just nuked all your unwraps, try!s, question marks, and other boilerplate.

                                    Am I falling victim to the blub paradox here? Are there some superior ways to address the kind of stuff the OP is talking about?

                                    1. 9

                                      Can you elaborate with an example?

                                      Personally, I don’t think that you need a lot of unnecessary error-handling code when using the ? operator. There is not much left to eliminate.
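
                                      To make that concrete, here is a minimal sketch (hypothetical file name, standard library only): every fallible call just gets a ? after it and the error is passed back to the caller, so there is no unwrap()/match boilerplate left over.

                                        use std::fs;
                                        use std::io;

                                        // `?` returns early with the io::Error if the read fails.
                                        fn read_config(path: &str) -> Result<String, io::Error> {
                                            let raw = fs::read_to_string(path)?; // fails if the file is missing
                                            Ok(raw.trim().to_string())
                                        }

                                        fn main() {
                                            match read_config("config.toml") {
                                                Ok(cfg) => println!("loaded {} bytes of config", cfg.len()),
                                                Err(e) => eprintln!("could not read config: {}", e),
                                            }
                                        }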

                                    1. 7

                                      As is typical for Rust beginners, the author struggles with Rust’s error handling. I love Rust’s error handling, but you need to know a lot about the language to use it right. The author also expects the compiler to guide him toward using it correctly, which it does not do.

                                      Somewhere else I saw the following suggestion for teaching error handling in increasing levels of sophistication (a sketch of the second level follows after the list):

                                      • unwrap/expect
                                      • Result<…,Box> with try! and ? (appropriate for the author’s goals)
                                      • error_chain library
                                      • manual implementation of errors
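
                                      As a rough sketch of that second level (hypothetical file name, standard library only, written with today’s dyn syntax rather than try!): with a Result<_, Box<dyn Error>> return type, ? converts each concrete error into the boxed error for you.

                                        use std::error::Error;
                                        use std::fs;

                                        // Box<dyn Error> accepts any concrete error type, so the io::Error
                                        // from read_to_string and the ParseIntError from parse() are both
                                        // converted automatically by `?`.
                                        fn sum_of_lines(path: &str) -> Result<i64, Box<dyn Error>> {
                                            let text = fs::read_to_string(path)?;
                                            let mut total = 0;
                                            for line in text.lines() {
                                                total += line.trim().parse::<i64>()?;
                                            }
                                            Ok(total)
                                        }

                                        fn main() {
                                            match sum_of_lines("numbers.txt") {
                                                Ok(n) => println!("sum = {}", n),
                                                Err(e) => eprintln!("error: {}", e),
                                            }
                                        }
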
                                      1. 5

                                        That’s basically what the current book does, sans error_chain. It just requires a lot of explanation if you are starting from first principles.

                                      1. 3

                                        Well, I have a few friends right now who are learning programming. I have been programming for roughly two decades now, and yet (or because of that) it is very hard to give them simple advice that helps and that they appreciate.

                                        At their current stage, the code they write is mostly consumed by themselves. So it has to work for them and nobody else. Their programs are only thousands of lines long.

                                        What paradigms would you suggest? They program mostly in JavaScript and VB (which I don’t know well).

                                        I tried things like:

                                        • avoiding global mutable state
                                        • composition over inheritance
                                        • not providing indirect dependencies to constructors
                                        • not exposing getters/setters for no reason
                                        • …

                                        Of course, this is based on the code that I saw from them. It is very hard to concentrate on some key things without overwhelming them.

                                        So even though SOLID might have faults, I really appreciate the idea of having a starting set of heuristics for design.

                                        1. 2

                                          When I was a Ruby newbie, I read Avdi Grimm’s “Confident Ruby”. It was built on SOLID and a few other practices. However, the book itself gives specific examples of a problem and how it can be solved.

                                          Unfortunately, I don’t know about any such resources for JS or VB, but you could definitely look for something that represents abstract principles (like SOLID) on concrete examples. At least for me, it was way easier to learn things this way.

                                        1. 9

                                          This is one of the most compelling features of magit, and what made me actually use it over the git cli.

                                          M-x magit-status launches a window that contains all your staged and unstaged code, and you can highlight individual lines of the diff to remove them from or add them to your staged changes. It also includes a stash UI.

                                          1. 6

                                            God yes, that and +/- to expand/shrink where the hunks start (be careful getting it too close to one line), and s/u to stage/unstage things (you can even do this if you highlight a stashed hunk, soooo nice).

                                            It really is the best git UI I’ve used.

                                            1. 5

                                              Funny you should mention magit. So, I’m learning Common Lisp, and I was using Atom + Slime. Then I got really annoyed by Atom (I’m sure I could have reconfigured it given some study, but the Slime integration started to get wonky too). I took the plunge and have started using emacs (which isn’t that bad, once I figured out what M-x meant :P). Then I read a thread on HN about Atlassian buying Trello (the comments are hilarious) and ran into magit, and in learning about magit I learned about hunks, which seem to me one of those things about git that should be advertised more prominently.

                                              1. 2

                                                magit is really well done… shows the power of emacs customization.

                                              2. 4

                                                Or use @andrewshadura’s git-crecord.

                                                I do this workflow all the time using mercurial, trying to keep my commits as atomic as possible.

                                                1. 4

                                                  I remember the first time I needed to stage split chunks of a file and thought “I wonder if I can just highlight these lines and hit s… I can!” Bravo magit devs.

                                                  1. 2

                                                    For years, magit was the only piece of emacs that I kept on using. It is great.

                                                  1. 3

                                                     I used CMake about 10 years ago. I remember being very confused at first, but then very happy after learning some basics.

                                                     It was great for allowing a Visual Studio setup while at the same time providing a convenient way to build the project on the CLI. Cross-platform AND cross-build-tool. And a much faster configure step than autotools.