1. 3

    Don’t have experience with formal verification specifically, but some general advice:

    1. Adjacent roles are good. Companies with more obscure technologies are often more willing to train, since they have no choice. And small startups will hire for much broader positions than the actual job posting, e.g. I was hired for backend skills but in practice I’m learning image processing as applied to gene sequencing.
    2. Learning and blogging is a fine thing too… if you have the time. I don’t, so I tend to go for adjacent jobs if I’m interested in a particular skillset (in this case, I wanted to learn scientific computing).

    If you don’t have the skills upfront, you also want a resume that stresses your ability to learn quickly.

    (Longer writeup here: https://codewithoutrules.com/2018/01/23/job-with-technology-you-dont-know/)

    1. 1

      Thanks! The tip for resumes is a good one. Sounds like you’re having fun learning in your new role.

    1. 21

      @itamarst it’s not really clear to me what a resume looks like to you. My resumes have never had paragraphs in them, although they might have some sort of opening sentence that sums up my ambition.

      I would argue that you aren’t really emphasizing the right thing. My resumes are target-specific but tend to be a list of work experience with the important things I did listed in there, like “Led a team solving X”, or “Designed interview structure” (with more filler words).

      It’s also unclear to me whether this blog post comes from your experience being a hiring manager for hundreds of candidates or if you are just pulling out of your butt what you think would make a good resume. This is another post which feels like you’re aiming for IT Guru status rather than dispensing useful evidence-based information.

      1. 2

        In case you too aspire to guruhood—

        Cons of being IT Guru:

        Pros of being IT Guru:

        • Get to share what I’ve learned over 22+ years of working as a programmer.
        1. 17

          Not trying to sound harsh, but this is a low-quality response. Setting aside the IT Guru stuff, I think @apy had some good implied questions.

          What does a resume look like to you? Does it contain paragraphs with narrative description like a cover letter? My resume and most I’ve seen in the U.S. have sections with lots of short bullet points, not paragraphs per se.

          Are you drawing this advice from your experience on the hiring manager side or the applicant side or both? I’ve found that the Software Clown style posts which use concrete examples are much better at conveying your message. Are there times you’ve had direct feedback on your resume? What about times you’ve passed on a candidate because of their resume?

          1. 2
            • Paragraphs or bullet points is beside the point. You can have bullet points too; it doesn’t matter. It’s the content and location that matters, so long as it’s professional-looking and readable.
            • This is based on a decade-plus of reading resumes and then interviewing people (as a developer, usually, not a manager) and seeing how much was missing from the resume.
            • I’ll see if I can add a concrete example.
            1. 8

              And most importantly, do you have any evidence that making the change you suggest actually affects one’s success in the hiring pipeline, rather than just being your preference for how resumes should look? I could write an opposing article that says the most important aspect of a resume is that it uses exactly 3 distinct colors; how is a reader to judge the quality of such advice?

              1. 1

                Yes, I would like to see some evidence as well. Basically, where I live, hiring managers will ask you about technologies, usually with a programmer/developer/team member of sorts with them, to evaluate you. It will also be frowned upon if you don’t at least list something in your resume, and if you have phrases like the ones described in other comments in this thread, devs won’t particularly like your resume. The last guy we hired had a simple LaTeX-based resume that was clean, listed his experiences, how he worked with others in the relevant projects, his known technologies, and working projects (github/git repo), and today he is invaluable to the team. (edited for grammar and another point)

                1. 3

                  Where are you from? Different places have different hiring cultures, yes, and what I’m writing is mostly focused on the US market. In the US, approaching your skills as a list of technologies is asking to be treated as a commodity: easily replaceable if someone comes along with the same list at lower pay. You can do it, but you don’t want to. If an interview involved a manager with a checklist of technologies I’d probably just walk out, because I’d be pretty sure the pay would be vastly lower than what I can get elsewhere. And if the developers were annoyed by a focus on solving problems, rather than churning out code, working with them would be unpleasant.

                  Notice, however, that your coworker mentioned how he worked with others, not just technologies.

                  Likewise, Apy’s resume starts by listing actual work he did, not a list of technologies.

                  1. 1

                    Yes, things are different here; I’m from South America. I guess the difference is that we rely more on the interviews that we perform. In my opinion anyone can write about how they did something, and they may or may not be lying, so we want to feel that out in the actual interview.

                    It seems to me that here we want to feel how you say things in an interview; we want to see you do it and explain how you’ve worked, which in my opinion makes it easy to spot who is trying to fake something. Obviously sometimes you get interviews in which the candidates were not who they said they were, and that’s disappointing, but when you get someone who knows what he’s doing, you feel it then and there by talking/coding/solving problems.

                    In my opinion, anyway, writing something in a resume is easy; actually talking/coding, or whatever process you might wanna choose to evaluate them, is the real deal.

      1. 7

        There are two sets of issues:

        1. The job requirements listed in most job postings are technologies, even though that isn’t really what the company is looking for.

        2. Most resumes omit huge amounts of relevant information, and again often overfocus on technologies.

        To become a senior developer you need some technical skills, but also the ability to work independently. That means being able to scope a project, know when to ask for help, prioritize, learn new technologies on your own, etc. Almost no one puts this in their job postings because they can’t quite articulate it; instead they put years of experience or a random list of technologies they use, conflating “knows this technology” with “will get started on this quickly/can operate independently”. Not the same thing at all.

        On the resume side, often it’s just “I did a thing!”. You also need to give context (why this thing was needed) and outcomes (why this thing was useful). And there’s also some stylistic stuff: no one reads the resume, they skim it at best.

        So you need to make really sure it shouts INDEPENDENT WORKER YES I CAN WORK INDEPENDENTLY at the top, and that it’s not buried 3/4 of the way down the page as an implied side-effect of a project scope that isn’t actually clear, because the person interviewing doesn’t work at your company and so has no idea how impressive the thing you did actually was.

        If you can share:

        1. The skills you’ve seen listed, where you’ve said “I can do this!”.
        2. Your resume.

        then I can probably give specific advice.

        (I write a little bit about this here - https://codewithoutrules.com/2017/07/16/which-programming-skills-are-in-demand/ - but I should probably do a “here’s how you write a resume” blog post, since I have Opinions.)

        1. 2

          Yeah, I subscribe to your website and really enjoy the articles :) I’ll PM you my resume info. I was hoping to learn others’ experiences in this thread, not ask for unsolicited career advice.

          1. 1

            BTW, from your blog post:

            Learn the problem solving skills that employers will always need. That means gathering requirements, coming up with efficient solutions, project management, and so on.

            Learning these skills is one thing. Demonstrating that you’ve learned them is another. Hiring managers don’t just want to see “project management” listed on your resume, they want to be sure that you can actually perform those skills (after all, their hiring decision is a multi-thousand-dollar bet on you, they want to be sure that their bet pays off). Could you speak to some techniques one could use to demonstrate these skills?

            1. 2

              I’ll try to write some more when I have time, but here’s a quick example at the resume level. Let’s say you’re applying to a job where you don’t know the technology stack. And sadly it’s a cold application, with no one to introduce you. You’re sure you can do the job, but you need to convince them. So:

              • You probably want to say “I can learn new technologies quickly” in the first paragraph or two of the resume, because otherwise they might just skim your list-o-technologies, miss the thing they think they need, and drop your resume in the trash.
              • You also want to give a concrete example.
              • You want to demonstrate that the skill had real business value.

              So e.g. you can have an opening paragraph or bullet list at the top of the front page that has a bit saying “I can learn new technologies quickly, as I did on a recent project where I fixed a production-critical bug on day one, even though I hadn’t worked in Java before.”

              1. 2

                As someone who is at a startup that’s currently hiring: when I’m skimming a resume, I’m looking for experience, not skill lists.

                You can say “Project management” on your resume. But it’s far better to say “Was project manager for project x, successfully handling cross-team portions y and z. Project x shipped early and under budget”, and show me that you’ve managed projects.

                If you want to convince me that you’re a senior developer, your resume should reflect that you’ve been doing the kind of work that people expect from a senior developer. Leading projects, mentoring more junior colleagues, shipping major features, etc.

            1. 30

              I’ve worked part-time for about six years of my career. I started it because I’d repeatedly burned out of full-time jobs. Working 3 days/week was great for me, far more rewarding than the added salary I passed on could have been. Aside from lower work anxiety, I had time to write two books, give three conference talks, get engaged, get married, take up several hobbies, and enjoy life thoroughly. My work has been overwhelmingly better: I stay out of rabbit holes, I recognize deep patterns, I prioritize ruthlessly, I deliver the things my users didn’t realize they needed. It’s not magic, it’s just downtime for my unconscious to noodle around with problems without pressure.

              I think working part time is a hugely valuable experience for anyone who doesn’t have a pressing need for dollars in the door (eg to pay off US medical bills or student loans). There are plenty of blogs out there on frugal living + investing (I recommend MrMoneyMustache and Bogleheads wiki), so developers can easily live comfortably and still save significantly towards retirement.

              1. 5

                I’m trying to pull back to working part-time as well. Unfortunately many companies seem to want full-time or nothing. I’ve switched over to consulting to give me more freedom, we’ll see how that goes. I’m taking around 1.5 months off from work right now, which is great. For the first few weeks it felt awkward to have no reason to do anything at any particular time, but after a while it’s become really pleasant.

                1. 12

                  About a year and a half ago I stopped working full-time, and it’s been really wonderful. I found I can work 2 months on a SF salary and live for a year in Berlin pretty comfortably. Sometimes I pick up more consulting work when I want structure, and sometimes I think about moving back to NYC where I would have to work a little more regularly, but I wouldn’t change anything about how I’ve spent my time up until now. I’ve been able to dive really deeply into a bunch of things I would never have had the time or energy to pursue if I were still a wageslave. The things I’ve built in my free time have also turned into tons of job opportunities, and I’ve stopped doing technical interviews now that people can just look at work I put on github and stuff. So, it can lead to lots of nice career things, too. I don’t want to stop engineering, but I am quite happy to live life outside of a shitty startup office a bit more.

                  Almost no jobs will appreciate it when you tell them you’d like to work less. But if you go into a new thing with clear expectations set, I’ve found it to be much easier.

                  1. 6

                    This is awesome! How do you go about getting consulting work - do you look for clients, or do they approach you? Did you have a ramp-up period before you felt comfortable that you’d have enough consulting work when you need it?

                    1. 3

                      I think most opportunities come my way because I genuinely enjoy talking to people about systems and engineering, and when I don’t have a full-time job I can spend more time communicating about those things. It’s networking, but for something that doesn’t feel gross to me. I am lucky to have this alignment between my personal interests and what businesses currently value. My current gig came from a comment I made here on lobste.rs ;)

                      A key to being comfortable is having enough runway that I know I’ll be OK for a while if I don’t find any work. This means being careful about burn rate. I consider potential purchases and recurring obligations in terms of how much life without work I’m giving up to have them. When my friends from work were increasing their rent to keep up with 30% of their salaries (or more), I was building the buffer that would keep me calm without work. They are worth a lot more money than me now, but I’ve been able to grow in ways that I’m extremely grateful for. Also, after quitting my last full-time job I went through a period of intentional “unlearning of engineer-in-a-fun-city spending habits”, which gave me a lot of peace of mind by tripling my runway.
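
                      The “tripling my runway” arithmetic can be sketched with made-up numbers (nothing here is from the actual budget, purely an illustration):

```python
# Runway in months = savings / monthly burn rate.
# All numbers are hypothetical, just to illustrate the point.
savings = 60_000           # example cash buffer, in dollars
city_burn = 6_000          # example "engineer-in-a-fun-city" monthly spend
lean_burn = city_burn / 3  # same savings, spending cut to a third

print(savings / city_burn)  # 10.0 months of runway
print(savings / lean_burn)  # 30.0 months: cutting burn to 1/3 triples runway
```

                      The point being that cutting spending multiplies time-without-work directly, without earning a single extra dollar.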

                      When I decided to stop working full-time, I didn’t know if it was going to just be a long break or a whole new arrangement. After getting over burnout from the SF startup I was at, I cold-emailed a company doing interesting work to me, and they enthusiastically agreed to a 10 hour/wk contract. That showed me that I might be able to keep it going.

                      When you pay 1/7 the rent, even a small trickle of engineering work feels like a geyser.

                      1. 1

                        Thanks, this is an excellent approach.

                  2. 3

                    Unfortunately many companies seem to want full-time or nothing. I’ve switched over to consulting to give me more freedom, we’ll see how that goes.

                    While this is true, as Mike points out in the interview it’s possible to convince some companies, some of the time, to hire you part-time anyway. It’s much more effort, and you need to be willing to push back much harder. But it can be done. Since it’s not the default, you really want to mention the part-time bit only after the company has committed to hiring you.

                1. 2

                  https://landinstitute.org/ are working on perennial grains (with some initial successes), which would massively reduce the ecological impact of grain farming.

                  1. 2

                    Book on networking: “Silence on the Wire” is a strange and wonderful book. Each chapter is half intro to some basic topic in networking, half essay on the passive surveillance opportunities implicit in the design. Starts with Ethernet and moves up the stack, if I remember correctly.

                    1. 20

                      Interestingly, you can replace the technical content of this post with almost anything related to good human behavior. That’s because at its core, the post is arguing against deontological ethics (the idea that whether something is good is determined by whether it adheres to a specific set of rules). However, I find the argument against the terms “good” and “bad” misplaced.

                      As an alternative, you might try viewing things through the lens of virtue ethics, which argues that “good” and “bad” are useful terms, but are always relative to some particular goal. In human behavior overall that goal might be called “human flourishing.” And in the case of code, that would be something like “achieving its organizational purpose without causing excess detriment.” Or more concretely, “taking online orders with an acceptable failure rate.”

                      Edit: The virtue ethics view is thus that any time you call something “good,” it means good for some particular thing, even if we aren’t always explicit about what that thing is.

                      1. 3

                        Thank you! Another entry for my “why programmers should study humanities” list.

                      1. 13

                        Clickbait title.

                        Also, there is bad code. Can we stop with moral relativism in a field where we can actually measure how shitty code is? Where we all have direct personal experience with mudballs (often of our own creation)? Where we’ve all had to pay the price for badly-written code that badly performs while doing a bad thing badly?

                        I’d rather read a “Your code is probably bad and you should probably feel bad. Here’s how to write less bad code.”

                        (And before you ask, yes, all code is varying degrees of bad.)

                        1. 6

                          There are plenty of non-programmers who write “bad” code that created something useful for them. Telling them they should feel bad is the attitude I don’t like. Certainly you can teach them some of the skills you have that they don’t, but that means explaining why they should do better… and that ends up being tied to specific circumstances. You’re not going to tell someone writing Excel macros about formal verification methods: they don’t care and they don’t have time.

                          Software is just a tool. If the tool succeeds at its purpose, then it’s a useful tool.

                          1. 2

                            non-programmers who write “bad” code that created something useful for them. Telling them they should feel bad is the attitude I don’t like

                            No one suggested doing that?

                            Code that does something useful can still be bad code, and its badness can even turn out to be costly to whomever wrote it no matter how adorable that person is.

                            You’re not going to tell someone writing Excel macros about formal verification methods

                            No one has suggested that either. I don’t feel like looking it up to confirm, but this smells like strawmanning.

                            Bad code does actually exist. Whether it achieves something useful in the real world is irrelevant to that.

                        1. 19

                          The advice in these CodeWithoutRules posts just seems so… trite? Maybe I’m biased, but they seem to be more about getting the author’s name out there than about giving well-thought-out advice. For example, the author gives an example of a company using Perl, and then goes on to say what you can do about looking tech up and talking to your manager. Is the author saying that nobody at the unnamed company has ever done this? That seems unlikely to me. IME, technology change only occurs under existential threat, not because someone thinks X will make you a bit better. The author also seems to put a lot of weight on the age and popularity of technology, not on whether the technology is actually better for the problem it’s solving.

                          In my career, I have rarely seen the advice the author gives actually work, and in the cases I can think of, it’s been due to a crisis, not a marginal improvement. Experiences vary, but the author doesn’t actually enumerate any real-world successes they have had. Maybe that’s what bothers me about these CodeWithoutRules blog posts: they seem like Feel Good Messages, disconnected from reality. But maybe I’m just cynical.

                          1. 8

                            Maybe I’m biased but they seem to be more about getting the author’s name out there rather than giving well thought out advice.

                            For the record, this is how they come off to me as well. Most of the advice reads as energetic banality, akin to a corporate pep talk, sneaking in appeals to sign up for a special publication followed by decrees to “buy my book”.

                            I may also be cynical.

                            1. 3

                              Occasionally I write blog posts about how e.g. signal and garbage collection reentrancy interact with Python threading primitives in an unfortunate way (https://codewithoutrules.com/2017/08/16/concurrency-python/).
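
                              For a flavor of the hazard that post deals with (this sketch is mine, not taken from the post): Python’s plain threading.Lock is not reentrant, so code that can run at arbitrary points, such as a signal handler or a __del__ triggered by garbage collection, must never touch a lock the interrupted code may already hold:

```python
import threading

# A plain threading.Lock is not reentrant. If code holding the lock is
# interrupted mid-critical-section (by a signal handler, or by a __del__
# run during garbage collection) and the interrupting code tries to take
# the same lock, the thread deadlocks on itself. The re-entry is simulated
# here with a timed second acquire so the example stays runnable.
lock = threading.Lock()

lock.acquire()                         # normal code path holds the lock
reentered = lock.acquire(timeout=0.1)  # "interrupting" code tries again
print(reentered)                       # False: without a timeout, deadlock
lock.release()
```

                              The usual fixes are an RLock, or keeping signal handlers and finalizers from sharing locks with normal code at all.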

                              But… I feel there’s often too much focus in programming on technical skills and too little on other, less easy to articulate skills. So mostly I try to write about those other skills. They’re harder to explain, and so I don’t always succeed at doing so, but they’re important and useful too.

                              1. 2

                                With the caveats that writing is a difficult and distinct skill with many subskills, that putting your writing out on the internet for peanut-gallery hecklers like myself takes courage and initiative, and further that productively engaging with criticism of your work is admirable: the content of these articles trends a lot more toward a first-principles-y style of communication. This works well for articles like your post on the deadlock behavior queue.Queue exhibits in some interactions with __del__, since the behavior has a concrete, single-source cause.

                                Unfortunately, human interactions are a goddamn mess that don’t respond well to reasoning from first principles.

                                If you’re invested in writing insightful posts about dealing with imposter syndrome (“You feel like your growth has been stunted: there are all these skills you should have been learning, but you never did […]”), you may be better off looking into the effectiveness of self-esteem versus self-compassion. If instead you’d like to write about effective learning strategies (“Get your technical skills up to speed, and quickly”), a discussion and extensions of work like Iran-Nejad’s Active and dynamic self-regulation of learning processes would be more appropriate. For how to work effectively in a complex sociotechnical system, Simon Sinek and Edwards Deming and Thomas Schelling and Donella Meadows are all extremely insightful authors with much to say (particularly the late Ms. Meadows).

                                That is, if your intent is to write cogently and effectively about disciplines foreign to the hard logic of a Turing machine, of which there certainly is a need in our industry, your work needs to exhibit more care and attention to prior art in those disciplines. Failing that, your work at least needs hard-won anecdotes taken from (all too often painful) direct experiences that can help readers reconstruct the tools you yourself learned from those experiences. As-is, CodeWithoutRules posts like the parent link are not substantive or grounded enough to do more than express a good understanding of the English language. Since you already have the courage and the practice of posting these, it’d be lovely to see the less technical posts mature into articles that are as important and useful as accessibly documenting unexpected concurrency pitfalls in Python.

                                1. 1

                                  Since you already have the courage and the practice of posting these, it’d be lovely to see the less technical posts mature into articles that are as important and useful as accessibly documenting unexpected concurrency pitfalls in Python.

                                  I think this is a good point. I’ve nothing against theme and topics in this post – and I do not wish to discourage work in the area – but the writing is not doing it for me. As @apy stated, it sounds too much like the words of a self-help guru.

                                  I also agree that it takes courage to put it out there.

                                  1. 1

                                    Your argument, as I understand it: I should be writing posts based on either prior research, or empiricism. I certainly agree with that, and I have e.g. done reading on the research on learning (https://codewithoutrules.com/2016/03/19/how-learning-works/).

                                    Let’s look at this particular post, then, and see how it does. It has the following outline:

                                    1. Topic: “Your skills are obsolete: now what?”
                                    2. You, the reader, find this distressing.
                                    3. Almost all old projects use old technology.
                                    4. Therefore, you can upgrade to new technology where suitable; here are some tips on how to do so.
                                    5. Not seeing technology that needs upgrading is an instance of a bigger problem: waiting for problems to be handed to you, rather than actively seeking them out.

                                    Expanding on each:

                                    1. The topic is “obsolete skills”. Not impostor syndrome, not learning techniques.
                                    2. This is a real situation. The particular sentences you quote are based on someone in this exact situation: they’re not suffering from impostor syndrome, they really are in a bad place. I have observed other programmers in this situation. @neonski has observed people in this situation (see his comment elsewhere on this page).
                                    3. Use of old technology is an empirical observation, or maybe a well-known fact given e.g. state of security updates. I could find some research to cite, but it wouldn’t add much to the article.
                                    4. This is a suggestion based on empirical observation of a skill I and many other programmers have. In particular, I and many other programmers will simply go ahead and upgrade, or suggest upgrading, technologies every day at work. Every other week I end up researching new technology on the job. I almost never learn new technology at home. From the perspective of someone who has been in technical leadership of a team, junior people saying “hey, here’s a problem, here’s a solution, can we try it?” is always great.
                                    5. This is standard engineering skill tree progression. E.g. Charisma column of https://docs.google.com/spreadsheets/d/1k4sO6pyCl_YYnf0PAXSBcX776rNcTjSOqDxZ5SDty-4/edit#gid=0 but that’s just an easily findable version. Pretty much every organization that bothers to write this down has a similar progression.

                                    It’s possible my full blog post didn’t do the best job of expressing this outline. But there’s no reasoning from first principles.

                                  2. 1

                                    So mostly I try to write about those other skills. They’re harder to explain, and so I don’t always succeed at doing so, but they’re important and useful too.

                                    The flip side is, like any self-help advice, it’s much easier to bullshit than the hardcore tech stuff. I’m sure you have good intentions but writing about stuff that is easy to make up because nobody knows the difference means you have to be all the better of an author. For me, at least, and I mean this in the most constructive way possible, it’s really hard to tell if you’re some self-help guru trying to make a name off suckers or actually have insights.

                                2. 6

                                  The company using Perl is pretty much stuck with Perl for years to come, though I can imagine it changing over a decade or so, component by component. But they also use many other technologies, and those have been upgraded over the years:

                                  1. At one point they switched from doing all communication between systems via Oracle to communicating via RabbitMQ.
                                  2. They have a web UI, so plenty of opportunity to upgrade there, though I don’t know details.
                                  3. They tended to use end-to-end tests, but at some point there was a push towards adding unit tests as well, which involved introducing new Perl libraries.

                                  No doubt many other changes as well.

                                  Those changes were all introduced by someone. Sometimes by people in charge, but I know some were introduced by lower level employees.

                                  1. 1

                                    Those changes were all introduced by someone. Sometimes by people in charge, but I know some were introduced by lower level employees.

                                    Sure, but were they introduced using the technique you suggest? IME, a lot of these happen because when nobody is looking someone just Does Something and now we’re stuck with it, for better or for worse.

                                    Just because companies do change doesn’t mean they used the method you suggest. Have you successfully used the method you suggest? If so, some first-hand experience would at least lend some strength to your statement; right now it just feels like you reasoned your way to how this must work and wrote a blog post about it regardless of whether it matches reality.

                                    1. 1

                                      Sure, but were they introduced using the technique you suggest?

                                      Talking to one’s manager and pointing out a problem? That did happen, yes: I believe the unit test library they used was proposed by someone to the team lead. And I’ve certainly done this many times.

                                1. 1

                              The troubling part to me there is pushing your personal development at the expense of your employer. The advice is to locate a problem (no shortage of problems in live systems, sure) and convince management that the new tech is a suitable solution. Mind you, management is not necessarily technical enough to evaluate the merits of it, and you at this point are not familiar enough with this new tech yourself to make an informed decision. This does not sound entirely honest, neither to yourself nor to your employer.

                                  1. 3

                                    If your manager can’t decide, and you can’t decide… nothing will ever change unless you happen to have someone with pre-existing knowledge.

                                    This is why the suggestion is to do a pilot project, not to just rip out everything and replace it. Maybe the pilot project will prove this is the wrong technology, or help you understand the use cases better.

                                  1. 14

                                    The programming language you know best has been declining in popularity for a decade.

                                    This is great news!

                                    Everyone who bought enterprise software built on that language now has to pay a premium, since developers are obviously harder to get!

                                    A large company, founded in 1997, built the initial version of their software in Perl. At the time Perl was a very popular programming language. Over the past 20 years Perl’s mindshare has shrunk: it’s harder to hire people with Perl knowledge, and there’s less 3rd party libraries being developed. So the company is stuck with a massive working application written in an unpopular language; the cost of switching to a new language is still too high.

                                    See? I can charge 20% to 200% more for Perl programming, since that’s surely going to be cheaper than rewriting it!

                                    Once you’ve identified a problem technology, you need to find a plausible replacement.

                                    Wait, why? Where “problem” is defined as “Technologies that are shrinking in popularity; some Google searches for ‘$YOURTECH popularity’ can help you determine this, as can Google Trends”? Absolutely not.

                                    I had to skip back to spot the issue: the author’s framing switched midway between me (the programmer, who knows things and is interested in job listings) and my company (which is responsible for a project, for business, etc.).

                                    I, the programmer and individual contributor, do not want to switch technologies. In fact, looking for unpopular languages and at-risk companies is a good way to become a higher earner.

                                    I, the company? This is risky advice: enterprise software that has sunk $2M in development is going to take 10 years before the 20% increase in developer salary starts to matter, and even at that point you haven’t factored in the cost of rewriting it.

                                    Okay, but what about doing it “without changing languages”, i.e. the Format.js library example? Yes, you can try the script the author suggested, but the manager is trying to justify the costs (perhaps in her head), and you’ll make everyone’s life easier if you simply justify those costs for her.

                                    1. 5

                                      I, the programmer and individual contributor, do not want to switch technologies. In fact, looking for unpopular languages and at-risk companies is a good way to become a higher earner.

                                      This is true… under certain conditions. Some technologies fade into oblivion too fast to make extra money, and some people aren’t good at marketing themselves or finding these niches. Some old technologies end up used only in niche geographies, and if you live in the wrong place you’re out of luck.

                                      It’s a viable strategy if you consciously set out to do it, and choose a tech stack with a conservative userbase that has deep pockets. Lots of people end up there by mistake, though, and don’t know how to get out of the hole.

                                      1. 3

                                        Agreed. I read this trope a lot, the COBOL programmer making $1,000/hour or whatever, but I rarely meet any, probably because of that niche issue.

                                        I do, however, meet plenty of people who spent too much effort becoming framework experts and then really struggle to adapt as technology (or fashion) moves past them.

                                        1. 1

                                          Institutional knowledge can seem dangerous because it doesn’t have the same value to your current employer that it will have with your next one.

                                          Is there nothing else you have to offer? If you know Symfony but not Magento, is there nothing you can leverage? I think this happens a lot less often than we believe, and it’s really more as itamarst suggests: a marketing problem.

                                        2. 1

                                          I’d be surprised to meet an experienced COBOL consultant who thought the grass would’ve been greener had she done C instead, and should I meet one, I don’t think their issue would look very much like a library problem.

                                          I’m not sure I buy the geography argument either: I commute internationally for work because I have a number of very niche skills, I like where I live, and plane travel is nothing compared to my day rate.

                                          Marketing is a real problem though. If I could get you a 30% higher salary, would you give me 15% of it?

                                      1. 5

                                          It’s an interesting article and most of the tips given are useful, but I disagree with the byline. Becoming REALLY proficient with your editor, learning how to use testing effectively, becoming intimately familiar with your platform of choice, and creating shortcuts that let you shave precious seconds over and over again DO make you more productive. They’re not the whole story for sure, but that’s not what the byline says.

                                        1. 9

                                          Suggest: “Technical skills alone will not make you more productive.”

                                          1. 4

                                            Yeah the title is kind of dumb, but I agree with “don’t lose the forest for the trees”. You can be super “productive” on a task that is not getting you (or your employer) anywhere.

                                            1. 3

                                              Updated the original; don’t think I can edit this submission’s title at this point.

                                              1. 2

                                                  The article was good, it just needed a better title. You fixed it. You’ve done enough. :)

                                          1. 3

                                              Sane Workweek Test: a questionnaire for companies that want to demonstrate they have a sane workweek, à la the Joel Test. Help me find a better name, before it’s too late!

                                            1. 1

                                              “Life Balance Test” “Got a Life Test” “4-40 Test” (Assuming a good metric for starting paid vacation is 4 weeks)

                                            1. 4

                                                This blog post is interesting in describing a coherent approach, but it fails to answer two important questions: better than what, and better for whom?

                                                It’s obvious that the author only gives a rough handwave towards an unstructured approach to learning a programming language and moves on to calling theirs “better”. It also becomes clear from the text that this works better for the author. For example, not everyone has the ability to learn on the job.

                                                That’s completely fine, but this is not the better way to learn a new programming language, just a way that works for this author (and most likely for many others). It shouldn’t be framed as anything else.

                                              1. 3

                                                  I give a specific set of criteria for what makes learning easier: motivation, specific goals, learning one thing at a time, not starting with a blank slate, and access to help from others. Meeting those criteria is what makes for a better learning experience.

                                                In general, I believe this is easier to achieve at work, though of course there are counter-examples. The important thing is matching those criteria.

                                              1. 5

                                                One article you might be interested in is Logging messages is perilous (lobsters link), which covers a lot of the same ground.

                                                1. 3

                                                  Yeah, Richard and I discussed this bug on IRC a while back.

                                                  1. 1

                                                    One day I should actually implement a fix. ❤

                                                  2. 1

                                                      That post was sparked by an IRC conversation with itamarst about this issue a few months ago. :)

                                                  1. 2

                                                    Decided to do the post-to-lobsters-first thing this time; would love to hear your comments before sharing further.

                                                    1. 5

                                                      Your opening is a mind-blowing example of bad science and damage from a program that I’m just hearing about. Although normally I use medical or exploding-rocket examples, I think I’ll add this one to the list of costly bugs I use when arguing for engineered software. Does anyone know an estimate of how much austerity cost Greece on top of whatever else was going on? And has anyone checked the specs and source for what produced that estimate? ;)

                                                      Far as adoption, there’s a lot of people using Matlab, Python, and R. I think investing heavily in fire-and-forget tooling that spots common problems for those languages might be a start. Then, an easy way for them to write specs of what they’re doing that can feed into other checkers, especially ones that catch data, interface, consistency, or ordering errors. There used to be CASE tools for a lot of stuff like this, with some success in the market. They mostly failed due to overpromising and being greedy. Honest, FOSS tooling might fare better.

                                                      1. 8

                                                        Greece is a little different, insofar as it had austerity forced on it pour encourager les autres (“to encourage the others”).

                                                        And it’s not helped by the fact that mainstream economics and its love of austerity are basically a form of religion as advocated by Hobbes in Leviathan: a means of keeping the hierarchy in control. It’s certainly not a science (Debunking Economics is a good book on the subject, because it debunks economics using its own intellectual assumptions; other attacks on mainstream economics are usually along the lines of “but it forgets about X or Y”).

                                                        1. 9

                                                          Mainstream economics is involved to some extent, but there’s a good deal of domestic EU politics exacerbating the Greece debacle (partly to make an example as you note, but even beyond that). Even the IMF, which is pretty well known for a certain kind of by-the-books mainstream economics and support of austerity measures, has been distancing itself from the approach taken in Greece, because IMF in-house economists don’t believe the current program will work. They attempted to push for either some kind of debt forgiveness or rolling over debt into long-dated, low-interest bonds (which is a sort of “soft” debt forgiveness), but the EU wouldn’t go for either option, because EU finance ministers from countries like Germany and Finland didn’t think they could sell it to their domestic voters. Part of the problem there is that a number of countries have invented a kind of national-chauvinist mythos where they’re hard-working, responsible northern Europeans, in contrast to those layabout, profligate southern Europeans. Once you get that kind of national superiority and morality-play in the mix it doesn’t matter what the economics say.

                                                          1. 3

                                                            The IMF would have been bankrupted if Greece had defaulted, which is part of the problem…

                                                      1. 5

                                                        Automated tests and code reviews are fundamentally different: the former enforce stability. Stability may involve staying incorrect! The latter can catch whether your code matches the spec. The spec may be wrong! See https://www.youtube.com/watch?v=Vaq_e7qUA-4&feature=youtu.be&t=63s (I really need to make a prose version of that).

                                                        What’s missing is a third kind of testing: comparing against reality, to the extent possible.

                                                        I’m about to start a job doing gene sequencing data processing pipelines, so this is the problem I will be wrestling with over the next few years. I asked about current approaches, and the answers are suggestive:

                                                        1. Do the answers make sense? E.g. if you know that 80% of RNA is a thingummy, but only 5% of your sequenced RNA data is thingummies… you probably screwed up the sequencing.

                                                        2. Do the answers match a previous, reasonable model? E.g. humans can find cell borders on a photo. Do the software’s borders match what the human found? There are some tricks you can do to make this faster, e.g. visual diffs are a neat way to combine computer processing with human image-processing abilities.
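                                                        The first kind of check can be sketched as a tiny sanity test (the category names, expected fraction, and tolerance here are all made up for illustration):

```python
def composition_sanity_check(counts, expected_fraction=0.80, tolerance=0.25):
    """Return True if the observed fraction of 'thingummies' is plausible."""
    total = sum(counts.values())
    observed = counts.get("thingummy", 0) / total
    return abs(observed - expected_fraction) <= tolerance

# A run matching known biology passes; a wildly different one is flagged.
print(composition_sanity_check({"thingummy": 80, "other": 20}))  # True
print(composition_sanity_check({"thingummy": 5, "other": 95}))   # False
```

                                                        The point is that the oracle is outside knowledge about reality, not a spec or a previous run.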

                                                        1. 4

                                                          Testing was originally used to prove correctness. One can also test against a known spec for stability; that’s regression testing. One can even generate tests directly from formal specs. That’s just one goal, though. Another set of strategies for automated tests assumes the spec, or the intuitive understanding of the algorithm, might be wrong; these typically look for wrong outputs or crashes (especially with fuzzers). Testing can also be used to help the developer better understand the domain or black-box implementations. Finally, testing can be layered in combination with other assurance techniques on the same properties, as a check against the failure of any individual technique.

                                                          The Wikipedia article on Software Testing lists many different types of testing with their goals if you want more information on the topic:

                                                          https://en.wikipedia.org/wiki/Software_testing

                                                          1. 2

                                                            I think automated testing against “known good” data should be able to enforce correctness as well as stability. As I understand it, Genome in a Bottle is an initiative in that direction.
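                                                            As a minimal sketch (the pipeline step and reference values are invented), a “known good” comparison is just a regression test whose reference is trusted to be correct:

```python
def pipeline(reads):
    # Stand-in for a real analysis step.
    return sorted(set(reads))

# E.g. reference calls from a benchmark dataset such as Genome in a Bottle.
KNOWN_GOOD = [1, 2, 3]

result = pipeline([3, 1, 2, 2])
assert result == KNOWN_GOOD  # correct only insofar as the reference is correct
```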

                                                            1. 1

                                                              Automated tests and code reviews are fundamentally different: the former enforce stability. Stability may involve staying incorrect!

                                                              Particularly important for scientific code. Segfaults are way better than code that runs to completion and gives you the wrong floating point numbers.

                                                            1. 7

                                                              Week two of time off in-between jobs. Last week:

                                                              1. Open source maintenance on crochet, my make-Twisted-usable-everywhere library, and eliot, my causal logging library.
                                                              2. Finished my book, “The Programmer’s Guide to a Sane Workweek”.
                                                              3. Blogging.

                                                              This week:

                                                              1. Figure out how to sell an ebook without having to worry about sales tax or VAT MOSS.
                                                              2. Queue up Software Clown emails.
                                                              3. Maybe something not involving computers? I have a cider-making kit that I haven’t gotten around to using yet.
                                                              1. 2

                                                                Remember, the sooner you start a batch of cider, the sooner you get to enjoy it!

                                                              1. 2

                                                                add = synchronized(add) doesn’t preserve the docstring either, so at worst decorators are API-neutral.
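                                                                 To make the comparison concrete, here’s a minimal sketch (the synchronized wrapper is a toy; no actual locking is shown) of how manual wrapping loses metadata exactly like a decorator does:

```python
def synchronized(func):
    # Toy wrapper; a real version would acquire a lock around the call.
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

def add(a, b):
    """Add two numbers."""
    return a + b

add = synchronized(add)  # manual wrapping; @synchronized behaves identically
print(add.__doc__)   # None: the docstring is gone either way
print(add.__name__)  # 'wrapper', not 'add'
```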

                                                                1. 2

                                                                   The point, which I will try to make more explicit, is that decorators are a wasted opportunity: the right thing could’ve been done at the language level, but instead library code is forced to fix the edge cases the language doesn’t handle for you.

                                                                  1. 2

                                                                    I can see two reasons why the current behavior is preferable:

                                                                     1. Explicit is better than implicit. If it automatically switched the docstring, it would make decorators behave differently from manually wrapping a function, which adds magic.
                                                                    2. What would be the expected behavior for functions that return decorators? For example, you don’t write @lru_cache, you write @lru_cache().

                                                                     Edit: another case is if you explicitly want to do something with the docstring, like writing a deprecated decorator. This would be a lot harder if decorators automatically used the wrapped function’s attributes.
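                                                                     A hypothetical sketch of such a deprecated decorator, which deliberately rewrites the docstring (the names here are invented for illustration):

```python
import functools
import warnings

def deprecated(func):
    """Mark a function deprecated and prepend a note to its docstring."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        warnings.warn(f"{func.__name__} is deprecated", DeprecationWarning)
        return func(*args, **kwargs)
    # Deliberately override the docstring that functools.wraps copied over.
    wrapper.__doc__ = "DEPRECATED. " + (func.__doc__ or "")
    return wrapper

@deprecated
def old_api():
    """Do the old thing."""
    return 42

print(old_api.__doc__)  # DEPRECATED. Do the old thing.
```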

                                                                    1. 5

                                                                      Responding respectively:

                                                                       1. It’s not like @something is particularly explicit. It’s a language feature that does a thing; it seems reasonable that it would do some other things too.
                                                                       2. Not sure that’s relevant? In the end you’re returning a function, and that is what gets mutated.
                                                                       3. Could’ve been solved with different syntax. The current syntax is not ideal; the decorator-factory case always confuses me :)

                                                                       More broadly: decorators were added in Python 2.4, and functools.wraps was added in 2.5. So fundamentally this use case wasn’t addressed by Python at all until then.
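                                                                       For completeness, the functools.wraps fix looks like this (again a toy synchronized wrapper with no real locking):

```python
import functools

def synchronized(func):
    @functools.wraps(func)  # copies __name__, __doc__, __module__, etc.
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper

@synchronized
def add(a, b):
    """Add two numbers."""
    return a + b

print(add.__name__)  # add
print(add.__doc__)   # Add two numbers.
```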