1. 5

You could have both as of the 1960s with Dijkstra’s methods. Cleanroom and Eiffel methods showed it too. The point being, these aren’t things that should be weighed against each other in a general platitude. They’re pretty equivalent in importance and benefit in the general case. Helping one helps the other.

1. 6

Could you drop a few links that you think would be a good introduction to what you mean?

1. 6

Dijkstra’s method was building software in hierarchical layers with verification conditions specified in them. Each layer is as independently-testable as possible. Each one then only uses the one below it.

https://en.wikipedia.org/wiki/THE_multiprogramming_system

Cleanroom was a low-defect software construction process that used a functional style with decomposition into simpler and simpler forms with straightforward control flow. It was verified by human eye, argument, tests, or formal methods.

http://infohost.nmt.edu/~al/cseet-paper.html

Eiffel combined a safer language, OOP, and Design-by-Contract to increase correctness while maintaining readability of code.

http://www.eiffel.com/developers/design_by_contract_in_detail.html
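For flavor, the Design-by-Contract style can be sketched even outside Eiffel. Here’s a minimal illustration in Haskell (the `require`/`ensure` helpers and `isqrt` are made up for this sketch, not from any real library): the contract travels with the code, and a violation fails loudly instead of silently producing garbage.

```haskell
-- Design-by-Contract sketch: a precondition guards the input and a
-- postcondition checks the result before it is returned.
require :: String -> Bool -> a -> a
require label ok x
  | ok        = x
  | otherwise = error ("precondition failed: " ++ label)

ensure :: String -> (a -> Bool) -> a -> a
ensure label p x
  | p x       = x
  | otherwise = error ("postcondition failed: " ++ label)

-- Integer square root with its contract spelled out: the caller must
-- pass a non-negative n, and the result r satisfies r*r <= n < (r+1)*(r+1).
isqrt :: Int -> Int
isqrt n =
  require "n >= 0" (n >= 0) $
    ensure "r*r <= n < (r+1)^2"
           (\r -> r * r <= n && n < (r + 1) * (r + 1))
           (floor (sqrt (fromIntegral n :: Double)))
```

The readability payoff is that the spec sits next to the implementation, so a reader learns what the function promises without reverse-engineering the body.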

Logic languages like Prolog were used to go from requirements straight to executable specs, with less room for the errors that come with imperative code. Mercury is a much-improved language that at least one company uses for this, but here’s a more detailed account from a Prolog company:

https://dtai.cs.kuleuven.be/CHR/files/Elston_SecuritEase.pdf

Finally, DSLs (especially 4GLs) were long used to solve business problems correctly with more readability and productivity than traditional languages. Hintjens at iMatix used them a lot:

https://news.ycombinator.com/item?id=11558465

So, those are places where it’s relatively easy to do and adds correctness. I mention a lot more assurance techniques with varying tradeoffs here:

https://lobste.rs/s/mhqf7p/static_typing_will_not_save_us_from_broken#c_d7gega

1. 2

Thank you so much! I’ve got some reading ahead of me now!

1. 11

If we have correctness but not readability, that means a working program that’s hard to understand. It’s likely we’ll introduce a bug. Then we’ll have a buggy program that’s hard to understand. It’ll probably stay that way.

If you focus on correctness you’ll have tools so that you don’t introduce a bug.

1. 7

Also, I have seen a lot of evidence that correctness produces readability.

1. 5

A common effect in high-assurance security in the old days was formal specifications catching errors without formal proofs. The provers were too basic to handle the specs and programs of the time. The specs had to be simplified a lot for the provers. Yet, even when they didn’t do proofs, they found that the simplification step caught errors by itself. Proofs had varying levels of helpfulness, but that technique consistently found bugs.

So, yeah, it’s easier to show something is correct when it’s simple and readable enough for the verifier to understand. :)

1. 4

Do you think it’s possible that pursuing correctness using the type system can cause less readability? If so, what does that look like? If not, do you draw distinctions between words like “readability” and “accessible”? Or perhaps even more fundamentally, do you see a distinction between correctness and readability at all?

1. 4

Do you think it’s possible that pursuing correctness using the type system can cause less readability? If so, what does that look like?

I know exactly what that looks like. They’re called formal proofs of program correctness. The specs and proofs can be hard to read although the code is usually easy to follow. The trick is, what’s easy for a human to verify is often hard for a machine to check and vice versa. So, machine-checked code might get uglier than human-checked code depending on the algorithm.

1. 4

Sure, that’s one extreme end of things. What if we back away from formal proofs and consider other things that are less obvious? For example, is it always true that adding a single additional type—which perhaps seals an invariant you care about—always leads to more readable code? What about two types? Is there a point at which N additional types actually hampers readability even if it makes certain bugs impossible at compile time than could be achieved with N-1 types?

(Perhaps if I constructed an example that would make this a bit more concrete. But I worry that the example will tempt us too quickly into red herrings.)

1. 4

Some people already say that when looking at Ada, ATS, or Idris code, but they’re usually amateurs in the language. I think the easiest example of correct, hard-to-read code is finite state machines. You can do whole programs as interacting state machines. They’re usually compact, easier to verify, and easy to optimize. Many programmers can’t read them, though, because they don’t regularly use or even understand FSMs.
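To make the FSM point concrete, here’s a toy example of my own (not from any of the systems mentioned in the thread), in Haskell: a turnstile written as a bare transition table. It’s trivially exhaustive to check case by case, but it reads as a wall of equations unless you’re used to the style.

```haskell
-- A whole component as an explicit finite state machine: a turnstile.
-- Every (state, input) pair maps to exactly one next state, so the
-- behavior can be verified by inspecting four lines.
data State = Locked | Unlocked deriving (Eq, Show)
data Input = Coin | Push       deriving (Eq, Show)

step :: State -> Input -> State
step Locked   Coin = Unlocked  -- paying unlocks the arm
step Locked   Push = Locked    -- pushing a locked arm does nothing
step Unlocked Push = Locked    -- passing through re-locks it
step Unlocked Coin = Unlocked  -- extra coins are ignored

-- Run a whole input sequence from the initial (Locked) state.
run :: [Input] -> State
run = foldl step Locked
```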

1. 4

In a lot of “proofy” languages, proofs take the form of types, and the line between “normal” types and proofs is blurry/nonexistent. So strictly speaking, “proofs can hurt readability” and “types can hurt readability” mean the same thing. Pedantry aside, look at C++: unreadable clever template types are a cottage industry.

2. 3

Are there really systems where it’s not possible to introduce a bug? Seems like a pretty hefty claim.

1. 4

You can turn (some) would-be-bugs into show-stoppers during the build process. A broken build is very easy to notice. The main difficulty with this approach is that you quickly hit diminishing returns: the amount of effort required to programmatically enforce (parts of) specifications grows much faster than the complexity of the specifications themselves.

1. 3

No. They don’t exist even in high-assurance systems. That’s why most papers in that field are careful to note their Trusted Computing Base (TCB). They say each thing they rely on to work correctly in order for their proven components to work correctly. In the component itself, errors might be introduced in the requirements or specs even if everything else is correct-by-construction from them. If external to the component, it might create interactions with the environment that cause a failure mode that was previously unknown.

Just look at the errata sheet of Intel CPUs if you ever begin to think you know exactly what will happen when a given app executes. There’s a reason Tandem NonStop had multiple CPUs per process, beyond just availability. ;)

1. 2

If we take it to the extreme and when given appropriate tools, it’s not possible to introduce a bug in the following function:

id :: a -> a
id a = a


There is a single implementation, and you can only compile that function once you have it.

The trick is coming up with a specification and translating the specification to whatever tool you’re using. It is possible to specify the wrong thing, which is where validation (rather than verification) is needed.

1. 4

If we take it to the extreme and when given appropriate tools, it’s not possible to introduce a bug in the following function:

Sure it is. Perhaps my intention was to write a function that would square the argument, but I accidentally wrote an overly general and useless function. The hardest bugs to debug (at least, for me) are the ones where I wrote code that correctly does the wrong thing.

1. 2

I wrote code that correctly does the wrong thing.

Is where:

validation (rather than verification) is needed

1. 3

The concept of verification requires a specification. Without a specification a function cannot be correct or incorrect. It might be inconsistent or misleading if for example the function name does not match the behavior.

You can argue that the type of a function is a specification and the type checker performs the verification. Unfortunately, nearly all type systems are not powerful enough to express all the specifications we desire. Exceptions would be Coq, Isabelle, and other code-generating theorem provers.

1. 3

Obviously it depends on the problem, but a lot of specifications can be quite easily captured in the type system:

• Only print HTML-escaped values when outputting HTML
• Do not allow any network requests in this block of code
• Files must be closed at the end of execution
• This field is not required

I know a lot of specifications are harder to capture, but a good percentage of bugs I encounter in the wild are not the “hard” stuff, but some relatively straightforward stuff that we know how to capture.
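As a concrete sketch of the first bullet (illustrative Haskell with made-up names, not a real library): give escaped output its own type and make the escaping function the only way to produce it. In a real module you’d hide the `Escaped` constructor behind the export list, so the guarantee holds everywhere.

```haskell
-- The only way to obtain an Escaped value is via `escape`, so raw
-- user input cannot reach HTML output by accident: `emit userInput`
-- simply doesn't type-check.
newtype Escaped = Escaped { rendered :: String }

escape :: String -> Escaped
escape = Escaped . concatMap esc
  where
    esc '<' = "&lt;"
    esc '>' = "&gt;"
    esc '&' = "&amp;"
    esc '"' = "&quot;"
    esc c   = [c]

-- Output functions accept only Escaped values.
emit :: Escaped -> String
emit = rendered
```

The bug class “forgot to escape before printing” becomes a compile error rather than a code-review item.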

Of course, at some point the spec becomes a bit of an uphill battle, but I feel like the more interesting exploration happening in “unsafe but really far-reaching” type systems like TypeScript will help us build more tools in a safe way but with a larger reach.

1. 1

Unfortunately, nearly all type systems are not powerful enough to express all the specifications we desire.

But we can use common tools such as types, parametricity, Fast and Loose Reasoning and property-based testing to specify a lot of things. Not everything - but a lot of things.

We should definitely move toward Coq and Isabelle for some systems but it’s also often possible to write bug free software without going all the way there.
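A minimal, hand-rolled sketch of the property-based-testing idea (no library needed; real tools like QuickCheck generate random cases and shrink failures for you): state the spec as predicates and check them over a family of inputs.

```haskell
-- The specification, stated as properties rather than examples.
prop_reverseInvolutive :: [Int] -> Bool
prop_reverseInvolutive xs = reverse (reverse xs) == xs

prop_reverseLength :: [Int] -> Bool
prop_reverseLength xs = length (reverse xs) == length xs

-- Enumerate a family of test inputs (a stand-in for random generation):
-- lists of lengths 0..20 starting at a few different values.
cases :: [[Int]]
cases = [ take n [k ..] | n <- [0 .. 20], k <- [-5, 0, 7] ]

checkAll :: Bool
checkAll = all prop_reverseInvolutive cases && all prop_reverseLength cases
```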

2. 2

Are there any examples that are more likely? I can’t really imagine writing the specification square :: Int -> Int and then implementing it as square n = n without blinking an eye.

1. 1

In that function, because it applies for all types (id :: a -> a reads, “for all types a, gives an a”), something like squaring the argument is excluded, because not all types have multiplication.

This is an example of the overlap between verification and types – in this case we see the degree to which type correctness constrains the implementation.

1. 8

It’s likely we’ll introduce a bug. Then we’ll have a buggy program that’s hard to understand.

So your argument that readability is better is by comparing a program that is readable but not correct to a program that is neither?

1. 1

I’m not the author of the post, but I imagine the point is not being made about programs where the system checks correctness.

1. 7

What is this trying to solve? We already have many abstract notations for mathematics, from eqn(7) to LaTeX to MathML and so on and so on. Not even to mention the raft of per-language mathematics languages.

(Isn’t asciimath the result of MultiMarkdown 2 trying to shoehorn “markdown philosophy” into mathematics before bailing back into LaTeX in subsequent generations?)

1. 2

From the looks of it, this is trying to solve the problem of the cumbersome and sometimes silly syntax LaTeX uses. If you know LaTeX and want to use it, more power to you, but there are books explaining its use, whereas this is specified in just a single webpage. It seems like a much lower bar to jump.

1. 2

I think part of @kristapsdz’s point is that there are already several options here. S/He explicitly gave three.

1. 3

I get the critique, I just don’t think there’s a problem with experimenting on stuff like this. Personally, this seems way easier to read when rendered just as ASCII (without MathJax) which is one of the things I don’t like about sites that use LaTeX in particular.

2. 1

It’s been a while since I used LaTeX but it’s not that hard to use for simpler formulas. I’d rather have one implementation than have to learn yet another syntax.

Of course the cleanest way to handle this is to just write everything in LaTeX and output to HTML…

3. 1

Now we need plain text output for eqn. This would be the real ASCII/UTF-8 math for me.

1. 2

I’ve never really gotten into emacs deep enough to find out, but is this the sort of thing that org-mode provides in emacs-world?

1. 2

That would be my guess, although I tend to just use Org mode formatted gists (created/updated via Emacs’s gist integration) to get private gists that render nicely. Org mode also has a concept of “projects” and can export them to HTML, if you want to have static HTML of your project, but I’ve never used that.

1. 13

I’m interested: does anyone else here feel this way too?

1. 20

I do. That’s why I posted it. I even tick off the stereotypes: white, male, programming since kindergarten.

The parts about attention to detail especially resonated with me. I thought of majoring in math where proofs can’t be shipped until they’re airtight. I ended up majoring in philosophy where I learned how to find all the holes in my own work before shipping and try to anticipate challenges before anyone else sees the argument. One thing I tend to think about along these lines is the fact that we lionize the trailblazers and creators without recognizing the value of maintaining and polishing work that has been roughed out to a functional state.

1. 23

I agree with her disdain for the current obsession with updating fast and pushing code without fully testing or fully thinking things through, but I don’t take it as me not belonging. I take it as the current trends are wrong, and I’m right. But that probably has to do with the fact that I’m 38 and have been coding for over 25 years, so I have a lot of confidence in my opinions being right.

1. 14

One of the biggest benefits of experience: being able to tell when people/industries are full of shit.

Tech, as a whole, is broken and stupid. It’s obsessed with new things at the expense of practices. It is fad-driven to a ridiculous degree. It is infatuated with idiotic status symbols like money and power in a vain attempt at relevance. It is complicit in the spread of harmful ideologies such as misogyny.

An alternate tech culture needs to emerge.

1. 3

These were also my thoughts after reading the article. Perhaps because I’m in the same age group as you, and have a similar level of programming experience.

1. 2

same here. i picked my current job in large part based on a quality-focused engineering culture; after a few years in a “ship features as fast as we can” type startup i was pretty much done with that segment of the industry.

2. 13

Honestly? No.

(As an aside, this whole article kinda feels like the modern version of “everybody in a certain class of people in New York is working on a novel or a screenplay or acting”. I’ve got some friends up there and it’s a common theme, the humblebrag hustle. The breathless way the author here describes talking with her boyfriend–husband now, I’m sorry–about Ajax is kinda silly and immediately opened up a particular bucket for me.)

(As an additional aside, Mrs. Yitbarek does have both a Github and has appeared on the Ruby Rogues podcast for a stint. She’s got actual involvement in the tech sector, but regrettably not a lot of obvious technical work demonstrating mastery or competency.)

My main takeaways about “this way” sketched by the article are:

• the software that is written today is only web software
• users have some deep emotional connection with the software they use, and we must avoid breaking that trust
• tech industry focuses on shipping over correctness
• users are the center of our software
• tech industry is callous towards humans
• this person who cares about “understanding” a problem is somehow super different from normal developers

I don’t really agree with any of those points.

Not only do I not agree with those points, I’m actively offended by some of them.

Acting like the only software in tech is web software is hugely wrong. It ignores the vast quantity of boring line-of-business Java and C# and VB and MUMPS software that keeps the world spinning. It ignores the vast quantity of microcontroller code in places like microwave firmware and medical imaging units and car ECUs. There is a large and thriving world, however boring and bleak, outside of web development and especially outside of the coastal startup ratrace.

Acting like our users are dependent on us and are vulnerable little snowflakes who will have a breakdown if they get a broken button is belittling and worse, helps us overstate our importance. Most users just find something else if the software is broken.

Acting like best practices are completely ignored in writing new code is insulting, especially when the same author has nothing to say on large legacy systems that are difficult or even infeasible to test. It’s easy to Monday-morning quarterback when you’re fresh out of a bootcamp and think that every system needs TDD. It’s even easier when you haven’t run into a monstrous banking COBOL blob that has 4 decades of accumulated business logic, or an embedded health IT system where it’s almost impossible to replicate the sheer crackheadedry of the production environment. Further, Mrs. Yitbarek clearly has no experience with any environment or project that does attempt to take correctness seriously, as is the case in processor design or firmware engineering or industrial automation or healthcare or avionics.

Acting like users matter is antiquated even within her own web-tech bubble, as the current best business practices involve squeezing them for all the data they’re worth and shoving ads at them. Don’t let’s pretend differently, because that’s not how the business works. It’s shitty, but it’s how startups work.

Acting like there is some culture unique to tech about exploiting users/customers is rubbish. What about healthcare, loans, broadcast advertising, clothing marketing, makeup salesmanship? That’s not us, that’s not programmers, that’s just how business works. I don’t mind a proper screed against modern capitalism, but don’t you dare tar us with that same brush, Mrs. Yitbarek. Don’t you dare lump developers and programming culture in with sociopathic MBA tricks.

Lastly, I am exceptionally disappointed and annoyed at the insinuation that everybody in tech clearly just doesn’t care to understand their problem domains. I am annoyed that she implies that she is somehow special. I am furious that she would suggest that most programmers don’t try to really grok the situations leading up to their problems, and pained that she doesn’t seem to recognize there are a lot of little problems that don’t bear full analysis.

Finally, her whole tone I disagree with. Seriously, for reference:

I do not belong. My values are not valued. My thinking is strange and foreign. My world view has no place here. It is not that I am better, it is that I am different, and my difference feels incompatible with yours, dear tech.

She should get down off the cross and leave room for people that actually need it.

If this essay had been written by a pimply-faced youth in his first year of college CS, we’d make fun of how edgy and self-serious it was, and point out the depths of his ignorance. Here, though, we are supposed to take her seriously? Please.

1. 2

Acting like there is some culture unique to tech about exploiting users/customers is rubbish. What about healthcare, loans, broadcast advertising, clothing marketing, makeup salesmanship? That’s not us, that’s not programmers, that’s just how business works. I don’t mind a proper screed against modern capitalism, but don’t you dare tar us with that same brush, Mrs. Yitbarek. Don’t you dare lump developers and programming culture in with sociopathic MBA tricks.

Don’t you think, though, that to some degree we’re morally culpable for that?

If we had such a principled stand against sleazy MBA tricks, then we could have stopped it. We could have said “No” and organized or professionalized or just not worked for people like that. It is partly our fault.

Also, there are some ways in which tech culture is worse than the regular MBA culture of our colonial masters. Misogyny is one. Say what you will about MBA-style corporate capitalism, but we dialed the sexism back up from 6.5 to 11.

Tech culture is macho-subordinate– most techies brag about 12-hour days to support their employers' bottom line, but have no courage when they see a woman being harassed out of their company– in a way that plays well into MBAs' desires, but I don’t think that we invented it. We did. And even if we didn’t, we’re still responsible for perpetuating it, and need to stop it and fight it at every turn.

2. 22

I think that it’s fairly normal. The dirty secret of this industry is that 90% of the jobs are Office Space, business-driven half-assery where programmers are seen as overpaid commodity workers (hence, software management’s fetish for boot camps and abuse of the H1-B program) rather than trusted professionals.

What seems to have changed (although, the more I talk to veterans of previous bubbles, the more I am convinced it was much this way always) is that Silicon Valley itself has ceased to be any sort of exception. The difference is that the Valley seems to be a much harder place to work. If you work in the hinterlands, at least you get to work 9-to-5. In Silicon Valley, it’s more like 9-to-9, due to the glut of boot camp grads who haven’t had their hearts broken yet, and H1-Bs who can be threatened with deportation if they don’t shut up and dance. If you’re going to get the same lame work experience in either place, why not move to a stable big company somewhere with a manageable cost of living?

Silicon Valley is good for one thing: raising capital. If you have the pedigree (read: born rich, socialized to be really good at high-end people-hacking) to raise VC and play the Founder game, Silicon Valley is the only place to do it. As for tech itself, the place is beset by mediocrity.

To be honest, I think that there probably are as many interesting companies right now as there were at any other time. The difference is that there isn’t a critical mass of them. Silicon Valley used to have that critical mass; now it’s just another cluster of rich people, a few of whom were relevant and interesting 20 years ago.

1. 1

It’s also a matter of pay scale. You don’t make in Tuscaloosa what you’d make in CA or the MA/Boston area.

1. 10

I think this gets overstated quite a bit. For one thing, the cost of living is astronomical in the Bay Area (or NYC), which must be considered when factoring salary. Also, there are plenty of other places with tech jobs - even with vibrant tech scenes, albeit on a smaller scale - where you can still make a comfortable, experience-appropriate salary and work a reasonable schedule. Places where you can make a six-figure salary as, say, a web developer with 5 years’ experience, work 9-5 or thereabouts, and be able to afford a house without selling a vital organ. Atlanta, the Denver/Boulder area, the SLC valley, Minneapolis/St Paul, and so on. I see this justification on HN a lot, like your choices are live in SF or NYC or else make $65k/year in Tulsa, and it’s just not accurate.

1. 10

In my experience, the thing that you lose by leaving a “tech hub” is access to a strong job market, especially if you’re older and seeking management or specialist roles. There just aren’t many on the ground. Adjusting for cost of living, you come out ahead by leaving the tech hubs. No question there. The problem is that if you lose your job (which happens more often, because branch offices get hit first and because out-of-hub companies are more likely to have capitalization problems) or if your team gets moved, you can get stuck having to move as well. Or you can be marooned in a job desert, because after a certain age (getting old sucks; I advise against it) the jobs you want are filled based on connections and rarely publicly posted. For as much as we bloviate about being high-tech and meritocratic, the way we do business is still very local and relationship-based, and that’s going to produce agglomeration.

1. 3

Totally agree. I live and work in Boston and love it. I could make more in SFBay, but then I’d have to live in SFBay and, as everyone outlined, pay the cost-of-living penalties. Plus, I can’t drive, so Boston is a better bet for me public-transit-wise.

1. 1

NYC would be even better for not driving, but the cost of living (mostly just housing) is higher than Boston.

2. 7

You also don’t have to spend as much in Tuscaloosa as you might in those other areas. I live in MI and probably make ½-to-¾ of what I could if I moved to SV, but considering the cost of living out there, there is no way I would uproot my family just to make a few extra bucks. I consistently find remote jobs that pay me more than enough to live where I do, and couldn’t be happier with it.

1. 4

That’s fantastic! I’m always a little nervous about betting the farm on remote work - it seems to come and go in waves. Glad to hear you’re doing great and can pay the mortgage that way!

1. 3

Oh, I surely haven’t bet the farm on remote work. I live close enough to Ann Arbor and Detroit that I can find (and have before found) “IRL” work :)

2. 6

Do I feel like there really are two different cultures? That the tech world all too often pretends to care more about correctness/understanding than we do? That a lot of people don’t belong here? Yes. Do I feel like I’m on the wrong side of the line? No.

I often find myself arguing for a more careful approach that puts more emphasis on correctness, long-term maintainability, and so on than other people seem to want to use - but fundamentally this is as a participant in a shared culture where we both agree what the success criteria are.

I applaud the author for actually acknowledging the reality of the culture as I experience it. But I fear the seeming criticism is misguided. I don’t think you can get the advantages of tech without the culture of solving problems, just as you can’t e.g. do good science if you’re only looking to confirm your dogma, or do safety-critical operations without incident under a strict hierarchical culture. It’s not just a tool but a way of life, just as e.g. the Enlightenment was a massive cultural and social upheaval, its visible fruits fundamentally entangled with and inseparable from the cultural changes. No doubt many a medieval monarch would have liked to reap the rewards without changing the social order - but such a monarch would have been entirely missing the point.

1. 3

I care more about correctness than my job typically allows me to execute. That might be the difference.

1. 1

Often I would be inclined to a higher level of correctness than my colleagues. Sometimes, due to such a disagreement, we end up making a release that’s riskier than I would have liked, and sometimes these risks are borne out as a higher-than-optimal rate of client-facing bugs. But that’s just a normal object-level mistake when it happens (and sometimes we go with my view and end up making a release that’s safer than it needs to be, and that’s also a mistake). We have a shared cultural understanding that correctness is a means to an end, and we all know what the measure of success is (short version: money), so we don’t get the bitter disputes of people with fundamentally different values.

2. 4

Yes. There are a lot of us. You are not alone!

1. 3

I know you asked for the affirmative, but I just want to say that I do not. I think everyone who WANTS to be here belongs in tech. Yes, you will have to wade through a sea of imperfection every day, no doubt, but if this career path is truly for you, you will also experience moments of unmitigated joy and utter satisfaction.

1. 4

I very much do. While the author was in journalism, my background is science and engineering. Engineering is a process more than anything, and this agile world is pretty much the opposite. I hate it! I would also love to go back to research, where things fail fast and we try lots of things, but we take pride in publishing perfection. But there’s a problem in academia, and it’s the same problem in high tech: money and ethics.

When I was in undergrad, I was told by the professional association of engineers that software people will never get that P.Eng stamp until the industry as a whole grows the fuck up. Ethics in technology also haven’t matured yet, and we are seeing this most prominently with Facebook. How are these engineers at Facebook considering their effects on human beings and society as a whole? It seems an afterthought, where the real focus is on building cool shit and getting page views and ad revenues.

1. 4

software people will never get that P.Eng stamp until the industry as a whole grows the fuck up.

There are those of us eagerly awaiting that day, but it is not here yet. We’re too enamored with building shacks to even fathom building cathedrals, and so we shy away from anything that’d help us do that.

1. 2

Just curious, is there some degree of promise of few breaking changes for newer versions that makes this version a better one to dive in on than others? The debugger is pretty cool, but the 0.x has kept me from trying Elm on a bigger project than just toying around with it.

1. 11

Right now, we’re working on tooling to automate upgrading. For example, for this release, you can use elm-format with the --upgrade flag to automatically update the syntax. Together with elm-upgrade, you can automate upgrading your deps. 0.16 -> 0.17 was a big breaking change. I foresee that any future releases will have substantial tooling for upgrades :)

1. 2

Not sure about any kind of promises. First of all, there were a lot of changes in 0.17, but the upgrade from 0.16 was not that hard. The upgrade from 0.17 to 0.18 should be much easier and pretty straightforward, thanks to elm-format. Good news: for the next few months, no major changes in Elm land. ;)

1. 6

The thesis here is “just hire random people (regardless of coding ability) and pay them all $30k a year; software will happen,” right? Does that actually match anyone’s experience?

Wouldn’t anyone who discovers that they’re decent immediately leave to make twice, three times, or four times more money? Wouldn’t you have an almost immediate dead-sea effect in which the only people staying in your company were the people you hired for their “soft skills,” who proved useless except for their pleasant small talk in meetings?

1. 3

Well, they’re not going to stay at $30K a year. The proposal explicitly states increasing their salary as they advance in the program. But yeah, I’m pretty skeptical this could ever work. First, because there are very few companies who can afford to spend two years training someone with no prior dev experience. Second, because it assumes you can teach anyone to code. Some people just won’t take to it. Third because, as you said, you have no way to incentivize them to stay once they’ve completed their training and become much more employable.

1. 1

One of the things that seemed strangest to me about this proposal was the complete neglect of the people already at a company. It takes a lot of work to get a senior developer in place without serious disruption in a small team. I can’t even imagine what bringing in multiple people without experience would do. That said, I think it’s easy to imagine there’s something spectacularly unique about programming which precludes people who want to keep a job from improving at it. I really don’t think that’s true, and it’s one of the reasons I shared this in the first place. It would definitely take a high degree of humility and patience to execute any part of this, but I think the main thrust of the post (open hiring to a broader pool of applicants) isn’t far off from something to work toward.

1. 3

This rate will increase by $10k every six months as they progress through a two-year program. These raises aren’t arbitrary, they represent the actual value the employee is providing.

This doesn’t indicate that they’ve accounted for the fixed costs of a full-time employee (hiring, insurance, unemployment, office space, equipment, benefits). These costs are the same for a raw junior dev and an expert senior dev.

This article also does not acknowledge that this strategy is going to make firing expensive and difficult.

1. 2

This was definitely the part of the article that seemed most short-sighted or at least most lacking in an understanding of business operations.

1. 1

Great to see companies thinking about short circuiting broken education and hiring systems by taking on apprentices.

Any lobsters at companies doing this? Anyone seen more formalised or long term apprentice programs, perhaps with some structured learning, mentoring, etc. ?

1. 2

I work at DNSimple, so I’m at least one lobster who’s seen this in action. ;-)

1. 5

I was really excited when I saw that a version of f.lux was available for non-jailbroken iOS devices, as I thought this was a great use of the ability for anyone to install applications via Xcode. I’d like to know which portion of the Developer Program Agreement this has violated since I’d been hoping more things like this would start being released.

1. 3

Apple does not provide an official API with the functionality they need. They used undocumented APIs which developers are apparently not supposed to use. (I wonder why Apple doesn’t just disable the use of those APIs through technical means?)

1. 1

I think this is pretty interesting, but lobsters might not be the best place to share it. Generic business content doesn’t need lobsters to aggregate it, I’d say–there are plenty of other outlets. If we want to keep lobsters’ high signal-to-noise ratio, let’s keep this off the radar.

1. 1

As far as I can tell, this seems to pretty clearly fit the tag description of “Development and business practices”. Am I missing the purpose of this tag?

1. 2

It’s not engineering-specific. Because engineering touches pretty much everything in the world, if we included everything that could be related to engineering, we’d just become reddit / digg / slashdot / HN. The reason why most of us are on lobsters is because the signal to noise ratio is pretty high. Keeping it high requires pruning.

I shouldn’t debate the specific tag (in theory, that’s what the “suggest” button is for), but if you compare it against the other things in practices, pretty much the rest are clearly related to software or hardware.

1. 2

Would you similarly raise this point on the thread of the top article about engineering salaries (also tagged “practices”), the comments of which are filled with political discussions about unions and guilds?

Communication styles, and the difficulties and risks of authentic discourse in particular, seem very pertinent to the weaknesses of engineering teams and management, in my experience.

1. 1

No, I don’t think we should police comments–I think being careful about the stories that we admit is sufficient. Note that that story is about engineering salaries.

I agree that they are, but I don’t want lobsters to turn into what hacker news turned into. We already have more than enough of those.

1. 4

I don’t think I agree with this post entirely, but it raises an important question: are programmers foot soldiers or commanders? I think most programmers consider themselves closer to commanders. They are involved in technical decisions and the creative process of programming. I think many executives consider programmers foot soldiers: they do what they are told by those above them. And being foot soldiers, it means you can blame them without worry when something bad happens, like in war.

1. 13

I don’t know, I consider myself more like a private or an NCO: I sometimes get to decide how I do something, but I never get to decide what I do.

1. 7

A large part of this dichotomy depends on the culture of the company in which the developers work. I’ve been in positions where I’ve been a foot soldier and others where I’ve been a commander without a change in job title. Mr. Martin’s point about the need for a professional board of some sort seems the most salient, given it could determine who is responsible for what, or at least give some guidance on the matter.

1. 1

The problem with a professional board for programmers is I don’t see how it would work. One of the biggest strengths of programming, according to many at least, is that anyone can do it. It’s not like being a doctor or a lawyer, where you need a license to practice. Maybe, if enough VW situations happen, we’ll be forced to get a board.

1. 1

I understand the concern, but a benefit to having a professional board is that there could be a defined meaning for what it is to be a programmer, and thus a better way into the profession than companies just taking a chance on someone. It could also define an apprenticeship program which would allow new practitioners to enter the field, even with minimal background.

1. 1

Strip off the “Test Double” and “Our Thinking” from the title?

1. 1

I’d also suggest adding the year of publication.

1. 7

I’d been meaning to contribute to Rust for a while, and this is proving to be a good way to get familiar with internals.

1. 1

Great post, Steve. I wonder if striation may actually be the thing to save us more than smoothness (I may have misunderstood the distinction, though, so please bear with me). What I mean is that it seems you (and Deleuze) paint smooth space as an ideal and striation as an evil, or at least something to be avoided. However, it is precisely striation which would allow a community to form. If a space were purely smooth, only nomads would exist; in a sense, there needs to be some fixedness and continuity (smooth striation?) to keep something together. I think that’s what you started to touch on at the end of the post when you talked about paid vs. free use of a product, but perhaps not. Anyway, thanks for a great read!