I’ve finally finished reading Seeing Like A State, which is an anarchist critique of high modernism, or the practice of trying to remake the world to adhere to a few solid unifying principles rather than embracing the important localities in a system. It’s a fantastic read, though a bit of a slog.
I’m currently reading Debt: The First 5000 Years, but am too early in for any strong impressions, although Graeber’s BBC audio documentary was very good.
[Comment removed by author]
I’m a heavy org-mode user too; one feature I use a lot is capture templates. I have one for taking meeting notes, another for recording project decisions, and another for any arbitrary notes I think are worth remembering.
The one thing I really dislike about org-mode is that I could never find a decent CLI app for it. Sometimes I just wanna query from the command line without firing up emacs. If someone has a decent utility to talk to org-mode files without org-mode, please please send it my way.
This might seem a bit obvious, but the best support for org-mode is always going to be within Emacs. So why not write whatever it is that you’re after as an eshell/emacs script that you run from the command line?
I’ve aliased vi to emacsclient -t. It launches an emacs server on the first call, and subsequent calls open almost instantaneously. Perhaps that would be good enough for you.
Some suggestions to make this easier:
grep -E '^\*+ TODO' has always been good for me as my “todo” alias.
org-batch-agenda-csv can be used along with pulling your org-agenda-files and agenda config into a separate file, and you can use that with other CLI tools fairly easily.
They’re only text files; I think state changes are the hard part without emacs.
I think the beginning of your last sentence is really the key for me. “They’re only text files”. I can script the hell out of those, as it turns out. Onwards!
Seriously though, one of my main cases is to have a way to accumulate notes easily. I always have an emacs instance running too. I think my problem might abstract itself away, in the end.
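For what it’s worth, the “they’re only text files” approach really does go a long way. Here’s a minimal sketch in Python (the file path and headline format are illustrative, not from the thread):

```python
import re
from pathlib import Path

# Pull TODO headlines out of an org file without Emacs.
# Org headlines look like "** TODO buy milk"; the number of leading
# asterisks is the outline depth.
todo_re = re.compile(r"^(\*+) TODO (.+)$")

def todos(path):
    for line in Path(path).read_text().splitlines():
        m = todo_re.match(line)
        if m:
            yield len(m.group(1)), m.group(2)  # (depth, headline)

# Demo with a throwaway file:
Path("/tmp/demo.org").write_text(
    "* TODO write report\n** DONE ship it\n*** TODO follow up\n"
)
print(list(todos("/tmp/demo.org")))  # [(1, 'write report'), (3, 'follow up')]
```

As noted upthread, reading is the easy part; TODO state *changes* are where you start wanting real org-mode.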
what kinds of queries? you can always run emacs with a one off command against a file. alias that in your shell.
see:
https://www.emacswiki.org/emacs/BatchMode
https://emacs.stackexchange.com/questions/18111/query-the-org-agenda-from-the-commandline#18119
you could also just do a simple ag or grep search as org files are all just text depending on what you want to do.
I love org-mode and consistently feel like I’ve barely scratched the surface. You can start with a really simple workflow and build it out as you see fit. Occasionally I end up in a yak shave, like trying to sync with google calendar or jira or some other part of the outside world, but yeah emacs + org-mode + (some file syncing service) has served me well.
I also use org-mode with deft.
My ~/.deft is a symlink to a folder in my iCloud drive so my notes are always synced and saved.
Furthermore, I append .gpg to the file names so my notes are encrypted with my GPG key.
Org-mode is also what I use. I’ve tried many different tools, but come back to org-mode each time - it’s worth learning emacs basics just for org-mode. For example, I love how I can get a myriad of different views and reports on how I spend my time.
Here is an example clock report from a few weeks ago. (The far-right column, unlabeled, is actually a custom calculation, my estimated hourly-rate multiplied by how much time I spent on the task. I use this to estimate how “valuable” a task is. It’s not a perfect metric, but I find it’s better than just time-spent on a task.)
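That custom column is just estimated hourly rate multiplied by clocked time; a toy version of the calculation (the rate and entries below are made up for illustration):

```python
# Hypothetical version of the clock-report "value" column:
# estimated hourly rate times hours clocked per task.
RATE = 50  # made-up $/hour estimate

entries = {"email": 1.5, "refactor parser": 4.0}  # task -> hours clocked
values = {task: hours * RATE for task, hours in entries.items()}
print(values)  # {'email': 75.0, 'refactor parser': 200.0}
```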
I try to put everything in org-mode, and the more I do so, the more organized I get and the more I get done. Like (I think) Peter Drucker said “what gets measured, gets managed.” Org-mode is my manager-self, so my engineer-self can actually get work done without worrying about which work is important.
There is no such thing as a system without dependencies, there are only organisational tradeoffs as to how they are managed.
[Comment removed by author]
abstraction is a core concept of programming.
is not the same thing as:
abstraction is the core concept of programming.
Too often someone proposes a code change that is “more abstract” without demonstrating what value it adds: Why is:
area(R) -> 3.1415 * R * R.
worse than:
-define(PI, 3.1415).
area(R) -> ?PI * R * R.
Do you think that someone is going to change the value of PI?
Forth programmers don’t bother; if they used a PI constant then they would miss seeing easy opportunities for using fixed-point arithmetic:
: area dup 5632 * * 7 / ;
Abstraction is an important concept, but it is not our goal and should never be confused with our goal: to solve business problems efficiently and correctly. We miss opportunities and make it harder to be successful if we abstract too much. Knowing when not to abstract an interface is almost more important than knowing how to abstract it.
Sandi Metz had a related and very good talk about this. The central thrust was “a little duplication is a lot less costly than the wrong abstraction”. Having internalized that abstraction is a core programming concept, we’ve all become a bit too knee-jerk about applying it instantly everywhere we could possibly fit it, and then we pay the price in long-term maintainability when it turns out our abstraction was premature and actually ends up getting in our way.
A little duplication is a lot less costly than the wrong abstraction.
I have a feeling the OP takes DRY too seriously and ends up with insane abstractions all over the place. @dwc’s assessment is accurate: use it when you need it.
Personally, I let repetition sit around until I find it’s too complicated and then I’ll turn it into an abstraction. It’s easier to find the useful patterns after you repeat yourself for a while.
Personally, I let repetition sit around until I find it’s too complicated and then I’ll turn it into an abstraction. It’s easier to find the useful patterns after you repeat yourself for a while.
This is sensible, but note there’s an element of taste here that makes following this advice difficult for junior developers (NB the language you used “too complicated” – how do they know if it’s too complicated?)
I find anything that makes the program shorter as defined by source-code bytes is a better mechanism for identifying when to introduce abstraction or any other kind of functional utility. Yes, some people want to argue about lines or words or lexemes or whatnot, but I find we can usually keep that part of the discussion (and its advantages/disadvantages) separate.
I find anything that makes the program shorter as defined by source-code bytes is a better mechanism…
That’s a reasonable metric to work from and a good way to guide junior devs.
“Too complicated” is one of those things that you get a feel for through experience. Certainly making the code smaller is a good thing, but there may be something outside of the code (like say, debugging/monitoring) that warrants a little bit of bloat. (Maybe. I would proceed with caution but not rule it out.)
Also, with letting the repetition sit, you tend to figure out what a decent abstraction would be after repeating it a few times. After thinking about what I wrote a bit more, I realized that’s the real value in waiting to implement an abstraction: actually seeing the pattern instead of imagining it.
In the first example, pi is well known. In the second case, what’s 5632 doing there, and where did it come from? But even in the first case: that specific value of pi is fairly likely to change, because it’s dropping a lot of precision.
Regardless, giving a value a name is roughly the same as giving it a descriptive comment. It’s documentation, not abstraction.
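As an aside, the 5632 seems less mysterious once factored: 5632 = 22 × 256, so the Forth word appears to compute r²·(22/7) with the result carried in radix-256 fixed point. A quick sanity check (this reading is my inference, not stated in the original post):

```python
# 5632 = 22 * 256, so "dup 5632 * * 7 /" computes r*r*5632/7,
# i.e. the area under the classic 22/7 approximation of pi,
# scaled up by a fixed-point factor of 256.
assert 5632 == 22 * 256
scaled = 5632 / 7 / 256
print(round(scaled, 6))  # 3.142857, i.e. 22/7
```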
Rubbish. If the value of “pi is fairly likely to change” you’re focusing on something other than solving the business problem correctly and quickly. As a trivial and likely example: Perhaps there’s a big friendly comment above there explaining we need an estimate of the number of screens that are only 100 pixels wide. In such a case, changing the value of PI is worse than a waste of time, it makes the program slower and perhaps less correct.
But by all means, argue with what you choose to imagine I’m saying instead of what I’m actually saying.
Awww man, but I can never remember the Erlang gregorian-seconds-to-epoch delta and I’m fed up with having to cut’n’paste it.
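If it helps, that delta can be derived rather than memorized: Erlang’s calendar module counts seconds from year 0 (a leap year in the proleptic Gregorian calendar), while Unix time counts from 1970. A sketch in Python:

```python
from datetime import date

# Days from 0000-01-01 to 1970-01-01: year 0 contributes 366 days
# (it is divisible by 400, hence a leap year), plus the proleptic
# Gregorian days from 0001-01-01 to 1970-01-01.
days = 366 + (date(1970, 1, 1).toordinal() - date(1, 1, 1).toordinal())
delta = days * 86400
print(delta)  # 62167219200
```

That 62167219200 is the same value Erlang gives for calendar:datetime_to_gregorian_seconds({{1970,1,1},{0,0,0}}).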
I mean, you could have area(R), circum(R), and volume(R), and then decide to switch to more/less precision in PI. But I’d only introduce PI once there was more than one place, preferably 3+, that would use it.
I think you’ve covered this wonderfully.
Often we want to try to remove any part of a problem that’s difficult and inevitable. Abstraction tries to address enforcing assumptions of a model while minimising the attack surface of the system’s environment. It’s a tricky solution to a hard problem, and we’ll necessarily get it wrong (there’s no such thing as a perfect set of assumptions, only a sufficient set).
Throwing away a toolkit because it doesn’t cleanly and formally solve our problem without informal knowledge and iteration is a bit like refusing to wash dishes because they’ll be there tomorrow.
Abstraction is not the set of mechanisms that allow you to define interfaces in a language, it’s the set of assumptions, implicit and explicit, that we try to enforce in our system.
So this might seem a bit glib but I mean it in all seriousness when I say it: The actor model is a bad fit when you don’t need concurrency to solve your problem.
If you can solve your problem synchronously and in a single thread, the added complexity of actors is probably not terribly useful for your purposes. All of this holds true for concurrency models in general.
With a concurrent system, you now have to ensure you don’t have problems with nondeterminism. (If your actors message a lot in both directions you can start to have issues with ordering).
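The ordering issue shows up even in a toy sketch, with plain Python threads and queues standing in for a real actor runtime (names and structure here are illustrative, not any particular framework’s API):

```python
import queue
import threading

def actor(inbox, log):
    # Minimal "actor": a thread that drains its mailbox in arrival order.
    while True:
        msg = inbox.get()
        if msg is None:  # poison pill shuts the actor down
            break
        log.append(msg)

inbox = queue.Queue()
log = []
worker = threading.Thread(target=actor, args=(inbox, log))
worker.start()

# Two concurrent senders: each sender's own messages arrive in order,
# but the interleaving *between* senders is nondeterministic.
def send(sender_id):
    for n in range(3):
        inbox.put((sender_id, n))

senders = [threading.Thread(target=send, args=(i,)) for i in range(2)]
for s in senders:
    s.start()
for s in senders:
    s.join()

inbox.put(None)
worker.join()

# log now holds all six messages; per-sender order is guaranteed,
# the global order is not.
```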
With the actor model in particular, there’s also additional complexity in error handling specifically. “Let it Crash” is a good philosophy, but it’s important to determine which errors should be fatal for an actor and which ones are recoverable.
Everything I’ve said about concurrency is doubly true for distribution: if you’re building a distributed system with the actor model you get a lot of benefits that can reduce the difficulty of working in distributed systems, but you’ve also taken on all of the complexity of a distributed system.
I would like to point out that “Let it Crash” is not inherent to the actor model. It is an aspect of certain implementations, including Erlang, the most popular one.
I think there’s another reason Clojurists have macro allergies that we’re less inclined to talk about:
We don’t have very many tools for understanding our code while it’s running. Good debuggers are relatively recent, so many of us rely primarily on two things: tests and stack traces.
Stack traces are ugly, but we’ve learned to make the most of them.
Code gen, of any kind, runs the risk of making stack traces harder to read, so it’s worth evaluating a DSL based on how much it muddies the waters of our program.
So yeah, they provide additional complexity, but most of us have tried at some point to make use of macros and I don’t try to avoid them because of a design principle so much as I avoid them because they sometimes hurt.
I haven’t written much CL or Scheme so I can’t say whether this applies to those languages, but I think the hosted nature of clojure probably makes it hurt a little more? Hard to say.
I wish this were true, but in practice I don’t see “will this make stack traces worse” factoring into people’s decisions much. Real discussion about improving stack traces tends to get drowned out in the noise between “this is just unbearably bad” and “no everything is fine you just need to suck it up”.
Yeah, I think I’ve seen this in a lot of languages I work in, too. I’ve seen similar issues with Ruby metaprogramming (where an entire file is generated by macros and there’s no obvious way to ‘read the code’, mostly just to avoid typing).
I think obscuring our reasoning with a more general statement that it ‘adds complexity’ doesn’t help convey why macros bug us.
And you’re right about the results. I think this happens a lot with ‘developer discipline’ issues where the two extremes are to argue for better tooling or to tell newer people to ‘just learn it’.
I’d be super interested to hear other people’s accessibility setups!
Was one of those guys that thought that simply mapping Caps Lock -> Ctrl would be enough to save my hands from Emacs. After 10 or so years of it my pinkies started to give out. Looked into other keyboard layouts, found that many of them (such as Dvorak) actually increased stress on the pinkies because they optimized solely for travel distance. Eventually went with carpalx’s QFMLWY layout, switched to the Kinesis Advantage which has all of the modifiers under the thumb, and completely re-did all of my Emacs keybindings. Also own Kinesis foot pedals but I don’t use them anymore.
Took me about a month to adjust. Used to type 150 WPM in my prime, would say I type about 110 WPM now. Still going strong after 6 years.
Fairly extensive. One of my favorite ways of procrastinating is messing with my editor, or brainstorming ideas for new editors/editing paradigms. So before the pinky crisis happened I already had some helper functions for overriding bindings in my init.el and had done some experiments with my bindings.
The Kinesis has arrow keys conveniently located under my index and middle fingers so that freed up C-n, C-p, C-f, and C-b. I used keyboard-translate to free up C-i, C-m, C-[, and C-] which are usually unavailable because of terminal constraints. I only use Emacs in GUI mode so that wasn’t a big concern for me. The core of the remapping process involved writing down all of my favorite commands, printing out the Kinesis layout with the QFMLWY keys, and manually mapping things so that they were convenient to reach w/o my pinkies, using mnemonic sense as a tiebreaker.
Some examples: Instead of C-/ for undo which is painful for me, I use C-u. TAB (completion, snippets) is another painful pinky key so I use C-a. C-h (help) is remapped to C-i.
I also avoid bindings that involve releasing modifiers, like C-x o (switch window). I don’t have that kind of precision when using the foot pedals. Anything like that gets remapped to be completely w/o modifiers or completely with. So C-x C-o is ok. I went with C-b. I don’t use the foot pedals anymore, but I’ve grown fond of the rule, and of C-b in particular, so I haven’t changed any of the original remaps, though I’m no longer as vigilant about modifying new minor modes.
Have you tried spacemacs, or been at all inspired by using the spacebar as a leader? The symmetrical nature of the spacebar as leader is really nice. Also, vim bindings are much, much more ergonomic than the default emacs bindings.
Slightly amazed you didn’t catch that, because Aliant doesn’t have v6 yet on residential lines. Or are you on Rogers, or a v6 tunnel?
I guess it’s well covered but I think that this particular viewpoint can go pound sand.
“Oh sweetie you shouldn’t do this because you’ll have a bad time” is easy shorthand for “our culture isn’t built for you and isn’t about to change that”.
It precludes the possibility of late bloomers, people who do something for a while and find a reason to love it.
It embraces the philosophy that people should Do What They Love, which I don’t think is a real possibility for anybody who needs a roof and a meal.
“Your vision won’t matter” is a valid criticism of most jobs.
The author identifies a lot of current problems with the industry, I just think we should work towards the opposite conclusion.
I remember seeing this a while back. Is it actually still actively developed? I see a recently merged PR for the stdlib, but other than that it doesn’t look like there has been much activity lately. Given that the docs say it is in a “pre-alpha” stage, I assume that this isn’t a case of “no commits because the software is ‘done’,” but rather just a dev’s loss of interest in the project.
The author of it posted an update in the HN discussion a few days ago. Excerpt:
I put about a year of work into this language, and then moved on about a year or so ago. One of the biggest reasons for my doing so is that I accomplished what I was looking for: a fast lisp that favored immutability and was built on the RPython toolchain (same as PyPy). But in the end the lack of supporting libraries and ecosystem became a battle I no longer wanted to fight.
[…]
Some other developers have commit rights and have pushed it along a bit, but I think it’s somewhat a language looking for a use case. And these days ClojureScript on Node.js could probably be made to handle most people’s needs.
Not entirely sure, I had a bug fix approved. I suspect for major feature development there’d have to be some IRC legwork.
I’ve finished a bit of an introductory series about error handling and core.async and now I’m taking a crack at process control and some actor-esque stuff. There’s not much there yet but I’ve named it extra. Not quite actors, but trying to be. More code, docs, etc to follow.
For reference, re: the title http://m.youtube.com/watch?v=4fG7LzTDmJM
More about changes in the basketball metagame: http://fivethirtyeight.com/features/stephen-curry-is-the-revolution/
Is Curry the product of changes in training, athleticism, and strategy? Is it possible to build more players like him? If a team focuses on draining threes, what’s the most viable response to that?
This is a well-written thoughtful article, but I would like to add a rule.
Most of these are excellent pieces of advice, and practices I generally try to adhere to. I don’t use schema as much as I used to, but timbre is fantastic, clojure.test is great, and all of the core points are excellent. I’ve never had cause to use agents or STM, and atoms are best used when enclosing some stateful function.
core.async is also great, but not without caveats, and it also has some strong footgun potential. I’ve written a little bit on the subject here and here, with more coming, but the tl;dr is that core.async will swallow exceptions by default and there are a few other gotchas.
This is a good write-up. Because Clojure is so flexible, but also because it’s dynamically typed (which is good and bad), I often find it takes a bit of mental legwork to get to the point where I feel like I’m using it correctly. I also haven’t been using it continuously or following the community, and best practices change so fast. It’s great to be able to get some insight from someone who’s been using it in production for 7 years.
Has anyone (I imagine the OP hasn’t) tried out Typed Clojure? How has that worked out for people?
FWIW, the author of the piece is the (a?) founder of CircleCI, which had one of the most extensive integrations with Core.typed in existence (I believe) up until last fall: https://circleci.com/blog/why-were-no-longer-using-core-typed/
Typed Clojure is cool, but it has a ways to go. Gradual typing is a work in progress at present, and I haven’t played with the stuff that’s in development right now. Typing a library for internal consistency works pretty well, but dealing with inference on Other People’s Code is more difficult.
Matthias Felleisen gave a talk at Clojure/West about the benefits of type systems in dynamic languages and outlined a few of the shortcomings of core.typed.
That said, Schema is a nice happy medium, but necessitates that you pair schema definitions with tests most of the time (and/or, run validation while you’re developing). Runtime validation is still pretty useful, and thinking about the shape of your data is always a benefit.
I was wondering. Continue to stand by decision to pass on the industry and to recommend young programmers to do the same.
I’d rather make a business dude rich and go play with my dogs at 5 pm. That’s not mediocrity, that’s knowing a video game doesn’t change the world.
Even if it did change the world, your life is important too. I like what I do, it feels important to me. But I only put in 40 hours a week, most of the time, and I stand by that.
Smart and well balanced employees do significantly better work long-term than ones churning out code 16 hours a day.
I’ve been looking for a job, and while there are a lot of postings in tech, I don’t see that many outside the industry. Most of my contract work up to now has shown me that people outside the tech industry aren’t really aware of the costs, with most offers being orders of magnitude lower than what I see in tech companies.
And I’d love to be able to work with the outdoor companies I’ve grown to know here in Quebec.
If you know people in a non-tech industry who have a problem that can be solved with software and are willing to pay significant money for it to be solved then maybe you don’t need to work for them for them to pay you. You might be sitting on one of them “sales channels”.
Continue to stand by decision to pass on the industry and to recommend young programmers to do the same.
So where do you tell them to work? There don’t seem to be very many meaty jobs, from a programmer’s perspective, outside of the technology industry. Or do you suggest that they be hobbyist programmers only and use their technical skills to segue into pre-executive jobs?
I ask this because I think there’s mutual benefit in helping great programmers get out of the technology industry while still using their skills. It’s not good for society to have the best programmers in one industry, and it’s an industry that tends to take us for granted and to treat us poorly. The thing is: I don’t know how to go about it, much less solve the problem at scale. I suppose that it would require a fleet of agents who act as tech-industry exit consultants.
No, of course not, but I thought you might have something useful to say, or some insight. That’s why I asked you the question. Apparently, I was wrong.
It’s how most technology managers/executives and almost all of the VCs think. OP is just uncouth enough to express things that others would never say (such as the disgusting gendered shit that assumes that programmers burn out because of “wives/GFs”) in public.
Even when it’s true that “everyone thinks X, he’s just honest enough to say it”, that act of saying X makes things worse since it further normalizes the situation.
The Moldbug controversy ties in to #2. His views are disgusting. That said, the reason there’s such a push to remove him from the conference circuit isn’t just that his politics are awful, but also that he’s accessible. The billionaires who run Silicon Valley largely share his views. They just aren’t stupid enough to get caught. (“Mencius Moldbug” was a pseudonym that got doxxed.) Also, enough people want their money that they can get away with pretty much anything, just like Trump said about his hypothetical 5th Avenue murder.
I certainly don’t wish to excuse bad behavior or exclusionary viewpoints. I just wish there was more consistency in it. I probably wouldn’t invite Yarvin to speak if it were my conference. But how many people would turn away a billionaire venture capitalist who dislikes the 19th Amendment (Thiel), or who thinks liberals are capable of a Kristallnacht (Perkins)? I’m guessing that most of the tech industry wouldn’t.
[Comment removed by author]
It’s hard to know someone’s intent and, in that light, it’s hard to know what his true views are.
His self-presentation is of an intellectual who simply isn’t willing to reject monarchy, slavery, white supremacy, and other unfashionable (in many cases, because they are bad) political institutions out of hand. Just as there are useful alternative logics (e.g. non-Euclidean geometry, intuitionist logic, non-ZFC set theories) I suppose he is trying to start from first principles, with no assumptions, and derive an alternative politics. At least, that’s how he wants to present himself: a free thinker on the right, unconstrained by conventional humanist assumptions.
Part of the problem, I think, is that he’s either disingenuous or sophomoric. For example, he claims that Europeans preferred African slaves because they were “better adapted” to slavery than Native Americans. In fact, they were only better adapted to the Southern climate (35 C summers, high humidity) because the Natives are descended from Northeast Asians (hence, the most successful Mesoamerican civilizations were at altitude). He’s remarkably willing to accept bad ideas, and his reading of history is superficial and weird.
He might be an obscurantist “dog whistle” racist. He might just be (as you suggest is possible) coming off as aloof and lacking empathy. He’s certainly contributed to an ideological movement that harbors actual racists. Also, slavery is outright evil regardless of whether it’s racially based. (African-American slavery was an especially disgusting brand of it, but slavery has existed since antiquity, and exists today, in a variety of formats.)
I’d feel differently if he disavowed Moldbug. Look, I’ve created (more in jest than toward any serious effect) offensive internet characters. If he said, “I was full of shit back then”, I’d like to believe that many people would forgive him. However, he hasn’t. Even worse, he claims to have named his daughter after a pro-slavery man-of-letters, Thomas Carlyle. That just makes me ill.
[Comment removed by author]
Does it not make sense that the guideline is individual to each conference? I have a really hard time buying the exclusion or lack thereof of Yarvin as the first step in a slippery slope. Strange Loop removed him, pure and simple, no discussion, and Everything Was Fine. Lambdaconf didn’t, some people pulled out, and the show will still go on. Conferences have a right to their rules and their attendees and supporters also have that right. That we fight about it on the internet is no indication that anyone is winning anything from social pressure.
William Shockley co-invented the transistor. He won a Nobel Prize. After that he became an outspoken advocate for eugenics and some pretty brutal racist policies. His SPLC file is here. He suffered, personally and professionally, for his views. He lost friends and colleagues over these views.
My own opinions about Yarvin aside (I think Moldbug’s ideas are repugnant, and dressing up plain-ol' bigotry with equivocation doesn’t make them any less repugnant), I find it highly unlikely that conferences will standardize in this regard. If a large enough group of people disagrees with how conferences are handled, there’s nothing stopping them from hosting whomever they’d like.
Splitting hairs about what we individually find appropriate is our own business. Do we let a Klan member or donor speak so long as they don’t bring their robes? Who cares? I think it’s important for each of us to determine what we think we should support and act accordingly. If a lot of us feel that way about a person, they won’t have a platform at conferences. There’s nothing about that process that’s undemocratic or unfair.
Strange Loop removed him, pure and simple, no discussion, and Everything Was Fine.
The nature of slippery slopes is that they don’t show up immediately.
That we fight about it on the internet is no indication that anyone is winning anything from social pressure. I think it’s important for each of us to determine what we think we should support and act accordingly. If a lot of us feel that way about a person, they won’t have a platform at conferences. There’s nothing about that process that’s undemocratic or unfair.
If it’s the democratic decision of the conference attendees to exclude someone I’m fine with that. If it’s some vocal/famous people on Twitter whipping up a mob of people who aren’t even going I’m a lot less fine with that, and that I do think is undemocratic.
By the first quarter I was thinking, this has got to be satire… But reading his bio makes me understand why.
Continuing my series on core.async error handling with the greater goal of working into some more general work on error handling in concurrent systems. It’s going well!
I just finished a staycation that involved a lot of noodling around, some programming, and a LOT of self-administered cognitive behavioural therapy, which I think went pretty well.
https://www.microsoft.com/en-us/research/publication/orleans-distributed-virtual-actors-for-programmability-and-scalability/
The paper is here and I quite like it.