What I would give my proverbial left foot for is something with this post-Vim philosophy of composability/extensibility that still plays nice with the universal-ish standards (ctrl-c for copy, ctrl-x for cut, etc.) out of the box, as a first design principle. I’ve used Vim for years, poorly, because I also use a lot of other apps (e.g. Zim-wiki is my default for most everything). I guess I wish these really good configuration ideas could come without these arguably radical new ways of keyboard editing.
I am not sure what you are after. Do you want a modal editor with Windows-style keyboard shortcuts, or what is it that you are looking for?
An advantage of using macOS is that those shortcuts use the Command (Super) key, so the two play really well together. It would be neat if I could have all my apps respect a change like that on Linux, but with the different UI toolkits and no single keyboard-shortcut enforcer, this tends not to be the case.
I bet there’s an XDG standard on this somewhere.
My current middle ground is moving standard editing keys, like arrows, home, end, delete, backspace, PgUp and PgDown, to the home row of my keyboard (using QMK firmware on desktop and Kanata on laptop).
That way, I get reasonably efficient editing everywhere.
For programming, that combined with multiple cursors and syntax-aware selection gives a pretty decent code editing experience.
While I wouldn’t recommend it, I think you could bind ctrl-c/x/v to yank/cut/paste the current selection to/from the system clipboard in Helix? (Currently space y/p for copy/paste; unsure about cut to clipboard, which might be bound to space x?)
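A minimal, untested sketch of what that could look like in ~/.config/helix/config.toml (the C-x sequence for “cut” is my own guess, and command names may vary between Helix versions):
[keys.normal]
C-c = "yank_main_selection_to_clipboard"  # replaces the default toggle_comments binding
C-x = ["yank_main_selection_to_clipboard", "delete_selection"]  # "cut": yank, then delete
C-v = "paste_clipboard_after"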
It’s interesting, I think everyone goes through phases when it comes to maintaining a website, a parallel to their life:
Thanks for the talk cadey :D
I just stuff my markdown files into zola (a static site generator), copy them via SSHFS onto my host, and let nginx take over. It doesn’t look as nice as I’d like (my theme isn’t the best, and I’m still hoping for tree-sitter in zola), and nice features like stats are missing. But whatever, I’m not going to fall down the mentioned rabbit hole ;)
> you go back to the basics
What would you say are the basics? I am going through similar phases. I would like to learn from the wisdom of those who have gone through these phases and I would like to know what the final phase looks like. Would you say that writing your posts and pages in plain HTML or Markdown and rendering them with a small static site generator is going back to the basics? Or do you mean something even less complicated?
The final phase looks different to everyone, but generally, yes, it’s something where you 1. write 2. upload. My current setup is: write a txt or html file in ~/Wiki/public/, then deploy.sh (which is just rsync really). My deployment script makes sure nginx is installed and the config file is there too.
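For the curious, a minimal sketch of what such a deploy.sh could look like (host name and config paths are made up):
#!/bin/sh
# sync the site, then make sure nginx and its config are in place
rsync -avz --delete ~/Wiki/public/ myhost:/var/www/wiki/
ssh myhost 'command -v nginx >/dev/null || sudo apt-get install -y nginx'
rsync -avz wiki.nginx.conf myhost:/tmp/wiki.conf
ssh myhost 'sudo install -m 644 /tmp/wiki.conf /etc/nginx/conf.d/ && sudo systemctl reload nginx'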
Do you have another step where you add common header and footer to all pages? That’s the reason I am still using a static site generator but I guess it should be easy to solve using a simple shell script too. Like deploy.sh adding a header and footer to every file before rsyncing it. What do you do for common headers and footers?
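A simple sketch of that shell-script idea (file names hypothetical):
#!/bin/sh
# wrap every page in a shared header/footer, then rsync the result
mkdir -p build
for f in ~/Wiki/public/*.html; do
  cat header.html "$f" footer.html > "build/$(basename "$f")"
done
rsync -avz build/ myhost:/var/www/wiki/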
For me it was WordPress+MySQL. Then my bloody provider turned off the server after 10 years and, faced with the daunting task of migrating everything, I backed it all up and just never brought it back up.
There’s a 7th step where you abandon the Web for Gemini ;-P My website is now basically just a resume and a link to my Gemini capsule.
I had various systems over the years, but at a certain point I decided I wanted to take the Tumblr posts I had made about programming and separate them from the others, so I made a Hugo static site out of them, and then I started using Hugo for everything at work because I now knew Hugo really well. I think there’s definitely a lot of cross-pollination between work and personal projects.
I’m somewhere between 5 and 6. Right now I’m writing in Markdown, and I have a shell script that runs pandoc on all the markdown files to generate a static site, and I kinda like it. The only problem with it now is that it regenerates everything every time, so I’ve been toying with the idea of changing the shell script to something weird like a Makefile.
I’ve found that when I’m burning out, I gravitate toward very old school tools like I had when I was first beginning my career; I think that’s where the Makefile is coming from.
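A minimal sketch of that Makefile idea (assuming sources in src/ and output in out/; recipe lines must start with a tab), so only changed files get regenerated:
SRC := $(wildcard src/*.md)
OUT := $(patsubst src/%.md,out/%.html,$(SRC))

all: $(OUT)

out/%.html: src/%.md
	mkdir -p out
	pandoc -s -o $@ $<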
I’m running through all the steps consistently and constantly, but I never do any actual blogging. It’s fun to set things up, but I rarely feel I have worthwhile thoughts, so I give up after the hello-world post, or maybe two more.
I don’t know the performance of Pandoc, nor the amount of content you have, but my simple microblog has almost 10K entries and renders in less than 10s. I use the Perl interface to CommonMark to generate HTML.
It’s slow-ish for me right now because I’m lazy, and I’ve embedded the images as data: URLs until I can come up with something less brute-force. :) It seems like the process of turning graphics into huge data: URLs (unsurprisingly) adds a lot of time.
Woah, Dhall looks sick. The only problem with these things, though, is that JSON is pretty universally supported, while all of these have generators/validators of varying degrees of functionality. And I don’t want to shell out to some external tool to dynamically regenerate things.
I really like JSX and React’s model and Redux for state. I only dabble in web development, however. Is state-management easy in server-side react? What has been people’s experience doing React server-side with state?
SQLite has so far never let me down, and it has enabled some things that would otherwise be in the “almost impossible” category. I still regularly come across some amazing feature I had no idea about (like the session extension), and it keeps shipping big new features while always remaining 100% backwards compatible. Upgrading SQLite is a real joy: just replace sqlite3.{c,h} with the latest version, and eh yeah, that’s about it…
fish is a pleasure to use as a daily shell - very thankful that someone has taken on the very thankless task of making a new shell and actually thinking about the ergonomics first (“Finally, a command line shell for the 90s” is the perfect slogan for this)
I became a “fisher” last year and I haven’t had a reason to regret that move. I’ve contributed some missing auto-completions to it as well.
I love Fish’s ergonomics, but I actually wish it was Bash-compatible. It’s just a bit of a pain when you need to integrate scripts that don’t have a native Fish version; it’s slower and often doesn’t quite work (even with Bass). Nu is another shell to look at, with even more radical choices.
I was just about to write “you should write an article about how you use AutoHotkey” because it seems like it would be interesting
then i found https://www.hillelwayne.com/post/ahk/
I really like fish. Sadly, I use reverse-incremental search very often in zsh, and the lack of it makes fish too hard for me to use. fish has very good reverse-prefix-incremental search, but since I commonly aim to resurrect commands from history by mid-line partial match, it doesn’t work for my use-case.
I was using fish + Starship, and now I use zsh + oh-my-zsh + Starship and it’s good enough.
I use fzf to fill this hole in fish, via its ctrl-r binding.
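(For reference, newer fzf releases ship that integration directly; a sketch, assuming fzf >= 0.48:)
# in ~/.config/fish/config.fish: sets up fzf's key bindings, including ctrl-r
fzf --fish | source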
I have been using bash with the vi editing mode for years, and it was only last month that I learned that you can press / in command mode to search command history. That’s completely changed how I use the shell (I can never remember the normal history-search shortcut).
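For anyone who wants to try it, the setup is a single line (or put "set editing-mode vi" in ~/.inputrc to get it in every readline program):
# in ~/.bashrc: vi editing mode; Esc then / searches history backward
set -o vi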
FreeBSD 14 is replacing csh with sh as the default shell and the up-arrow behaviour from csh was the big feature that everyone insisted sh needed before it could replace csh (FreeBSD’s csh up-arrow searched for things in the history matching the typed prefix). I still wouldn’t choose to use either as a daily driver, but they’re now both fine for occasional use.
Impressed you are able to use bash’s vi mode. I find it rather unintuitive and prefer Emacs mode, but with C-x C-e to drop into nvim, or, when I’m in Neovim’s terminal, just going to Normal mode.
Strange because I have vim everywhere else. Just not here.
I’ve written four books, a PhD thesis, a dozen papers, and at least a couple of hundred thousand lines of code in vim. At this point, convincing my fingers that any text input field is not vim is probably a lost cause. I will probably end this post by typing :wq and then have to delete it and hit post. Even if I don’t have vim mode, I hit b to go back in the command line and then have to delete the ^B.
Yep, sqlite is the first thing that came to mind when reading this thread. I’ve done a lot of data wrangling over the past year, and sqlite has come in clutch so many times.
It pairs well with pandas!
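A tiny sketch of that pairing (file and table names made up):
import sqlite3
import pandas as pd

con = sqlite3.connect("wrangle.db")                      # hypothetical database file
df = pd.read_sql_query("SELECT * FROM raw_events", con)  # pull a table into a DataFrame
df.to_sql("clean_events", con, if_exists="replace", index=False)  # write results back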
Karabiner for me.
Karabiner-Elements is so essential to my life. I use Caps Lock as Esc, and on macOS that is impossible without Karabiner. Other remaps add a 100 ms delay before you can type after you hit Caps Lock, and that makes my constant insert/normal swaps very annoying.
I use Karabiner sometimes as well, but can’t you remap Caps Lock as Esc on a Mac without it? (System Preferences -> Keyboard -> Modifier Keys)
I don’t often want an image on Lobster.rs, but today I do.
I’m on Big Sur, and I have Caps Lock mapped to Control, but I could map it to Esc natively just with System Preferences. I can’t recall, but was Esc not one of the options for key remapping in earlier versions of macOS?
It wasn’t one of the options earlier, but the real problem is that at least some users (including me) get a noticeable delay when using the Caps Lock key for whatever purpose, i.e. if you just tap it, all keyboard input stops working for a short period. Since I have Karabiner Elements installed now, I can’t reproduce it (KE removes the delay), but it was incredibly frustrating. I think it might have stalled all input, but I can’t recall now (and I am not eager to reproduce it haha).
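(For what it’s worth, the native remap can also be set from the command line with hidutil; 0x700000039 is Caps Lock and 0x700000029 is Escape in the HID usage tables:)
hidutil property --set '{"UserKeyMapping":[{"HIDKeyboardModifierMappingSrc":0x700000039,"HIDKeyboardModifierMappingDst":0x700000029}]}'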
One small piece of software I enjoy is entr, which lets you “run arbitrary commands when files change”. It is a great tool that adapts to many workflows thanks to its single, focused function.
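A typical one-liner, for anyone who hasn’t tried it:
# re-run the build whenever a source file changes (-c clears the screen first)
ls *.c | entr -c make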
I keep losing this tool. I used to use inotifywait on Linux, but a portable command is much nicer now that I use Mac OS as well.
I use fswatch, which has a memorable name and works on Linux, macOS, *BSD, Solaris, and Windows and doesn’t depend on anything else except a C++ standard library implementation.
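A rough fswatch equivalent of the entr one-liner above (fswatch -o prints one line per batch of change events):
fswatch -o src | while read -r _; do make; done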
Thanks for the tip. I have been looking for a simple and portable command like this, and entr looks great.
See also watchexec. I’ve found it more reliable in my experience, but it was a few years ago that I switched from entr so that might have changed (I can’t remember what my issues were unfortunately).
I’ll add that I’ve written my own versions of this in powershell and Janet for use on Windows. “run commands when files change” is a game changer when it comes to general coding.
One piece of software that I’m super grateful for lately is jless. It’s crossed my radar once or twice in the past, but recently I’ve needed to navigate big json responses while testing and evaluating different APIs and it’s come in really handy.
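Usage is as simple as it sounds (URL hypothetical; jless also takes a file argument):
curl -s https://api.example.com/v1/things | jless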
For JSON, another thing I like is gron, which flattens JSON so you can do things like:
gron <file.json | grep -C 5 "roshan" | gron --ungron
On a tip somewhere here I found the idea of making a norg script that is gron --ungron, and that makes it even nicer to use.
I genuinely don’t know when it happened, but a few years ago, when I was using Mac OS on my laptop, Ubuntu on my desktop, and Windows on the same desktop for games, I suddenly realized I used all three systems without the slightest amount of context-switching pain.
Sometimes I’d be doing things simultaneously on the Mac and the Linux desktop (the latter was far more powerful for some data analysis, but our VPN solution worked better on the former), and all the gestures and shortcuts were fairly automatic. I stopped to think and was quite surprised that two such different interfaces worked together so seamlessly.
But it makes sense: IntelliJ is the same, the Terminal is roughly the same, the browser is the same. Anyway, I don’t spend too much effort on the DE now.
Company: Dexterity Capital
Company site: https://www.dexterity.capital/
Position(s): Senior Software Engineer
Location: ONSITE San Francisco, Seattle. We have offices in these two places.
Description: We’re a proprietary HFT fund trading crypto derivatives and running market-neutral strategies. You’ll be working on making sure we collect high-quality data, supplying our trading teams with good tools to trade, and doing everything that can be done to make the process of trading efficient.
Tech stack: Java, AWS+Terraform.
Compensation: Mostly in salary and bonus. We’re a small company (13 employees, 4 core engineers) so no engineer is just a brick in the wall, and likewise comp is easily adjusted up for the right person.
Contact: roshan@dexterity.capital or go right to the Greenhouse listing.
Your anchor tag for “incremental parsing library” is broken.
Thank you for the Telescope and Treesitter recommendations.
That said, git repositories only ever grow over time and nobody seems to mind too much.
On my clone of Linux from September 1st, 2020 (commit 9c7d619be5a002ea29c172df5e3c1227c22cbb41), the .git folder is 1183 megs larger than the next biggest folder, drivers:
ckie@cookiemonster ~/git/linux -> du -sBM * .git | sort -nr
1848M .git
665M drivers
134M arch
Shallow-cloning improves it but it’s still pretty big:
git clone --depth 10 file:///home/ckie/git/linux linux-shallow
ckie@cookiemonster ~/git/linux-shallow -> du -sBM .git
211M .git
I’m pretty lucky to have unlimited bandwidth and a (relatively) fast internet connection most of the time, but not everyone can afford that, and even for me, running git pull (note, not on the shallow checkout - I haven’t fixed the remotes on that one) still hasn’t finished. (Future note: internet speed randomly dropped to dialup speeds while writing this comment, so I can’t really run that test semi-reliably anymore.)
It won’t save bandwidth, but if you have a flaky or unreliable connection, it may be much easier to download a bundle of the Linux git repo (verify it, clone from it locally, and add the proper remote afterward) instead of cloning it over the internet from scratch.
https://www.kernel.org/cloning-linux-from-a-bundle.html
https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/clone.bundle
It’s not prominently listed on kernel.org, so not everyone is aware of that.
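The workflow is roughly (URL as in the links above):
# download resumably (wget -c), then verify and clone locally
wget -c https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git/clone.bundle
git bundle verify clone.bundle
git clone clone.bundle linux
cd linux
git remote set-url origin https://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git
git fetch origin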
Is the whole process faster if you download a tarball of the entire repo (.git included) and then pull up to the latest, or is a fetch-and-rebase some sort of O(n) operation in the depth of the commit tree?
gron < file.json | grep -C 5 "mything" | norg
gives me nice JSON back again, with norg defined as:
#!/usr/bin/env sh
gron --ungron "$@"
Some other smart programmer came up with it and then I stole it.
Good article. Just wanted to say I liked the author’s format:
Makes sense to always use quotes for strings in YAML. After all, sometimes you want to represent the text “9.3” and sometimes the value 9.3 and sometimes the text “False” and sometimes the value False.
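For instance:
bare_number: 9.3       # parsed as a float
quoted_number: "9.3"   # parsed as a string
bare_flag: False       # parsed as a boolean
quoted_flag: "False"   # parsed as a string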
Lots of YAML formatters will strip those quotes back off for you, which is a bit annoying, unfortunately.
Interesting. An unsynchronized un-backed-up file store is an SPOF. A synchronized un-backed-up file store is a multiple SPOF. Backing up gives you safety, automatic synchronization removes it.
This is incredible. I wonder if there is a legal effort to produce specs for Wine to have clean-room code.
Most likely, but it might be good to stay under the radar until the dust settles since Microsoft is already going after the guy who compiled the source code (see Glaeqen’s comment above).
They like to picture themselves as nice and open-source friendly. But they do not hesitate to enforce copyright on 15+ year old software.
I mean, MS aren’t selling XP any more, while books and songs still have value. I guess the most charitable explanation is that parts of this are still in Windows 10. Still, this angers my inner rms
XP is (probably) full of source code that MSFT paid other companies for and used with their permission. Even if they wanted to, they probably can’t release a working source tree of Windows XP without getting permission to do so from the other license holders. And for what? Giving people explicit permission to use a product that they no longer are interested in supporting? It’s all downsides.
> Still, this angers my inner rms
I’m pretty sure RMS would see the unauthorized release of proprietary source code as wrong and unethical.
> I’m pretty sure RMS would see the unauthorized release of proprietary source code as wrong and unethical.
Sorry, but this is my RMS, not yours
Anyway, I don’t care much really, but no-one is asking MS to support anything or give permission.
> no-one is asking MS to support anything or give permission.
Indeed not, this is just a childish prank. Anyone with a cursory knowledge about how software licensing works (both proprietary and FLOSS) will steer well clear of this.
> I’m pretty sure RMS would see the unauthorized release of proprietary source code as wrong and unethical.
I have my doubts, particularly if the binaries have been released beforehand.
Now, personally, in the case of Windows XP, and considering the number of computers that depend on it (and were abandoned when Microsoft abandoned XP), I believe the regulator should step in and actually force Microsoft to free the source code, in the name of balance of power between Microsoft and its users.
Creator rights and business rights should be protected, but not beyond what’s reasonable. In this situation, the public interest should weigh far more, and the government should act accordingly.
This would already be a compromise, an alternative to forcing Microsoft to maintain Windows XP forever. With the freed source, Windows XP users could pool their money into maintaining XP themselves.
> 15 years
No, as I actually like the EU greens’ proposal regarding copyright terms (5 years, extendable twice to 15 years by registering and paying a fee).
15 years is already plenty, in keeping with the original spirit of copyright, which was to give authors a temporary monopoly in the interest of the public domain.
With excessive copyright terms, the author gets little to no benefit, while the public domain suffers greatly.
> the EU green’s proposal regarding copyright terms (5 year, extendable twice to 15yr by registering and paying a fee)
Do you have a source for that? A cursory Google shows up nothing of relevance.
Unfortunately not. And this is easily from 5~10 years ago.
I do not know what their current stance is, nor have I seen much activity in the topic (“copyfight”, pirate parties, etc) in a long time. Which saddens me.
I do however see that the greens still seem to care about the topic.
OK, I found something related but UK rather than EU.
Yeah, that actually meant “life + 14”:
> The vision then goes on to propose “generally shorter copyright terms, with a usual maximum of 14 years”. By this, we mean that rather than the current maximum of 70 years after the creator’s death, it should only be 14 years after their death. Unfortunately, as written, this appears a bit ambiguous and has caused confusion, so it needs clearing up!
Noun-verb is much better than verb-noun. Something I’ve noticed as a Neovim user is that I tend towards visual-mode-noun-verb as a substitute for this for the most part.