Have you thought about using Netlify? It basically builds and publishes the site for you, and can also set up a CDN and file compression/aggregation. With a netlify.toml in your repo you can pin the Zola version and set the build command, integrating directly with GitHub.
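For reference, a minimal netlify.toml for a Zola site might look something like this (the version number and publish directory are placeholders; check Netlify's docs for your setup):

```toml
[build]
  publish = "public"
  command = "zola build"

[build.environment]
  # Pin the Zola version used by Netlify's build image
  ZOLA_VERSION = "0.9.0"
```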
I’ve been using it for a few months, although recently I’ve been moving my services to my VPS and will probably end up just writing a server-side git hook to handle this process there.
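That hook could be as simple as the following sketch (the paths and repo layout are assumptions; adjust for your server):

```shell
#!/bin/sh
# hooks/post-receive in a bare repo, e.g. /srv/git/site.git
# Rebuilds and publishes the site on every push.

WORKTREE=/srv/site-src
OUTPUT=/var/www/site

# Check out the pushed revision into the work tree
git --work-tree="$WORKTREE" --git-dir="$PWD" checkout -f master

# Build the site, then sync it into the web root
cd "$WORKTREE" && zola build
rsync -a --delete "$WORKTREE/public/" "$OUTPUT/"
```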
I started in 2011 with Ubuntu to discover what Linux is, and switched to Arch in 2012 to go further into the details.
When the distro switched to systemd as the default init, I was still eager to tinker with the system, so I searched for an alternative and finally settled on Crux, which has powered my personal computers since 2013!
As I learnt more about Linux and package management, and wrote more C, Crux offered me the simplest package-building system I could find. Even though it doesn’t ship as many packages in its base repositories as the big distros, it’s easy and quick to package things yourself: I could grab the source for an unknown program and package it in five minutes, which was the killer feature for me, even if that means compiling everything (it doesn’t take that much time for a single package, and big updates can be done overnight).
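To give an idea, a Crux Pkgfile is just a short shell fragment. A sketch for a hypothetical program foo might look like this:

```shell
# Description: Hypothetical example program
# URL: https://example.com/foo
# Maintainer: Your Name, you at example dot com

name=foo
version=1.0
release=1
source=(https://example.com/$name-$version.tar.gz)

build() {
    cd $name-$version
    ./configure --prefix=/usr
    make
    make DESTDIR=$PKG install
}
```

Run `pkgmk` in the directory containing the Pkgfile and you get an installable package; that's essentially the whole system.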
I am slowly moving toward OpenBSD (Crux is based upon its ideas), but I am not there yet.
I still dual-boot Debian for Steam when I want to play games, and my company laptop runs Debian as well, because I need it to “just work”, not serve as a lab for my ideas :)
How would you compare Crux’s initial setup to something like Arch? I did try Crux before but it was too “from scratch” for my expertise level at the moment.
Also, I get the idea of “just works” for company computers. As soon as I got employed I set up something that let me do my work out of the box without having to spend my whole first day (prob week) at work tweaking my environment.
The install is fairly easy, though I agree it is manual.
You have to generate your locales, configure network, install bootloader, … The handbook is really great for this all.
The hardest part is compiling your own kernel, though after you did this a couple times, it is fairly simple.
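For reference, the manual kernel build boils down to a handful of commands (a sketch; exact config targets and install steps depend on your hardware and bootloader):

```shell
cd /usr/src/linux-4.x.y
make menuconfig        # select drivers for your hardware
make                   # build the kernel and modules
make modules_install   # install modules under /lib/modules
cp arch/x86/boot/bzImage /boot/vmlinuz
```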
To me, the simplicity of the system outweighs the drawbacks of the manual process.
btw, i use arch.
But yeah, I started back in 2010 with Ubuntu and had been hopping distros until about 2015, when I settled on Fedora and started contributing to the local community, which was the only active one in my country at the time.
Time passed and I started using Antergos on my personal computers and Fedora/Ubuntu for servers. I liked having bleeding-edge software without wasting too much time setting up my machine (as I had a few years earlier), plus the documentation and the control over packages were more flexible.
It stayed that way until I started using Ubuntu at work (for the sake of having something more stable). A month or two ago I (finally) noticed I had to do something about my laptop running a discontinued distro, so I cleaned up my Antergos machine and ended up with something that is essentially Arch.
This week a new SSD arrived for my desktop and I installed Arch. I spent a night setting it up, and now, with the packages I need to work comfortably, it has more than 1k fewer packages than the Antergos installation had. I’d like to keep testing more setups if I had a spare machine, but since I’m already familiar with these (Arch, Ubuntu, Fedora), I’d rather not change my daily drivers for testing purposes like I did with Crux in the middle of college finals.
I mostly do PHP development for work where I use PHPStorm (makes debugging easier) but besides that my setup goes like this:
So regardless of the DE/WM I use at least 3 workspaces in fullscreen for Browser, Alacritty & PHPStorm on a regular day at work.
Also, I’m one of those people with vim-style keybindings everywhere (except the shell), so Firefox, my wm (2bwm) and PHPStorm all have their Vim mode. I only use the mouse for the browser.
I’ve started migrating my website from Jekyll to Zola and adding Netlify CMS to make content authoring easier when not on my computer.
Managed to recreate most of the functionality I wanted today but still have some to go to make the static site more “dynamic”.
Couldn’t help but notice how Brad Fitzpatrick’s commits to memcached and Go have complementary hours. Does this guy code all day?
Does he still work on memcached? I assumed that was historical…
I don’t believe he actually codes all day long; there was definitely a behavioral change from one project to the other. It could also be interesting to run this analysis over a range of dates.
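As a sketch of that kind of analysis, assuming you’ve exported commit dates with something like `git log --format=%aI`, you could bucket commits by hour of day within a date range:

```python
from collections import Counter
from datetime import datetime

def commit_hour_histogram(timestamps, start=None, end=None):
    """Count commits per hour of day (0-23), optionally limited to
    the [start, end] datetime range."""
    hours = Counter()
    for ts in timestamps:
        dt = datetime.fromisoformat(ts)
        if start is not None and dt < start:
            continue
        if end is not None and dt > end:
            continue
        hours[dt.hour] += 1
    return hours

# Example: two late-night commits and one morning commit
hist = commit_hour_histogram([
    "2019-03-01T23:14:00",
    "2019-03-02T23:50:00",
    "2019-03-02T09:05:00",
])
```

Comparing the histograms for two repositories over the same date range would show whether the hours really are complementary.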
Ever since I was a child, I’ve been a night person. I get my best work done between about 11 PM and 3 AM, with maybe an hour or two of variance on either side.
Now I have kids, and they get up early. My wife (bless her) gets up with them in the morning, but they’re still noisy and such and so it’s hard for me to stay asleep.
People told me that I’d adapt and start falling asleep earlier. Nope, just turns out I’m tired all the time.
This reminds me of a post about night owls; here’s the link, I’m sure you’ll agree with some aspects of it: The Dawning Truth About Night Owls.
I’m with you on that, even close on the timing, except maybe shifted an hour or two earlier. I would go to sleep by 3am most nights back in the day. My current position has me getting up early in the morning and leaving in the afternoon or evening. It keeps my brain in a tired fog most of the time. It was an interesting experiment, to see whether I’d adapt and to learn some new things. I’ll probably try to switch shifts, positions, or something soon, since it sucks so much.
Wow, this is exactly my situation as well. I feel you buddy!
There are some things I want to do this weekend.
Most likely start studying for AWS Associate Solutions Architect Certification
Will have to work on Saturday, but I plan to use my Sunday to get back to my writing and draft some blog posts for next week.
I’m using Jekyll and GitLab, but I’ve been considering a cleaner codebase, and maybe contributing something. Hugo is the one I’ve been thinking of.
Just got a T480, so will be setting it up and maybe go do some canopy outside.
At the moment I’m hosting my IRC client, but the plan is to host a VPN, some websites and Bitwarden as a starting point. I’ve been considering Chef to manage configurations.
I started doing this at work, and while I’ve been slower, I’ve been able to handle more complex code, be it software development or even infrastructure as code. When dealing with AWS Lambdas, SNS, S3, EC2… it becomes necessary not only to know how the services interact with each other but also to get into the logic inside each of the instances.
If anybody applies (all) these features to their project, it’ll start to feel like work. They’re good indeed, but they also add complexity for maintainers and for people who only want to contribute.
I look forward to improving my skills as a DevOps engineer by learning Terraform for provisioning, plus a monitoring stack I’m yet to define.
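As a taste of what that looks like, a minimal Terraform configuration (the region, AMI ID and instance type here are placeholder values) provisions a single EC2 instance like this:

```hcl
provider "aws" {
  region = "us-east-1"
}

resource "aws_instance" "web" {
  ami           = "ami-0123456789abcdef0"  # placeholder AMI ID
  instance_type = "t2.micro"

  tags = {
    Name = "example-web"
  }
}
```

`terraform plan` then shows the changes before `terraform apply` actually provisions anything, which is a big part of the appeal.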
I’m curious, how many of you are using Mutt as your daily email client at work? How do you cope with calendar invites, frequent HTML emails, …?
I use mutt for personal email, so calendar invites aren’t an issue for me. I also have mutt use lynx to handle the case where the sender only sent HTML (usually, if there’s an HTML section, there’s also a plain-text section). For work, I use whatever they give me; I like keeping a separation between personal and work stuff.
Do you mean invites aren’t an issue because you don’t use them or because you solved this? If so, how?
I read in another comment that it’s just HTML, and to be fair, now that I come to think of it, it’s been a long time since I had to care about mutt and calendars, so maybe it was just a dumb link to click through in the terminal browser.
I don’t use invites or calendar things via personal email, and if anyone has sent me one, I haven’t noticed.
I did start using mutt at a previous job where I had to chew through a ton of daily mail (basically, all email sent to root on all our various servers was eventually funneled to me) and I found mutt to be much faster than Thunderbird (which should indicate how long ago this was). It was using mutt for a few weeks that prompted me to switch away from elm (which really dates me).
IIRC, when I used mutt regularly, I used to have it pipe html emails straight into elinks to render them inside mutt. Didn’t need calendaring at the time.
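For reference, the usual way to wire that up is a mailcap entry plus mutt’s auto_view (a sketch; swap in lynx or w3m if you prefer):

```
# ~/.mailcap
text/html; elinks -dump %s; copiousoutput

# ~/.muttrc
auto_view text/html
alternative_order text/plain text/html
```

`copiousoutput` tells mutt to page the rendered result inline, and `alternative_order` prefers the plain-text part when one exists.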
I gave up my resistance to modern email quite some time ago; it’s simply too much hassle, personally speaking, dealing with calendaring and rich media content in email to still use a console-based MUA. That being said, I really miss the simplicity and light weight of Mutt.
Mutt was my go-to client for many, many years, and I feel tremendous nostalgia when I am reminded that it’s still actively maintained and indeed has a user base. Bravo. :-)
How many emails do you handle a day? I do about 200; though I need to read or skim them all, I only reply to about a tenth of them… but I can’t imagine keeping up with that in any of the GUI clients I’ve had. With mutt, it feels like nothing.
I’m trying to do more and more with mutt, gradually using the GUI client less. Still haven’t configured a convenient way to view html or attached images but the message editing is nice. I hook it up to vim:
set editor='vim + -c "set ft=mail" -c "set tw=72" -c "set wrap" -c "set spell spelllang=en"'
This mostly formats things correctly, and allows me to touch paragraphs up by hand or with the “gq” command. I can also easily add mail headers such as In-Reply-To if needed. In some ways my graphical client is starting to feel like the constrained one.
I’ve been using Mutt for the past 15+ years for personal email and 5+ years for work - even with Exchange IMAP (special flavour) at one point.
I mostly ignore HTML email - either there’s a text/plain part or HTML->text conversion is good enough - there are occasional issues with superfluous whitespace and it can look a bit ugly when plenty of in-line URLs are being used but these are not that common.
For calendaring I still use the web - we’re on G Suite - but I’m hoping to move to Calcurse at some point (still not sure how to accept invites, though). Bear in mind, calendar != email, and Mutt is an email client - once you accept that, you’ll be much happier :^)
I used it from 2015 to mid-2017 but ended up moving back to Thunderbird and even web clients. It wasn’t worth the effort. If I didn’t have to handle all my configs to get a decent setup (IMAP, GPG, multi-account, addresses), I’d consider using it again. I love the idea of not having to leave my term.
I use mutt daily and have my mailcap set to render html email in lynx/w3m/elinks. It’s sufficient to see if I then need to switch to a GUI mail client. For GUI, I have previously used Thunderbird with DAVmail and currently just use the Outlook client.
I use (neo)mutt as my daily personal email. HTML isn’t an issue, but forwarding attachments and dealing with calendar invites is embarrassing.
Usually I use the bounce feature to send it to my work email (Protonmail), which causes SPF-related spam flags to get set, but it generally gets the job done.
I self-host my email so the pain threshold is quite high for me to start configuring RoundCube (or whatever the kids today use) or even IMAPS.
PS. not using Google is a bit embarrassing as well, as the email and Nextcloud calendar are so disconnected, but it works better than mutt ;)
This is something you will run into on almost every SOA project. It’s very difficult to test anything without jumping to integration tests.
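One way to push back against that (a sketch with hypothetical service names) is to hide each downstream service behind a client object you can stub, so the business logic stays unit-testable:

```python
from unittest.mock import Mock

class OrderService:
    """Hypothetical service that depends on a downstream inventory service."""

    def __init__(self, inventory_client):
        self.inventory = inventory_client

    def place_order(self, sku, qty):
        # Business logic under test; only the client is faked.
        if self.inventory.available(sku) < qty:
            return "rejected"
        self.inventory.reserve(sku, qty)
        return "accepted"

# Unit test without any network calls:
inventory = Mock()
inventory.available.return_value = 5
service = OrderService(inventory)
result = service.place_order("sku-1", 3)
```

Integration tests are still needed at the boundaries, but they no longer have to carry every business-logic case.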
We’ll see. Since writing the blog post I’ve had some insights that I’ll share in what will now become a series. I think I can have my cake and eat it too.
GitHub are, of course, a company that thrives from content creators acting as sharecroppers on their centralised hosting platform. The dichotomy of “freedom to post whatever you want to GitHub” vs “OMG the Fahrenheit 451 future of Europe” is a false one, because you can post your open source project’s code to your open source project’s GitLab, Kallithea, or other instance. GitHub are downplaying that alternative so that “freedom” is recast as “the freedom for GitHub to have all your codes”.
Wouldn’t this legislation apply to Gitlab or any other alternative as well?
Wait, my hard drive can store stuff too; now we need to add copyright detection to virus scanners as well!
I can run my own GitLab; I cannot run my own GitHub. If I run my own GitLab, then I know that only my own project code is hosted on it.
And what, you don’t plan to ever collaborate with anyone? You don’t plan to ever use any open-source libraries written by others? You’re sure you aren’t going to hit any false positives? How do you think Gitlab is being built for your use? Pointing out OP’s self-interest doesn’t actually replace addressing its criticisms.
If this goes through, copyright trolls will become a thing. Get a lawyer, squat on some maximally general pattern of bits, and now projects can’t upload stuff matching it without paying you.
If he sets up public repositories people can contribute code to his repository on his own Gitlab instance.
i run my own gitlab for my software projects, people join there to collaborate or send me patches via email / pastebin.
You got the point here. GitHub is trying to stay in a grey area instead so people won’t move away from their services, “supporting” both freedom and law by passing the ball to us with their Call to Action.
They explicitly mention that for smaller players the introduction of content upload filters would be even more burdensome. And although they don’t mention it, it’s obvious that GitHub, of all companies, would have the resources to implement such a thing. So I don’t see why you try to cast this as GitHub caring only about themselves.
Besides, “listen to what’s being said, not who’s saying it”. The concern is valid and well articulated. Any attempt by copyright mongers to tax another human activity is counterproductive to progress and should be stopped.
In some startups, which have small teams, this is a difficult one to tackle, as these people tend to have a “bossy” role, share responsibilities, and won’t listen.
I feel for you :(