This protocol is an alternative to ActivityPub/Mastodon/the fediverse, as it gives the user control over their data and social graph instead of tying them to a particular instance. Plus, it’s super simple.
I have a few small deployments on NixOS servers: I git push and it does a nixos-rebuild to the new system state. If there is something wrong with the deploy, I do a nixos-rebuild switch --rollback. The entire system state is immutable, so mutability isn’t a concern. Each deployed app can have a different pinned version of nixpkgs, so apps on the same machine don’t interfere with one another.
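A minimal sketch of what such a push-to-deploy hook can look like, assuming a bare config repo whose post-receive hook checks the pushed tree out into /etc/nixos (the paths and branch name here are assumptions for illustration, not necessarily the setup described above):

```shell
# Generate a post-receive hook for a hypothetical bare config repo.
set -eu
mkdir -p hooks

cat > hooks/post-receive <<'EOF'
#!/bin/sh
set -e
# Check the pushed tree out into the system configuration directory,
GIT_WORK_TREE=/etc/nixos git checkout -f master
# then switch to the new system state; if the deploy is bad, recover
# with `nixos-rebuild switch --rollback`.
nixos-rebuild switch
EOF

chmod +x hooks/post-receive
```

Because every generation is kept, a bad push is recoverable without re-deploying anything.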
For personal projects or greenfield projects, I would definitely push for NixOS, which I’m using myself on my desktop. Nonetheless, it’s harder to introduce in a more enterprise and regulated environment because the knowledge required to operate it is so different, not to mention that many big companies want to buy commercial support for their OS.
It’s pretty simple: the derivation takes a nixpkgs argument that defaults to a known-working version. Example: https://github.com/jb55/gdk_rpc/blob/nix-build/default.nix#L1. It can always be called with any other nixpkgs as well if you don’t want to use the pinned version.
I’m running .NET Core on some NixOS servers for one of my clients. It works surprisingly well! It’s nice that I don’t need to manage a set of Windows servers anymore and can just focus on my Linux ops/infrastructure.
You say that, and it’s fine for some sites, but a lot of them have anti-adblock scripts baked in alongside the site logic. The only way you’re going to work around that is with redirect rules, like what uBlock Origin does. It also isn’t possible to do annoyance removal, like getting rid of fixed banners, using DNS.
To me, attempting to get blanket web-wide annoyance removal feels like freeloading. That’s not why I block ads. It’s my prerogative to avoid privacy invasion, malware vectors, and resource waste; if the site owner goes to lengths to make it hard to get the content without those, that’s their prerogative, and I just walk away. I’m not going to try to grab something they don’t want me to have. (The upshot is that I don’t necessarily even use an ad-blocker; I simply read most of the web with cookies and JavaScript disabled. If a page doesn’t work that way, too bad, I just move on.)
I figure that living in an information desert of my own making is not a very effective form of collective action. There simply aren’t enough ascetics to make it worth an author’s time testing their site with JavaScript turned off. And if it isn’t tested, then it doesn’t work. If even Lobsters, a small-scale social site that you totally could’ve boycotted, can get you to enable JavaScript, then it’s a lost cause. Forget about getting sites with actual captive audiences to do it.
People need to encourage web authors to stop relying on ad networks for their income, and they need to do it without becoming “very intelligent”. An ad blocker that actually works, like uBlock Origin, is the only way I know of to do that; it allows a small number of people (the filter list authors) to sabotage the ad networks at scale, in a targeted way.
Thank you for bringing up Mr. Gotcha on your own initiative, because that sure feels like what you’re doing to me here. “You advocate for browsing with JavaScript off. Yet you still turn it on in some places yourself.”
That’s also my objection to the line of argument advanced in the other article you linked: “JavaScript is here. It is not going away. It does good, useful things. It can also do bad, useless, even frustrating things. But that’s the nature of any tool.” I’m sorry, but the good-and-useful JavaScript I download daily is measured in kilobytes; the amount of ad-tech JavaScript I would be downloading if I didn’t opt out would be measured in at least megabytes. That’s not “just like I can show you a lot of ugly houses”; it inverts the argument to “sure, 99.9% of houses are ugly but pretty ones do exist as well, you know”. Beyond that, it’s a complete misperception of the problem to advocate for “develop[ing] best practices and get[ting] people to learn how to design within the limits”. The problem would not go away if webdevs stopped relying on JavaScript, because the problem is not webdevs using JavaScript; the problem is ad-tech. (And that, to respond to Mr. Gotcha, is why I enable JS in some places, even if I mostly keep it off.)
In that respect I don’t personally see how “if you insist on shovelling ads at me then I’ll just walk away” is a lesser signal of objection than “then I’ll crowdsource my circumvention to get your content anyway”. But neither seems to me like a signal that’s likely to be received by anyone in practice anyway, and I think you operate under an illusion if you are convinced otherwise. I currently don’t see any particularly effective avenue for collective action in this matter, and I perceive confirmation of that belief in the objectively measurable fact that page weights are inexorably going up despite the age and popularity of the “the web is getting bloated” genre. All webbie/techie people agree that this has to stop, and have been agreeing for years, yet it keeps not happening, and instead keeps getting worse. Maybe because business incentives keep pointing the other way and defectors keep being too few to affect that.
Until and unless that changes, all I can do is find some way of dealing with the situation as it concerns me. And in that respect I find it absurd to have it suggested that I’m placing myself in any sort of “information desert of my own making”. Have you tried doing what I do? You would soon figure out that the web is infinite. Even if I never read another JS-requiring page in my life, there is more of it than I can hope to read in a thousand lifetimes. Nor have I ever missed out on any news that I didn’t get from somewhere else just as well. The JS-enabled web might be a bigger infinity than the non-JS-enabled web (I am not even sure of that, but let’s say it is), but one infinity’s as good as another to this here finite being, thank you.
But neither seems to me like a signal that’s likely to be received by anyone in practice anyway.
I, personally, can handle a script blocker and build my own custom blocking list just fine. I can’t recommend something that complex to people who don’t even really know what JavaScript is, but I can recommend uBlock Origin to almost anyone. They can install it and forget about it, and it makes their browser faster and more secure, while still allowing access to their existing content, because websites are not fungible. Ad networks are huge distributors of malware, and I don’t mean that in the “adtech is malware” sense, I mean it in the “this ad pretends to be an operating system dialog and if you do what it says you’ll install a program that steals your credit card and sells it on the black market” sense. I find it very easy to convince people to install ad blockers after something like that happens, which it inevitably does if they’re tech-illiterate enough to have not already done something like this themselves.
uBlock Origin is one of the top add-ons in Chrome’s and Firefox’s stores; both sites indicate millions of users. Ad blocker usage is estimated at around 20% in the United States, 30% in Germany, and in that range in other countries, while the percentage of people who browse without JavaScript is around 1%. I can show you sites with anti-adblock JavaScript that doesn’t run when JavaScript is turned off entirely and so can be defeated by using NoScript, indicating that they’re more concerned about ad blockers than script blockers. Websites that switched to paywalls cite lack of profitability from ads, caused by a combination of ad blockers and plain old banner blindness.
Don’t be fatalistic. The current crop of ad networks is not a sustainable business model. It’s a bubble. It will burst, and the ad blockers are really just symptomatic of the fact that no one with any sense trusts the content of a banner ad anyway.
Oh, absolutely. For tech-illiterate relatives for whom I’m effectively their IT support, I don’t tell them to do what I do. Some of them were completely unable to use a computer before tablets with a touchscreen UI came out – and still barely can, like having a hard time even telling text inputs and buttons apart. Expecting them to do what I do would be a complete impossibility.
I run a more complex setup with minimal use of ad blocking myself, because I can, and therefore feel obligated by my knowledge. And to be clear, for the same reason, I would prefer if it were possible for the tech-illiterate people in my life to do what I do – but I know it simply isn’t. So I don’t feel the same way about those people using crowdsourced annoyance removal as I’d feel about using it myself: I’m capable of using the web while protecting myself even without it; they aren’t.
It’s a bubble.
I’m well aware. It’s just proven to be a frustratingly robust one, quelling several smaller external shifts in circumstances that could have served as seeds of its destruction – partly why I’m pessimistic about any attempt at accelerating its demise from the outside. Of course it won’t go on forever, simply because it is a bubble. But it’s looking like it’ll have to play itself out all the way. I hope that’s soon, not least because the longer it goes, the uglier the finale will be.
And of course I would love for reality to prove me overly pessimistic on any of this.
Since I use NixOS, I’ve added a little script to my configuration.nix file which, when I build/upgrade the system, downloads the latest version of these scripts, pulls the source domain out of each entry, and writes an /etc/hosts that sends them all to 127.0.0.1. That way I don’t have to manually keep track of domains, but I also don’t have to worry about phishing, since the worst that can happen is that legitimate URLs (e.g. a bank’s) get redirected to 127.0.0.1 and error-out.
For anyone interested in implementing this without pi-hole, I have a couple of scripts on GitHub which might help. I adapted them from the pi-hole project a while back when I wanted to do something a bit less fully-featured. They can combine multiple upstream lists, and generate configurations for /etc/hosts, dnsmasq, or zone files.
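The core extraction step such scripts perform can be sketched in a few lines of shell (the file names and sample list entries here are made up for illustration, not taken from the linked repo): take upstream blocklists in hosts-file format, keep only the domain column, and emit a deduplicated fragment that points everything at 127.0.0.1.

```shell
set -eu

# A fake upstream blocklist; real ones come from the pi-hole sources.
cat > /tmp/upstream-list.txt <<'EOF'
# comments and blank lines are ignored

0.0.0.0 ads.example.com
127.0.0.1 tracker.example.net
0.0.0.0 ads.example.com
EOF

# Keep the domain (2nd field), re-point it at localhost, dedupe.
awk '!/^#/ && NF >= 2 { print "127.0.0.1", $2 }' /tmp/upstream-list.txt \
  | sort -u > /tmp/blocked-hosts.txt
```

The resulting fragment can be appended to /etc/hosts or fed to a dnsmasq/zone-file generator; because upstream lists disagree on whether to use 0.0.0.0 or 127.0.0.1, normalizing the first column also makes combining multiple lists trivial.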
I wasn’t happy with the current state of calendars that require a mouse. I also love timeblocking, so I wanted to make a calendar that felt like a text editor and was easy to timeblock with.
He very, very briefly mentions Jai for future game development, if the compiler is released. Looking over the doc he links to about Jai, I’m surprised that Myridian hasn’t caught on with game dev (yet).
You make it sound like he’s some fitness guru or something. His self promotion is dozens of hours of video of him explaining decisions behind the languages, and demoing the compiler and a game written in the language.
He’s also paying people (at least one at the moment) to work on the compiler, with money from his successful game dev career, while a lot of interesting languages are hobby projects.
I’m automatically skeptical of anyone who prefers to explain things through video rather than in written form - that sets off my guru-dar. IIRC he was a vocal figure who seemed to be deliberately courting controversy when he was pushing the game that made his name?
He mentions in his videos that he’s doing it for fun and as an experiment. I enjoy watching them; I’ve learned a lot about low-level optimization and new ideas in compiler design for gamedev. It’s a different kind of learning experience than just reading a blog post or a book. It feels like pair programming with an expert in their field. Watching how they decide to tackle, stumble through, and solve a problem is something you can’t get with other mediums.
Yes. I run a few dozen servers on NixOS for a contract, and a few for personal use. I know of at least one group that launches tens of thousands of servers a month with NixOps and NixOS. I’ve begun introducing Nix (the package manager) for some uses at work.
Great! I’m still reading about Disnix to get a better picture of what a full setup would look like, in the context of replacing a Chef installation.
Do you use Disnix and/or NixOps?
Can you use local versions of Nix packages? Let’s say, for example, that the released buildGoPackage is on version 1.7 and I want to switch it to 1.8.1 now; can I make that work?
Administering a NixOS server is pretty simple: no imperative actions needed; everything is generated from the configuration.nix. It’s also some peace of mind if I ever have to migrate to another machine: I can copy my Nix expressions over and have an identical setup.
I subscribed to World of Warcraft for a few years when it was first released. I enjoyed the game for what it was up until I hit the level cap (which was 60 at the time), at which point everyone who I played with joined end-game raiding guilds with strict schedules, seniority hierarchies, loot distribution policies and all kinds of things that I had absolutely no interest in.
Around the same time I discovered that it was sometimes possible to gain access to areas of the map that were officially off limits—usually zones that were part of future planned patches or expansions. Generally this involved exploring every inch of a neighboring zone, and jumping randomly into walls or barriers until you lucked out and found a precise coordinate and landed on it in the precise way that allowed you to clip through into the otherwise inaccessible area of the map.
Doing this became an obsession of mine, and consumed pretty much all my time in the game. It was such a rush to find these little secrets and know that I could be the only one—or at least one of just a few—out of millions of players to explore those creepy, abandoned, often only partially completed areas of the game world. Trying to puzzle out what the zones would eventually be used for, piecing together possible ties to the game lore from different landmarks, and then regaling other people in the game with tales of my excursions and what I had seen was more fun than I ever had actually playing the game and grinding my character’s level up.
Eventually I got perma-banned for trying to glitch my way to GM Island (a special zone normally only accessible by Blizzard employees). But ever since then I’ve always been fascinated by glitches and hidden things in open world games. I wasn’t even aware of this mystery in GTAV. I own it for Xbox One. Might be time to spin it back up and check out this whole Chiliad thing…
I worked as a QA tester for a console game company about 17 years ago, and I did hours upon hours of this. I definitely learned that game geometry / collision detection is a never-ending battle for game designers. People will do things you never even consider in order to trigger aberrant behavior in-game. One of my favorite big discoveries was in a third-person shooter: I got it to crash by blowing up almost every single destructible in the map, since it couldn’t allocate enough graphics memory for the texture swaps or something. Who would do that in real gameplay? Hah. It took the developers a while to find even though I could reproduce it, just because it was not at all obvious what triggered it; I was just bored running around hitting walls during testing, so I passed the time by blowing things up.
That technique more or less works. If you can sorta jam yourself into corners, one wall might be able to push you through the other wall (or object). The more you do it, the better you get at spotting possible places where you can clip through the normal geometry.
Yup, they ended up fixing a variation of this. The trick was to zoom into first person and just keep walking into the wall until you went through. I used it to get some items at C'Thun vendors before beating the Twin Emps.
Another fun exploit I used to do was replacing the fireplace model with the Dark Portal model, which was huge compared to the small fireplace that you could place anywhere. Since physics is calculated locally, you could use it to climb to places you normally couldn’t get to.
Correct. I mentioned it because I find it’s pretty integral to how I work these days. I have a private git-http-backend/gitweb that I host my internal projects on, that I reference in my nix expressions.
Being able to access all my devices and computers from anywhere makes it easy to work remotely. I run another network at work for our intranet because it’s brain-dead simple to set up on Windows, macOS, Linux, iOS, Android, etc.
I have to ask, OP: why is your username waffle_ss? Perhaps this is a PC overreaction on my part, but it makes me uncomfortable. (It’s close to Waffen-SS.) If that’s not intentional, then I apologize for bringing it up.
On Moldbug… I don’t find myself much liking the guy, but on the same token, I’m unnerved by the rapidness of the reaction to him. I feel like people are overly focused on Mencius Moldbug and not the question that the organizers have to ask themselves, which is, “Does the presence of Curtis Yarvin make the venue less safe?” The rapid pull-out of conference sponsors seems to be an overreaction.
It’s quite possible that Yarvin has written worse things than what I’ve read, but I haven’t encountered anything that’s made me believe he’s a danger to conference-goers. He also seems to be getting hit with guilt-by-association: many neoreactionaries (NRx) are racist, and he’s an NRx, ergo he’s racist.
We don’t throw out the work of Wagner, Frege, or Ezra Pound even though their politics were disgusting. All of them were much more clearly racist and anti-democratic than Mr. Yarvin, at least by what I’ve seen.
Also, as one who’s been “de-platformed” (not at a conference, but on Quora and Hacker News) as well as effectively fired (Google) for my political views (leftist, anti-racist, anti-sexist, leaning pro-union) I can’t really get up a good feeling when I see it happening to someone else, even if I find his politics to be repulsive.
When I was in middle or high school I played Counter-Strike with my friend; he went by pancake_nazi so I came up with waffle_ss to fit in with the Nazi breakfast theme. Very clever pun, I know. I’ve just been too lazy to think up anything else.
For the record I hold both Nazi and NRx ideologies (well, what I’ve heard of NRx since I don’t have time to read Yarvin’s screeds) in very low regard.
Anyway the name pun is an attempt at levity / poking fun of Nazism, not an endorsement. I do take my waffle-making seriously, though.
Don’t worry. I’m not offended. It’s almost impossible to offend me. I just found it awkward to be replying to someone with that screen name, given the topic.
Neoreaction is, indeed, an ugly philosophy. I find it intellectually interesting, just to scout “the other side”, but there’s only so much I can take before I get depressed. Also, many of his analyses are simplistic or flat-out incorrect. While Yarvin maintains that he isn’t racist, most of the people under the “NRx” tent are. And don’t get me started on so-called “HBD”, which is just nauseating.
On Moldbug… I don’t find myself much liking the guy, but on the same token, I’m unnerved by the rapidness of the reaction to him. I feel like people are overly focused on Mencius Moldbug and not the question that the organizers have to ask themselves, which is, “Does the presence of Curtis Yarvin make the venue less safe?” The rapid pull-out of conference sponsors seems to be an overreaction.
Here’s a different approach, which explains this quite reasonably: LambdaConf made a lot of effort to contact organisations involving PoC, introducing diversity scholarships, etc., to gain some fame. Then, suddenly, out of the blue, they decide to feature a person who is clearly incompatible. These organisations cut their ties and oppose the project they supported. It’s all very unsurprising. You can’t shout “everyone is equal, please spread!” and then invite someone onto the speakers list who wrote hundreds of thousands of words about how he thinks people are fundamentally unequal by disposition and some should be slaves.
I’d be far less aggravated if LambdaConf had just been a run-of-the-mill conference, but it tried to be the diverse conference in FP. Now it shows that they actually meant “libertarian”. Appropriating terms like “inclusive” or “diverse” for that is just a recipe for disaster.
I don’t think the word “platform” gets us anywhere. I prefer “spaces” nowadays. LambdaConf chose to be a temporary, short-lived space where anything goes unless it’s physically violent. What they communicated was something different, though. And that difference is biting them now, making sponsors jump ship and people protest.
Spaces that are larger and have longer time-spans obviously follow different rules. I would disagree with a ban of Yarvin from Hacker News. I’m not sure how I would feel as an employer, especially as my company does take public stances on diversity issues.
Political issues are nasty, and I can see arguments for both sides of the boundary. It is a boundary, though, and there will be conflicts around it. Still, I found the protest against LambdaConf appropriate, and many people made the effort to also read the statements of LambdaConf critically, and it is also okay to approach sponsors. People even opened up a competing conf. Especially if you hold libertarian views, these should be very valid forms of protest.
Excluding people from spaces is something that should rarely be done, but it’s something that will become an issue over time, especially when people are fundamentally incompatible with others (what’s compatible or not is for the organiser to decide). I’ve been moderating bulletin boards and running meetups and conferences for 15 years now, and I usually subscribe to “have the ban-hammer in the corner, but keep it visible” as an approach. Erring on the side of not kicking people out is also important, but if you end up spending hours and hours writing high-level things of meager philosophical value, something is broken and you have probably fucked up. In my book, though, Moldbug would be uninvitable after I contacted the first organisation supporting PoC (who often had actual slaves as ancestors) for support.
I for one note that people suddenly feel like writing thousands of words about all these topics after the fact. It would have been nice if LambdaConf had, for example, spent an equal amount of time writing on how to include disabled people, PoC, and other marginalised groups. But here’s the catch: even if this whole thing hadn’t happened, they wouldn’t have. And that’s a lot of food for thought about how inclusive they really are.
LambdaConf made a lot of effort to contact organisations involving PoC, introducing diversity scholarships etc. to gain some fame.
The devil of it is, another reading is that those same groups were incredibly fickle.
I hate to be cynical, but it seems that a reasonable conclusion for the majority would be to continue business as usual: if you do, nothing changes other than token kvetching online, and if you don’t you may well end up with a huge PR disaster on your hands.
This whole debacle can be interpreted–in the soulless business sense–as a big message that trying to include potentially sensitive groups can backfire tremendously.
I am in the process of trying out Emacs + evil after being a long-time vim user. I started with Spacemacs but quickly found it pretty confusing to set up, and switched to just straight Emacs + evil and haven’t had any issues. I have the same feeling about all the vim starter kits as well: they do too much, and people don’t understand what’s going on. I find immense value in setting up my environment from scratch and learning about the different pieces and how they work together. Yes, Spacemacs adds layers and some other configuration on top, but I found it to be way more heavy-handed and confusing than I needed.
I created the original Starter Kit for Emacs, and I fully agree. Back in the day (before the package manager) it sorta made sense, but these days the effort is much better spent on creating and documenting individual packages that do one thing well. The Emacs Starter Kit is now fully deprecated, and the readme is just a document explaining why it was a bad idea.
I’m also fine with emacs+evil so far. One hangup is that evil-mode interacts badly with some other packages and modes, though. For example, I use mu4e to read mail, and evil-mode breaks its main menu. There are usually workarounds, but if you use a lot of those modes together, an all-in-one setup where someone has already done the configuration to get everything working together might save time and hassle.
Yeah, that’s a good point; to be fair, I haven’t gotten far enough into Spacemacs (or Emacs, for that matter) to experience many weird interactions. I have seen some weird behavior between evil and helm (I think) and some other modes (opening a git interactive rebase seems to completely disable evil mode). I was going to give Emacs + evil a few weeks and re-evaluate. If I end up switching back, I will miss https://github.com/johanvts/emacs-fireplace though :)
The most useful thing is that the layers provide consistent evil bindings. They also deal with a lot of the quirks when integrating evil modes into holy things. Recreating that would be a lot of work.
Yeah, I’ve definitely noticed some of those quirks and don’t have a great way to figure out what they are and how to fix them. That’s definitely where Spacemacs would come in, but honestly it feels like an uphill game of whack-a-mole.
I’m working on a programming game for kids using terra + bgfx + nanovg. It’s still in its early engine-development stages. To help actually finish it, I’ve recently discovered the calendar timeblocking technique, which has been an incredible tool for fighting procrastination. It’s also fun to write native code again, since most of my day job is working at a higher level. It also gave me a reason to try out terra, which is an interesting language that uses Lua to metaprogram native code at compile time.
I’m becoming more and more tired of my GitHub timeline being clogged by “xxx starred yyy”. If I get the chance, I’ll try to fix this with a browser extension this week.
[edit] Stop upvoting you’re stressing me dammit!
[edit2] OK, here’s a PoC (look ma, no jQuery). Replace the settings and copy-paste it into Chrome’s console after loading https://github.com. Took me 10 minutes; I’ll package it into a Chrome extension after work. If you know how FF extensions work and want to contribute, send me a message.
This is interesting, I find myself wanting to filter just the opposite from my feed. I use it as a discovery tool for new projects, anything else I view in the notifications page anyway.
Over the last two years, my repos have been starred 3 times an hour on average. I really don’t care about these star notifications. I’m much more interested in the discovery aspect you describe; I’d like to know what the people I follow are starring, for example!
I’ll ping you as soon as my code is online to ask about your specific use case and see how we can add it.
It’s definitely possible that self-driving is achievable with simple neural networks, but it brings to mind this quote from Andrew Ng:
One thing about speech recognition: most people don’t understand the difference between 95 and 99 percent accurate. Ninety-five percent means you get one-in-20 words wrong. That’s just annoying, it’s painful to go back and correct it on your cell phone.
Ninety-nine percent is game changing. If there’s 99 percent, it becomes reliable. It just works and you use it all the time. So this is not just a four percent incremental improvement, this is the difference between people rarely using it and people using it all the time.
I have a feeling comma.ai is still in the 95 percent phase, or much less. This probably isn’t even close to what Google or Baidu is building, not to mention they have teams of AI experts who have been working with these algorithms for much longer.
It’s super impressive that he’s gotten this far by himself, though. Should be interesting to see what he does next.
I think the convergence is happening and I’m largely unhappy with it so far.
I’m not necessarily unhappy with the idea of all of these things together, but I’m unhappy that each language is doing it themselves and doing it poorly. Maven is a nightmare of pain. PIP is stupid. Gems are hairballs. Cabal is hell. Rebar is just crap. Go’s package manager is a joke.
If one does any polyglot development (microservices make that OK, right?), one has to learn a bunch of tools that are all pretty broken in unique and terrifying ways. On top of that, it’s not possible to depend on the output of another language, for example writing a program that runs another program. Part of the problem is up top: every OS has its own package manager, development is entirely language-driven, individual people rarely write more than one language, and they don’t want to learn different package managers for everywhere they might deploy.
So if we’re going to break down these walls, can we at least do it right and solve it in a language-agnostic way? The development tools I’ve built are really just translations to Makefiles. They take a project as input and give a Makefile that can build it and composes with other Makefiles. Makefiles are not perfect, and I’d be happy to produce something else, but they’re the best I have seen so far. The value is that if I have N languages, and all of them produce Makefiles, I can build the whole thing together and it Just Works.
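A toy illustration of that composition property (the fragment names, targets, and echo output here are invented for the example, not from the actual tools): per-language Makefile fragments get pulled in by one top-level Makefile via include, so a single make invocation builds everything together.

```shell
set -eu

# Pretend these two fragments were generated by per-language tools.
cat > lib.mk <<'EOF'
lib: ; @echo built lib
EOF

cat > app.mk <<'EOF'
app: ; @echo built app
EOF

# The top-level Makefile just composes them.
cat > Makefile <<'EOF'
all: lib app
include lib.mk
include app.mk
EOF

make all
```

Real generated fragments would carry their own dependency edges, but the composition mechanism is the same: include, then wire the targets together at the top level.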
Go doesn’t have a package manager. They punted on it completely since it wasn’t in their problem domain. They have an abstraction over VCS which is convenient but not a package manager. This was probably smart on their part.
Go the community has put out a few package managers but they haven’t really settled on one that wins yet. I think because in part the build everything from tip worked remarkably well for a long time. For most of my projects it still works.
The problem with a language solving the whole package management thing is that almost no project or app is a single language application anymore. Which means that in the context of an application what you really need is a package manager that understands javascript, go, python, … And when you widen the scope to systems it gets even more crazy.
At some point you end up with a package manager that overlays the language package managers and tries to resolve the dependency graphs between the Ruby gem and the npm packages. And neither RubyGems nor npm really knows anything about the other’s dependency graph.
This same principle holds for build systems as well. For an application you need to build both the JavaScript and the {Go,Ruby,Python,…}. Indeed, in some sense, in the Ruby and Python case the “build” and the “packaging” are the same thing, since they aren’t compiled.
This is a case of ambiguous terminology. The package here is the one defined by the package keyword, which is not at all the same thing as Python’s pip, Ruby’s gems, or Node’s npm. It’s not a manager; it’s a convenient fetcher, if you happen to have stored your Go package’s source at the correct location and named your import path correctly. But it is not a package manager in the sense that this article uses the term.
You say tomato, I say tomaaaato. I don’t really see the difference between what this article talks about and go get. go get downloads dependencies and builds them, which is a primary function of most package managers. It happens to be really terrible at the job, but it’s still serving the same function.
It’s really good at downloading and building the dependencies of a package. What it isn’t good at, and in fact doesn’t do, is versioning and resolving dependency conflicts, which I think is the piece that makes a package manager useful for most people. Otherwise all package managers would just be a thin veneer over wget and the compiler.
go get is a thin veneer over wget and the compiler, and I believe it was specifically made to be no more than that; i.e., the Go developers didn't want to solve the package manager problem with go get. They just wanted to make it easy to fetch and compile head for a Go package in the language-spec sense of the term.
I already used it to replace most of the package managers you’ve listed on my system. I’m also looking into using it to replace npm via https://github.com/adnelson/nixfromnpm (generating flat dependency trees instead of the insane thing npm does)
Not to mention the Haskell infrastructure is kickass, and way beyond what cabal and stack can do.
For people who use nix, this is already a solved problem! Just need more manpower to bring all of the package sets up to the quality of the Haskell ones.
Yep, Nix is great. Although it is not a build system, it at least makes it easy to call out to anything that builds. I really need to work on getting Nix to work on FreeBSD again :(
working on my iOS/web nostr clients:
https://damus.io
https://damus.io/web/?pk=jb55@jb55.com
This protocol is an alternative to activitypub/mastodon/fediverse as it gives the user control over their data/social graph instead of it being tied to a particular instance. plus it’s super simple.
I have a few small deployments on NixOS servers: I git push and it does a nixos-rebuild to the new system state. If there is something wrong with the deploy I do a nixos-rebuild switch --rollback. The entire system state is immutable, so mutability isn't a concern. Each deployed app can have a different pinned version of nixpkgs, so apps on the same machine don't interfere with each other.

For personal projects or greenfield projects, I would definitely push for NixOS, which I'm using myself for my desktop. Nonetheless, it is harder to introduce in a more enterprise and regulated environment, because the knowledge needed to operate it is so different, without even mentioning that many big-cos want to buy commercial support for their OS.
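The push-to-deploy part can be sketched as a server-side git hook; the paths, branch name, and flags here are illustrative guesses, not the actual setup:

```shell
#!/bin/sh
# Hypothetical post-receive hook: check out the pushed configuration
# and rebuild the system to the new state.
GIT_WORK_TREE=/etc/nixos git checkout -f master
nixos-rebuild switch

# If the new generation misbehaves, the previous one is still on disk:
#   nixos-rebuild switch --rollback
```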
How do you approach nixpkgs pinning? I’ve thought about keeping a whole channel in our repo, but curious to hear your approach.
It’s pretty simple, the derivation takes a nixpkgs argument that defaults to a known working version. example: https://github.com/jb55/gdk_rpc/blob/nix-build/default.nix#L1 it can always be called with any other nixpkgs as well if you don’t want to use the pinned version.
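A rough sketch of that pattern (the URL, hash, and names below are placeholders, not the contents of the linked default.nix):

```nix
# default.nix sketch: nixpkgs defaults to a pinned, known-working revision,
# but any other nixpkgs can be passed in instead.
{ nixpkgs ? import (builtins.fetchTarball {
    url = "https://github.com/NixOS/nixpkgs/archive/<known-good-rev>.tar.gz";
    # sha256 = "<hash>";  # pinning the hash keeps the fetch reproducible
  }) {}
}:

nixpkgs.stdenv.mkDerivation {
  name = "example";
  # ...
}
```

Calling it with a different package set then looks like `nix-build --arg nixpkgs 'import <nixpkgs> {}'`.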
Neat, so you just pin your build to the hash in git. Sounds reproducible to me!
I'm running .NET Core on some NixOS servers for one of my clients. It works surprisingly well! It's nice that I don't need to manage a set of Windows servers anymore and can just focus on my Linux ops/infrastructure.
I use black at home in the dark and white when I’m on my laptop outside in brighter environments.
I have a script that switches between the two: https://www.youtube.com/watch?v=UEM_v8oVvS0
These days I just use dns-based blocklists. Just run dnsmasq with an adblock blocklist locally, or on your home network with a raspberry pi.
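A sketch of how a hosts-format blocklist can be fed to dnsmasq (the file names are invented; `address=/<domain>/0.0.0.0` is dnsmasq's syntax for sinking a whole domain):

```shell
# Convert "0.0.0.0 domain" / "127.0.0.1 domain" hosts entries into dnsmasq
# address= rules, skipping comments and the localhost entry.
hosts_to_dnsmasq() {
  awk '$1 ~ /^(0\.0\.0\.0|127\.0\.0\.1)$/ && $2 != "localhost" {
    printf "address=/%s/0.0.0.0\n", $2
  }' "$1"
}

# e.g.: hosts_to_dnsmasq adblock.hosts > /etc/dnsmasq.d/adblock.conf
```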
You say that, and it’s fine for some sites, but a lot of them have anti-adblock scripts baked in alongside the site logic. The only way you’re going to work around that is with redirect rules, like what uBlock Origin does. It also isn’t possible to do annoyance removal, like getting rid of fixed banners, using DNS.
For the sites that it doesn’t work for, I close the tab and move on. It wasn’t worth my time anyway.
To me, attempting to get blanket web-wide annoyance removal feels like freeloading. That’s not why I block ads. It’s my prerogative to avoid privacy invasion, malware vectors, and resource waste; if the site owner goes to lengths to make it hard to get the content without those, that’s their prerogative, and I just walk away. I’m not going to try to grab something they don’t want me to have. (The upshot is that I don’t necessarily even use an ad-blocker, I simply read most of the web with cookies and Javascript disabled. If a page doesn’t work that way, too bad, I just move on.)
I figure that living in an information desert of my own making is not a very effective form of collective action. There simply aren’t enough ascetics to make it worth an author’s time testing their site with JavaScript turned off. And if it isn’t tested, then it doesn’t work. If even Lobsters, a small-scale social site that you totally could’ve boycotted, can get you to enable JavaScript, then it’s a lost cause. Forget about getting sites with actual captive audiences to do it.
People need to encourage web authors to stop relying on ad networks for their income, and they need to do it without becoming “very intelligent”. An ad blocker that actually works, like uBlock Origin, is the only way I know of to do that; it allows a small number of people (the filter list authors) to sabotage the ad networks at scale, in a targeted way.
Thank you for bringing up Mr. Gotcha on your own initiative, because that sure feels like what you’re doing to me here. “You advocate for browsing with Javascript off. Yet you still turn it on in some places yourself.”
That’s also my objection to the line of argument advanced in the other article you linked: “JavaScript is here. It is not going away. It does good, useful things. It can also do bad, useless, even frustrating things. But that’s the nature of any tool.” I’m sorry, but the good-and-useful Javascript I download daily is measured in kilobytes; the amount of ad-tech Javascript I would be downloading if I didn’t opt out would be measured in at least megabytes. That’s not “just like I can show you a lot of ugly houses”; it inverts the argument to “sure, 99.9% of houses are ugly but pretty ones do exist as well, you know”. Beyond that, it’s a complete misperception of the problem to advocate for “develop[ing] best practices and get[ting] people to learn how to design within the limits”. The problem would not go away if webdevs stopped relying on Javascript, because the problem is not webdevs using Javascript, the problem is ad-tech. (And that, to respond to Mr. Gotcha, is why I enable JS in some places, even if I mostly keep it off.)
In that respect I don’t personally see how “if you insist on shovelling ads at me then I’ll just walk away” is a lesser signal of objection than “then I’ll crowdsource my circumvention to get your content anyway”. But neither seems to me like a signal that’s likely to be received by anyone in practice anyway, and I think you operate under an illusion if you are convinced otherwise. I currently don’t see any particularly effective avenue for collective action in this matter, and I perceive confirmation of that belief in the objectively measurable fact that page weights are inexorably going up despite the age and popularity of the “the web is getting bloated” genre. All webbie/techie people agree that this has to stop, and have been agreeing for years, yet it keeps not happening, and instead keeps getting worse. Maybe because business incentives keep pointing the other way and defectors keep being too few to affect that.
Until and unless that changes, all I can do is find some way of dealing with the situation as it concerns me. And in that respect I find it absurd to have it suggested that I’m placing myself in any sort of “information desert of my own making”. Have you tried doing what I do? You would soon figure out that the web is infinite. Even if I never read another JS-requiring page in my life, there is more of it than I can hope to read in a thousand lifetimes. Nor have I ever missed out on any news that I didn’t get from somewhere else just as well. The JS-enabled web might be a bigger infinity than the non-JS-enabled web (I am not even sure of that, but let’s say it is), but one infinity’s as good as another to this here finite being, thank you.
I, personally, can handle a script blocker and build my own custom blocking list just fine. I can’t recommend something that complex to people who don’t even really know what JavaScript is, but I can recommend uBlock Origin to almost anyone. They can install it and forget about it, and it makes their browser faster and more secure, while still allowing access to their existing content, because websites are not fungible. Ad networks are huge distributors of malware, and I don’t mean that in the “adtech is malware” sense, I mean it in the “this ad pretends to be an operating system dialog and if you do what it says you’ll install a program that steals your credit card and sells it on the black market.” I find it very easy to convince people to install ad blockers after something like that happens, which it inevitably does if they’re tech-illiterate enough to have not already done something like this themselves.
uBlock Origin is one of the top add-ons in Chrome's and Firefox's stores; both sites indicate millions of users. Ad blocker usage is estimated at around 20% in the United States and 30% in Germany, and around that level in other countries, while the percentage of people who browse without JavaScript is around 1%. I can show you sites with anti-adblock JavaScript that doesn't run when JavaScript is turned off entirely, and so can be defeated by using NoScript, indicating that they're more concerned about ad blockers than script blockers. Websites that switched to paywalls cite lack of profitability from ads, caused by a combination of ad blockers and plain old banner blindness.
Don't be fatalistic. The current crop of ad networks is not a sustainable business model. It's a bubble. It will burst, and the ad blockers are really just symptomatic of the fact that no one with any sense trusts the content of a banner ad anyway.
Oh, absolutely. For tech-illiterate relatives for whom I'm effectively their IT support, I don't tell them to do what I do. Some of them were completely unable to use a computer before tablets with a touchscreen UI came out – and still barely can, like having a hard time even telling text inputs and buttons apart. Expecting them to do what I do would be a complete impossibility.
I run a more complex setup with minimal use of ad blocking myself, because I can, and therefore feel obligated by my knowledge. And to be clear, for the same reason, I would prefer if it were possible for the tech-illiterate people in my life to do what I do – but I know it simply isn’t. So I don’t feel the same way about those people using crowdsourced annoyance removal as I’d feel about using it myself: I’m capable of using the web while protecting myself even without it; they aren’t.
I’m well aware. It’s just proven to be a frustratingly robust one, quelling several smaller external shifts in circumstances that could have served as seeds of its destruction – partly why I’m pessimistic about any attempt at accelerating its demise from the outside. Of course it won’t go on forever, simply because it is a bubble. But it’s looking like it’ll have to play itself out all the way. I hope that’s soon, not least because the longer it goes, the uglier the finale will be.
And of course I would love for reality to prove me overly pessimistic on any of this.
I use /etc/hosts as a block list, but it's a constant arms race with new domains popping up. I use block lists like http://someonewhocares.org/hosts/hosts and https://www.remembertheusers.com/files/hosts-fb but I don't want to blindly trust such third parties to redirect arbitrary domains in arbitrary ways.

Since I use NixOS, I've added a little script to my configuration.nix file which, when I build/upgrade the system, downloads the latest version of these lists, pulls the source domain out of each entry, and writes an /etc/hosts that sends them all to 127.0.0.1. That way I don't have to manually keep track of domains, but I also don't have to worry about phishing, since the worst that can happen is that legitimate URLs (e.g. a bank's) get redirected to 127.0.0.1 and error out.

For anyone interested in implementing this without pi-hole, I have a couple scripts on github which might help. I adapted them from the pi-hole project awhile back when I wanted to do something a bit less fully-featured. They can combine multiple upstream lists, and generate configurations for /etc/hosts, dnsmasq, or zone files.

client: notmuch, emacs, muchsync, msmtp (for sending email via gmail or my personal server)
server: https://gitlab.com/simple-nixos-mailserver/nixos-mailserver
Working on viscal, a vim-like timeblocking calendar:
I wasn't happy with the current state of calendars, which require a mouse. I also love timeblocking, so I wanted to make a calendar that felt like a text editor and was easy to timeblock with.
https://github.com/maandree/yes-silly is much faster
He very, very briefly mentions Jai for future game development, if the compiler is released. Looking over the doc he links to about Jai, I'm surprised that Myridian hasn't caught on with game dev (yet).
I can’t see the link, but Jai is the creation of a vocal self-promoter; I would treat anything advertising it with a fair chunk of skepticism.
You make it sound like he’s some fitness guru or something. His self promotion is dozens of hours of video of him explaining decisions behind the languages, and demoing the compiler and a game written in the language.
He’s also paying people (at least 1 ATM) to work on the compiler, with his money from his successful game dev career, while a lot of interesting languages are hobby projects.
I’m automatically skeptical of anyone who prefers to explain things through video rather than in written form - that sets off my guru-dar. IIRC he was a vocal figure who seemed to be deliberately courting controversy when he was pushing the game that made his name?
He mentions in his videos that he's doing it for fun and as an experiment. I enjoy watching them; I've learned a lot about low-level optimization and new ideas in compiler design for gamedev. It's a different kind of learning experience than just reading a blog post or a book. It feels like pair programming with an expert in their field. Watching how they decide to tackle, stumble through, and solve a problem is something you can't get from other mediums.
to be fair, Rust and Go are not without a cadre of aggressive vocal promoters.
Anyone here who has some experience using Nix (either the full OS or only the package manager on a more conventional Linux) in a server environment?
Yes. I run a few dozen servers on NixOS for a contract, and a few personal ones. I know of at least one group that launches tens of thousands of servers a month with NixOps and NixOS. I've begun introducing Nix (the package manager) for some use at work.
I’d be happy to answer your questions!
Great! I’m still reading about Disnix to get a better picture what a full setup would look like, with the context of replacing a Chef installation.
Do you use Disnix and/or NixOps?
Can you use local versions of nix packages? Let's say, for example, that the released buildGoPackage is on version 1.7 and I want to switch it to 1.8.1 now; can I make that work?
I use NixOps and not Disnix. I think Disnix is less widely used compared to NixOps.
You bet. You can mix and match Stable and Unstable, too.
You could; however, you would need to make the patch for that to work. I'm sure #NixOS on Freenode would be thrilled to help out.
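The mix-and-match idea looks roughly like this in a configuration.nix (assuming a nixos-unstable channel has been added; the package names are just examples):

```nix
{ config, pkgs, ... }:
let
  # Second package set pulled from the unstable channel.
  unstable = import <nixos-unstable> { };
in {
  environment.systemPackages = [
    pkgs.git     # from the system's stable nixpkgs
    unstable.go  # newer version from unstable
  ];
}
```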
I run my personal server (jb55.com) with nixos on linode.
It hosts:
This is what the config looks like: https://github.com/jb55/nix-files/blob/charon/machines/charon/default.nix
which is layered on the default config for all my machines: https://github.com/jb55/nix-files/blob/charon/configuration.nix
Administering a nixos server is pretty simple: no imperative actions needed, everything is generated from the configuration.nix. It's also some peace of mind if I ever have to migrate to another machine: I can copy my nix expressions over and have an identical setup.
Thanks, that is actually helpful.
I subscribed to World of Warcraft for a few years when it was first released. I enjoyed the game for what it was up until I hit the level cap (which was 60 at the time), at which point everyone who I played with joined end-game raiding guilds with strict schedules, seniority hierarchies, loot distribution policies and all kinds of things that I had absolutely no interest in.
Around the same time I discovered that it was sometimes possible to gain access to areas of the map that were officially off limits—usually zones that were part of future planned patches or expansions. Generally this involved exploring every inch of a neighboring zone, and jumping randomly into walls or barriers until you lucked out and found a precise coordinate and landed on it in the precise way that allowed you to clip through into the otherwise inaccessible area of the map.
Doing this became an obsession of mine, and consumed pretty much all my time in the game. It was such a rush to find these little secrets and know that I could be the only one—or at least one of just a few—out of millions of players to explore those creepy, abandoned, often only partially completed areas of the game world. Trying to puzzle out what the zones would eventually be used for, piecing together possible ties to the game lore from different landmarks, and then regaling other people in the game with tales of my excursions and what I had seen was more fun than I ever had actually playing the game and grinding my character’s level up.
Eventually I got perma-banned for trying to glitch my way to GM Island (a special zone normally only accessible by Blizzard employees). But ever since then I’ve always been fascinated by glitches and hidden things in open world games. I wasn’t even aware of this mystery in GTAV. I own it for Xbox One. Might be time to spin it back up and check out this whole Chiliad thing…
I worked as a QA tester for a console game company about 17 years ago, and I did hours upon hours of this. I definitely learned how game geometry / collision detection is a never-ending battle for game designers. People will do things you never even consider in order to trigger aberrant behavior in-game. One of my favorite big discoveries was in a 3rd-person shooter: I got it to crash by blowing up almost every single destructible in the map, since it couldn't allocate enough graphics memory for the texture swaps or something. Who would do that in real gameplay? Hah. It took the developers a while to find it even though I could reproduce it, just because it was not at all obvious what triggered it; I was just bored running around hitting walls during testing, so I passed the time by blowing things up.
How did you find these glitches? Do you spend hours holding down the “jump” key and moving inch by inch along every wall until something happens?
More or less that technique works. If you can sorta jam yourself into corners one wall might be able to push you through the other wall (or object). The more you do it, the better you get at spotting possible places where you can clip through the normal geometry.
Yup, they ended up fixing a variation of this. The trick was to just zoom into first person and keep walking into the wall until you went through. I used it to get some items at C'Thun vendors before beating Twin Emps.
Another fun exploit I used to do is replace the fireplace model with the Dark Portal model, which was huge compared to the small fireplace that you could place anywhere. Since physics is calculated locally, you could use it to climb to places you normally couldn’t get to.
Realforce 104UB with ergonomic key weighting. Topre key switches are a joy to type on and don’t annoy my coworkers.
Just got my Realforce 87U with ergonomic weighting a month ago. Still getting used to it.
Before I was using a DEC LK411 with PS/2 connector. Proper curved backplane, proper spherical keycaps, nice enough layout, cheapo “switches”.
I found it a bit awkward until I got these wrist rests: https://www.massdrop.com/buy/16002
Can’t see it… But yes, wrist rests are needed for a keyboard like this. I was thinking about getting some.
Edit: looks like Massdrop links are broken? They are called the NPKC wooden wrist rests
This link works for me.
I was thinking of getting a Filco wooden wrist rest which has a slightly different profile.
Spacemacs + Haskell + nixos + xmonad + zerotier
zerotier as in ZeroTierOne SDN?
Correct. I mentioned it because I find it’s pretty integral to how I work these days. I have a private git-http-backend/gitweb that I host my internal projects on, that I reference in my nix expressions.
Being able to access all my devices and computers from anywhere makes it easy to work remotely. I run another network at work for our intranet because it’s brain dead simple to set up on Windows, osx, linux, ios, android, etc.
I have to ask, OP: why is your username waffle_ss? Perhaps this is a PC overreaction on my part, but it makes me uncomfortable. (It’s close to Waffen-SS.) If that’s not intentional, then I apologize for bringing it up.
On Moldbug… I don't find myself much liking the guy, but by the same token, I'm unnerved by the rapidness of the reaction to him. I feel like people are overly focused on Mencius Moldbug and not the question that the organizers have to ask themselves, which is, "Does the presence of Curtis Yarvin make the venue less safe?" The rapid pull-out of conference sponsors seems to be an overreaction.
It’s quite possible that Yarvin has written worse things than what I’ve read, but I haven’t encountered anything that’s made me believe he’s a danger to conference-goers. He also seems to be getting hit with guilt-by-association: many neoreactionaries (NRx) are racist, and he’s an NRx, ergo he’s racist.
We don’t throw out the work of Wagner, Frege, or Ezra Pound even though their politics were disgusting. All of them were much more clearly racist and anti-democratic than Mr. Yarvin, at least by what I’ve seen.
Also, as one who’s been “de-platformed” (not at a conference, but on Quora and Hacker News) as well as effectively fired (Google) for my political views (leftist, anti-racist, anti-sexist, leaning pro-union) I can’t really get up a good feeling when I see it happening to someone else, even if I find his politics to be repulsive.
When I was in middle or high school I played Counter-Strike with my friend; he went by pancake_nazi, so I came up with waffle_ss to fit in with the Nazi breakfast theme. Very clever pun, I know. I've just been too lazy to think up anything else.

For the record, I hold both Nazi and NRx ideologies (well, what I've heard of NRx, since I don't have time to read Yarvin's screeds) in very low regard.
Anyway the name pun is an attempt at levity / poking fun of Nazism, not an endorsement. I do take my waffle-making seriously, though.
Garden path sentence much?
I learn from the best
https://www.youtube.com/watch?v=oPpzJAzdpTU
Don’t worry. I’m not offended. It’s almost impossible to offend me. I just found it awkward to be replying to someone with that screen name, given the topic.
Neoreaction is, indeed, an ugly philosophy. I find it intellectually interesting, just to scout “the other side”, but there’s only so much I can take before I get depressed. Also, many of his analyses are simplistic or flat-out incorrect. While Yarvin maintains that he isn’t racist, most of the people under the “NRx” tent are. And don’t get me started on so-called “HBD”, which is just nauseating.
Here's a different approach, which explains this quite reasonably: LambdaConf made a lot of effort to contact organisations involving PoC, introducing diversity scholarships etc. to gain some fame. Then, suddenly, out of the blue, they decide to host a person who is clearly incompatible with all that. These organisations cut their ties and oppose the project they supported. It's all very unsurprising. You can't shout "everyone is equal, please spread the word!" and then put someone on the speakers list who wrote hundreds of thousands of words about how he thinks people are fundamentally unequal by disposition and some should be slaves.
I’d be far less aggravated if LambdaConf had just been a run-of-the-mill conference, but it tried to be the diverse conference in FP. Now it shows that they actually meant “libertarian”. Appropriating terms like “inclusive” or “diverse” for that is just a recipe for disaster.
I don't think the word "platform" gets us anywhere. I prefer "spaces" nowadays. LambdaConf chose to be a temporary, short-lived space where anything goes unless it's physically violent. What they communicated was something different though. And that difference is biting them now, making sponsors jump off and people protest.
Spaces that are larger and have longer time-spans obviously follow different rules. I would disagree with a ban of Yarvin from Hacker News. I'm not sure how I would feel as an employer, especially as my company does take public stances on diversity issues.
Political issues are nasty and I can see arguments for both sides of the boundary. It is a boundary, though, and there will be conflicts around it. Still, I found the protest against LambdaConf appropriate; many people made the effort to also read the statements of LambdaConf critically, and it is also okay to approach sponsors. People even opened up a competing conf. Especially if you hold libertarian views, these should be very valid forms of protest.
Excluding people from spaces is something that should rarely be done, but something that will become an issue over time. Especially when people are fundamentally incompatible with others (what's compatible or not is for the organiser to decide). I've been moderating bulletin boards and running meetups and conferences for 15 years now, and usually subscribe to "have the ban-hammer in the corner, but keep it visible" as an approach. Erring on the side of not kicking people out is also important, but if you end up spending hours and hours writing high-level things of meager philosophical value, something is broken and you have probably fucked up. In my book, though, Moldbug would be uninvitable after I contacted the first organisation supporting PoC (who often had actual slaves as ancestors) for support.
I for one note that people suddenly feel like writing thousands of words about all these topics after the fact. It would have been nice if LambdaConf had, for example, spent an equal amount of time writing on how to include disabled people, PoC and other marginalised groups. But here's the catch: even if this whole thing hadn't happened, they wouldn't have. And that's a lot of food for thought about how inclusive they really are.
Man, all I want to do is see his talk on Urbit. I'm just glad I don't have to go to an NRx rally to do so.
The devil of it is, another reading is that those same groups were incredibly fickle.
I hate to be cynical, but it seems that a reasonable conclusion for the majority would be to continue business as usual: if you do, nothing changes other than token kvetching online, and if you don’t you may well end up with a huge PR disaster on your hands.
This whole debacle can be interpreted (in the soulless business sense) as a big message that trying to include potentially sensitive groups can backfire tremendously.
I am in the process of trying out emacs + evil after being a long-time vim user. I started with spacemacs but quickly found it pretty confusing to set up, and switched to just straight emacs + evil and haven't had any issues. I have the same feeling about all the vim starter kits as well: they do too much and people don't understand what's going on. I find immense value in setting up my environment from scratch and learning about the different pieces and how they work together. Yes, spacemacs adds layers and some other configuration on top, but I found it to be way more heavy-handed and confusing than I needed.
I created the original Starter Kit for Emacs, and I fully agree. Back in the day (before the package manager) it sorta made sense, but these days the effort is much better spent on creating and documenting individual packages that do one thing well. The Emacs Starter Kit is now fully deprecated, and the readme is just a document explaining why it was a bad idea.
I’m also fine with emacs+evil so far. One hangup is that evil-mode interacts badly with some other packages and modes, though. For example, I use mu4e to read mail, and evil-mode breaks its main menu. There are usually workarounds, but if you use a lot of those modes together, an all-in-one setup where someone has already done the configuration to get everything working together might save time and hassle.
Yeah, that's a good point; to be fair I haven't gotten far enough into spacemacs, or emacs for that matter, to experience many weird interactions. I have seen some weird behavior between evil and helm (I think) and some other modes (opening a git interactive rebase seems to completely disable evil mode). I was going to give emacs+evil a few weeks and re-evaluate. If I end up switching back I will miss https://github.com/johanvts/emacs-fireplace though :)
The most useful thing is that the layers provide consistent evil bindings. They also deal with a lot of the quirks when integrating evil modes into holy things. Recreating that would be a lot of work.
Yeah I’ve definitely noticed some of those quirks and don’t have a great way to figure out what they are and how to fix them. That’s definitely where spacemacs would come in but honestly it feels like an uphill battle of whack-a-mole.
I had the same experience.
I'm working on a programming game for kids using terra + bgfx + nanovg. It's still in its early engine-development stages. To actually finish it I've recently discovered the calendar timeblocking technique, which has been an incredible tool to fight procrastination. It's also fun to write native code again, since most of my day job is at a higher level. It also gave me a reason to try out terra, an interesting language that uses lua to metaprogram native code at compile time.
I'm becoming more and more tired of my github timeline being clogged by "xxx starred yyy". If I get the chance, I'll try to fix this with a browser extension this week.

[edit] Stop upvoting, you're stressing me, dammit!

[edit2] Ok, here's a poc (look ma, no jquery). Replace the settings and copy-paste it in chrome's console after loading https://github.com. Took me 10min; I'll package it into a chrome extension after work. If you know how FF extensions work and want to contribute send me a message.
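The core of such a script boils down to a predicate over each feed entry; this is an illustrative sketch, not the actual poc, and the DOM selector in the comment below is a guess about GitHub's markup:

```javascript
// Decide whether a dashboard feed entry is noise ("xxx starred yyy").
function isStarNoise(itemText) {
  return / starred /.test(itemText);
}

// In the browser console, this would be applied to each feed entry, e.g.:
// document.querySelectorAll('#dashboard .news > div').forEach(el => {
//   if (isStarNoise(el.textContent)) el.style.display = 'none';
// });
```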
This is interesting, I find myself wanting to filter just the opposite from my feed. I use it as a discovery tool for new projects, anything else I view in the notifications page anyway.
Over the last two years my repos have been starred 3 times an hour on average. I really don't care about these star notifications. I'm much more interested in the discovery aspect you describe; I'd like to know what people I follow are starring, for example!
I’ll ping you as soon as my code is online to ask about your specific use case and see how we can add it.
Ah I see what you mean now. That’s all I would like to see: people starring other projects and nothing else.
Here it is, in all its beta glory: https://github.com/vhf/gh-feed-filter Relevant extension code is in here
It’s far from perfect but at least it’s an effective noise-canceling filter. Should you try it, don’t hesitate to open issues or PRs.
It’s up in the webstore: https://chrome.google.com/webstore/detail/github-feed-blacklist/dbhboodpldcdeolligbmnhnjpkkolcnl?hl=en
It looks like he was teaching himself MNIST autoencoders 5 months ago: https://github.com/geohot/nnweights
It's definitely possible that self-driving is achievable with simple neural networks, but it calls to mind the quote from Andrew Ng:
I have a feeling comma.ai is still in the 95 percent phase or much less. This probably isn't even close to what Google or Baidu is building, not to mention they have teams of AI experts who have been working with these algorithms for much longer.
It’s super impressive that he’s gotten this far by himself, though. Should be interesting to see what he does next.
Looks like Tesla just released a similar statement: https://www.teslamotors.com/support/correction-article-first-person-hack-iphone-built-self-driving-car
I think the convergence is happening and I’m largely unhappy with it so far.
I’m not necessarily unhappy with the idea of all of these things together, but I’m unhappy that each language is doing it themselves and doing it poorly. Maven is a nightmare of pain. PIP is stupid. Gems are hairballs. Cabal is hell. Rebar is just crap. Go’s package manager is a joke.
If one does any polyglot development (microservices make that ok, right?), one has to learn a bunch of tools that are all pretty broken in unique and terrifying ways. On top of that, it’s not possible to depend on the output of another language, for example writing a program that runs another program. Part of the problem is up top: every OS has its own package manager, development is entirely language-driven, individual developers rarely write more than one language, and they don’t want to learn different package managers for everywhere they might deploy.
So if we’re going to break down these walls, can we at least do it right and solve it in a language-agnostic way? The development tools I’ve built are really just translations to Makefiles. They take a project as input and produce a Makefile that can build it and that composes with other Makefiles. Makefiles are not perfect, and I’d be happy to produce something else, but they are the best I have seen so far. The value is that if I have N languages, and all of them produce Makefiles, I can build the whole thing together and it Just Works.
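A minimal sketch of that composition, assuming each per-language tool emits a fragment (the fragment names and targets here are illustrative, not the actual tools’ output):

```make
# Hypothetical top-level Makefile: each language's tool generated one
# fragment with its own namespaced build target.
include frontend.mk   # emitted by the JS project's tool
include server.mk     # emitted by the C project's tool

# The aggregate target just depends on each fragment's entry point,
# so make can build the polyglot project in one dependency graph.
all: frontend-all server-all
.PHONY: all
```

The key property is that make resolves one combined dependency graph, so a C target can depend on a JS artifact (or vice versa) without either tool knowing about the other.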
Go doesn’t have a package manager. They punted on it completely since it wasn’t in their problem domain. They have an abstraction over VCS which is convenient but not a package manager. This was probably smart on their part.
The Go community has put out a few package managers, but they haven’t really settled on one that wins yet. I think that’s in part because building everything from tip worked remarkably well for a long time. For most of my projects it still works.
The problem with a language solving the whole package management thing is that almost no project or app is a single-language application anymore. Which means that in the context of an application, what you really need is a package manager that understands javascript, go, python, … And when you widen the scope to systems it gets even crazier.
At some point you end up with a package manager that overlays the language package managers and tries to resolve the dependency graphs between the ruby gem and the npm packages. And neither rubygems nor npm really knows anything about the other’s dependency graph.
This same principle holds for build systems as well. For an application you need to build both javascript and {Go,Ruby,Python,…}. Indeed, in some sense, in the Ruby and Python case the “build” and the “packaging” are the same thing, since they aren’t compiled.
https://golang.org/cmd/go/#hdr-Download_and_install_packages_and_dependencies
This is a case of ambiguous terminology. The package here is the one defined by the `package` keyword, which is not at all the same thing as Python’s pip, Ruby’s gems, or Node’s npm. It’s not a manager; it’s a convenient fetcher, if you happen to have stored your go package’s source at the correct location and named your import path correctly. But it is not a package manager in the sense that this article uses the term.
You say tomato, I say tomaaaato. I don’t really see the difference between what this article talks about and go get.
go get downloads dependencies and builds them. This is a primary function of most package managers. It happens to be really terrible at the job, but it’s still serving the same function.
It’s really good at downloading and building the dependencies of a package. What it isn’t good at, and in fact doesn’t do, is versioning and resolving dependency conflicts, which I think is the piece that makes a package manager useful for most people. Otherwise all package managers would just be a thin veneer over wget and the compiler.
go get is a thin veneer over wget and the compiler, and I believe it was specifically made to be no more than that; i.e., the Go developers didn’t want to solve the package manager problem with go get. They just wanted to make it easy to fetch and compile head for a Go package, in the language-spec sense of the term.
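To make the “thin veneer” point concrete, this is roughly the whole interface of pre-module go get (the repository path is a placeholder):

```shell
# "go get" is roughly VCS-fetch + compile: it clones the import path
# into $GOPATH/src and builds it -- no versions, no lockfile, no
# dependency conflict resolution.
go get github.com/user/somepkg      # fetch and build tip of the repo
go get -u github.com/user/somepkg   # also update it and its dependencies

# The only "package manager" part is the import-path convention: the
# source must live at $GOPATH/src/github.com/user/somepkg.
```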
I just don’t think you’re making a very meaningful or useful distinction. But to each their own.
See nix: http://nixos.org/nix/
I’ve already used it to replace most of the package managers you’ve listed on my system. I’m also looking into using it to replace npm via https://github.com/adnelson/nixfromnpm (generating flat dependency trees instead of the insane thing npm does).
Not to mention the Haskell infrastructure is kickass, and way beyond what cabal and stack can do.
For people who use nix, this is already a solved problem! Just need more manpower to bring all of the package sets up to the quality of the Haskell ones.
Yep, Nix is great. While it is not a build system, it at least makes it easy to call out to anything that builds. I really need to work on getting Nix to work on FreeBSD again :(
Bazel is a polyglot build system.
I love Bazel. I recently contributed Mono C# support to it since I’ve been doing .Net stuff lately and the msbuild tooling sucks.
I’m happier with the state of all those ecosystems than I am with C’s. I think I’m largely happy with it.
Nix could be nice if I could use it without changing OSes, and if I could trust that I wasn’t going to run into something they haven’t covered.