Wow, I had no idea that CloudFlare makes it that much more annoying for non-EU/US/CN internet users to use sites behind their services. I should have taken the time to be more aware of this, as much of my extended family lives in SE Asia (mostly the Philippines).
I’m now seriously reconsidering using it for any future web apps. I mean, yes, there are abusers in areas that don’t have CF data centers, but punishing an entire segment of internet users seems all sorts of terrible to me (plus, it’s not as if potential spammers/DDoSers don’t operate from within the EU/CN/US regions).
This has some serious implications for knowledge access across the globe.
This was actually one of my first introductions to org-mode. Excellent resource!
Just wanted to add that the leuven theme is great for org-mode.
Yeah, I’m still not sure whether this is a joke or what. “hipster-free” made me lean towards it being a joke, but who knows. Individually, these are some solid tech choices. But together? Web programming in C? That’s so early 2000s, which is to say, didn’t we move on from that for a reason?
I wonder if it could be http://c2.com/cgi/wiki?HaHaOnlySerious
The sentiment of this web page is delightful but doing web programming in C or any other language where you do your own memory management is asking for so much trouble. You’re trading the incidental complexity of [insert hipster web language + framework here] for the incidental complexity of valgrind, asan, tsan, etc.
I wish there was an efficient memory-safe language that had the history and cachet of C among communities with the OpenBSD ethos - like Pascal or Oberon, had they survived the 90s in any meaningful sense. At least one of the useful languages from the current PL renaissance in industry will undoubtedly be in this position in 20 or 30 years - C won’t keep eating its children forever :)
I imagine it could be rust, but there’s a degree of fetishism which I think is detrimental. People learned C without first having to learn to love C. There’s a trend now, however, that you can’t just learn a tool, you must love it. The hype turns off a lot of potential practitioners.
There are some other non-C languages I enjoy. But I arrived here without people telling me how much better life would be if only I made the switch.
The Rust community is definitely strongly in favor of its language of choice, and while I think the core of the community (the Rust teams, developers of major Rust libraries, major language contributors) are appropriately careful about the rhetoric they use and claims they make, others in the community are often not so careful. Part of this comes from misunderstanding of the guarantees Rust provides (“no data races” becomes “no race conditions,” for example, which is a stronger claim), and part of this comes from people being imprecise in the words they use, even if they do understand the actual guarantee being made.
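To make that distinction concrete, here’s a minimal sketch (a toy example of my own, not anything from the Rust docs): there’s no data race, because every shared value sits behind a Mutex, but the two threads take the locks in opposite orders and can deadlock. Safe Rust compiles it without complaint.

```rust
use std::sync::{Arc, Mutex};
use std::thread;

fn main() {
    let a = Arc::new(Mutex::new(0u32));
    let b = Arc::new(Mutex::new(0u32));

    let (a1, b1) = (Arc::clone(&a), Arc::clone(&b));
    let t1 = thread::spawn(move || {
        let _x = a1.lock().unwrap(); // takes a, then b
        let _y = b1.lock().unwrap();
    });

    let (a2, b2) = (Arc::clone(&a), Arc::clone(&b));
    let t2 = thread::spawn(move || {
        let _y = b2.lock().unwrap(); // takes b, then a: can deadlock against t1
        let _x = a2.lock().unwrap();
    });

    t1.join().unwrap();
    t2.join().unwrap();
}
```

The guarantee is about data races, not about every possible ordering bug, which is exactly the kind of nuance that gets lost when the claim is repeated loosely.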
I am looking forward to the results of the RustBelt project, which is looking to formalize Rust and provide a stronger and clearer description of what sort of guarantees Rust provides. There are also ongoing efforts by the Rust teams to provide a stronger description of what behaviors are considered safe vs. unsafe, to generally tighten up their own understanding of the language, and to improve the explanations the official Rust documents provide. My hope is that this formalization will provide something clearer and stronger to reference when people discuss Rust and what it can offer them.
EDIT: I should also say that I think a lot of people in the Rust community come by the excitement honestly. Personally I can be very effusive about Rust, and have to regularly remind myself to tone it down when pitching Rust to people. It is a cool language that does a lot of things I appreciate, and that feels “right” to me in a way that engenders a strong desire to encourage its use elsewhere, if only to give me more opportunities to write in it.
I mostly agree with tedu on this. I like rust but there is a huge “fetishism” right now in the hype cycle. Reminds me a lot of what golang went through up until people realized where go didn’t fit in.
Different languages, sure, but yes, everyone is excited, and all the “you not writing in rust is bad and you should feel bad” (I’m paraphrasing) articles start to make me want to throw out regular old C more. It’s not great, but after 40+ years, with some minimal tools you can fairly easily mitigate most problems.
An example of something Rust doesn’t provide but that gets brushed over far too often: guarantees that you don’t have a memory leak. Memory leaks are by definition “memory safe”, but that isn’t all that interesting when one occurs. It’s great that it won’t break the program, but rust isn’t solving every class of problem around memory allocation. It is really tough to pierce the fanboy attitudes around the enthusiasm. It is great that people are enjoying it, but let’s come down to reality and evaluate based on facts, not “c isn’t memory safe”. On its own that is a correct statement, but just as meaningless as saying that having a memory leak is “memory safe”, IMO.
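For anyone who hasn’t seen it demonstrated, here’s a rough sketch (again, my own toy example, nothing official) of a leak in 100% safe Rust: two Rc nodes that point at each other, so neither refcount ever reaches zero and the memory is never freed.

```rust
use std::cell::RefCell;
use std::rc::Rc;

// A node that can optionally point at another node.
struct Node {
    next: RefCell<Option<Rc<Node>>>,
}

fn main() {
    let a = Rc::new(Node { next: RefCell::new(None) });
    let b = Rc::new(Node { next: RefCell::new(Some(Rc::clone(&a))) });

    // Close the cycle: a -> b -> a. Both refcounts stay above zero,
    // so neither allocation is ever dropped. No unsafe code anywhere.
    *a.next.borrow_mut() = Some(Rc::clone(&b));
}
```

std::mem::forget being a safe function makes the same point: leaking is explicitly outside the definition of memory safety.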
Don’t get me wrong, rust is a good language; it could just use more of the enthusiasm at a 7, not an 11. I’m not going to convert the 3k lines of C I’ve written this year to rust just because rust exists. (It’s a kernel module, so no, rust isn’t a great option even at this stage.)
Rust is my go-to language right now, and I also agree with tedu. My point was to a) clarify that, unfortunately, the broader community is not as careful and measured in its proselytizing for Rust as the core community is, and b) note that this will almost certainly get better over time, particularly as Rust’s guarantees are given a more precise and formal treatment.
Yep, no worries. My bigger gripe is that “memory safe” is quickly turning into a Rust thought-terminating cliché at times.
I think that “fetishism” is typical of early adoption of most tools. I would love to be able to read the discourse around C and Unix between 1970 and 1980, for example, where I imagine C went through the same hype cycle in a relatively tight-knit community, so it left few artifacts.
It’s so rare that a tool is truly unequivocally better than its predecessors that effusive praise tends to help people tamp down the cognitive dissonance of recognizing the areas in which it’s worse. I take your point, though - it would be nice if the discourse around new tools wasn’t this way.
Within living memory, I can recall python going from being an also-ran to mainstream, but with very few people constantly telling me “I can’t believe you’re not using python already.” I do believe there has been a change in attitude that didn’t exist before. Online communities grant status and standing to pretenders. Just look at how easy it is to gain karma by shit talking PHP, even when one has had zero experience with PHP, or any programming language!
“I wish there was an efficient memory-safe language that had the history and cachet of C among communities with the OpenBSD ethos”
ocaml comes close to being that. It’s a language with a strong type system (with type inference and everything), automatic memory management, and a very straightforward compiler. I don’t know of any C compiler that beats ocamlopt in how easy the generated code is to debug.
There is even a book on Unix system programming in OCaml :)
You could slot Oberon (or OCaml, Go, LuaJIT, Rust - CGI scripts don’t discriminate) into this stack right now and it would work just fine - the fun of using C for this is that there’s some unquantifiable value in saying “we use one language for the bulk of work on or in this system moving forward, and it’s a language that’s suitable for any job you can throw at it.”
Would Go be an alternative? e.g. this seems convincing (I’ve only skimmed it, though).
Lua is a very viable alternative - all the power of C, none of the issues. OpenResty and Lapis both run very well on OpenBSD too.
If you haven’t considered Go, I’d encourage you to do so. It checks the boxes. It’s a memory-safe language, not too far away from C, and relatively efficient. And to top it off, it provides builtin tooling to supplant tsan via go build -race.
If that’s not your cup of tea, Rust seems to have all those benefits of memory safety and thread safety.
C++? The best thing about C++ is “You only pay for what you use”. It is as C-like or Java-like as you choose. At my work we use it in a very C-like way, but we get the advantage of free RAII, basic containers like list, vector, and map, and simple-to-use strings. This all means simpler, terser code that is easier to reason about, maintain, and debug.
Yeah, <space> for play/pause seemed like a no-brainer to me (it’s basically a universal keybinding at this point—what else is that key even good for?), but apparently not to the dev. <c> it is…
Home hosting: I will set up my new (5th generation) Intel NUC as a little home hypervisor to host fun stuff like a GitLab server and a Plex server. Very excited! It’s my first M.2 host. I can’t believe how physically tiny those SSDs are.
Web apps: I’ll be starting to write a couple small web apps for the radio station I used to run. The current web technologies in use are Flask and Laravel, with a history of Code Igniter and Rails. I might stick to Flask because it’s fun and reliable, or I might finally learn Scala. Either way, I’ll probably go with Postgres this time (everything has been MySQL/MariaDB until this point).
Books: And most important of all, I need to find a copy of The Restaurant At The End Of The Universe. I finished Hitchhiker’s Guide last weekend and I haven’t stopped thinking about it since.
Hey, that sounds like fun!
Those M.2 SSDs are so tiny. And so fast. I have a 256GB one in my Chromebook, and while this is completely anecdotal, it certainly feels faster.
I used to self-host ownCloud as an alternative to Google Drive. While it was nice for a while, I can only afford to pay for the lowest VPS instance at vultr, which doesn’t give you that much storage. Plus, maintenance was worrying me a bit with respect to security; I simply don’t have as much time or money as I’d like to be able to run an ownCloud server anymore.
I use GitHub and haven’t migrated to something like GitLab because I like the social aspects of GitHub, e.g. being able to star or subscribe to a repo and such. It’s GitHub’s main selling point for me. Plus, self-hosting GitLab just isn’t an option at the moment, also due to time and money. Wait a minute; let me get back to you on this, as I just visited gitlab.com and realized you can sign up for a free account again.
This article strikes me as wholly unconvincing. I couldn’t even make myself read it all, because the first two-thirds or so seemed to be just a lot of assertions with no justification. At best it seems that this may be vacuously true in a pedantic / overly strict sense. “The brain doesn’t store representations of dollar bills,” or what have you. That’s probably true. There’s no reason to think that you have an exact image of a dollar bill in your head at all times. That seems pretty irrelevant to me, as it appears that we must store at least some fuzzy representation of the dollar bill in order to recognize it, or to describe it - from memory - to the amount of detail that we can.
But digital computers don’t necessarily have to work with exact representations either; hence all the recent successes we’ve seen in image recognition using artificial neural networks, etc.
Personally I suspect that the brain is a biological implementation of a sort of Bayesian pattern-matching system which does, indeed, share quite a lot with computers - unless you just define that away by saying “a computer is something that works differently from the way the brain does”.
As if a computer has an exact representation of a dollar bill. An EXACT representation would take an uncountable amount of resources.
This was my main contention with this article. No such “exact” representation of data occurs on a computer, either. It’s not as if people are saying our brains work exactly like computers, anyhow.
@mindcrime, the question of “representation” in the brain is of course very interesting and complicated, but as a starting point, I find the concept of complex feature-selective neurons in the brain very fascinating and illuminating. A quick rundown is here: https://en.wikipedia.org/wiki/Grandmother_cell#Face_selective_cells . This topic was made very popular by the “Jennifer Aniston” cell. The who cell, you ask? That’s the risk you run when trying to popularize your research by tying it to the fickle star of popular culture :)
The Elements of Computing Systems: Building a Modern Computer From First Principles has been a great way to unify my background in computer science from logic, to architecture, to programming languages.
Last week I dropped my two remaining jobs and moved out to Southwest Virginia to work on a hop farm for the summer. We got a platform rigged up to make stringing simpler, and have cleared four of six rows, so far.
This week, we’ll finish the stringing, start training the vines, and hopefully get started installing lights. The farm’s owner is involved with research on recreating Pacific Northwest growing conditions for hops here in the Southeast; the hops we’re growing this year will be undergoing a form of phototherapy to (hopefully) interrupt their night period for a couple of hours and prevent flowering until later in the season.
As far as personal projects go, I’m still learning OCaml, and trying to write prose and finish half a dozen tiny little programs that are almost done - simple OpenGL exercises, PnP RPG utilities, various introductory ‘toys’ - for the sake of fun and learning.
Considering a project (and possibly looking for collaborators):
I use stagit for personal project hosting (and GitHub as social media, but that’s irrelevant to this). I find stagit to be simple and minimal. Does the job. Does one thing and does it well. But I miss issues and PRs. Allowing collaboration from other people is pretty much impossible (ever asked people to format-patch?). So I am thinking of creating a “bundle” project which uses stagit plus some mailing list software, and allows anyone with their own VPS and an IP/domain to quickly set up a GitHub-like personal project host, with patches and discussion over email on a public mailing list. That way people won’t have to “register” at everyone’s site to contribute or open an issue (which is mostly what keeps people from moving off GitHub or any other centralized service); they’d just use email. If we can keep things consistent enough, it could offer the consistency of GitHub (or other hosts) while getting the decentralization of git back. Add some light CSS (see http://bettermotherfuckingwebsite.com/) and people wouldn’t find it dull or boring either.
Thoughts? Anyone feels like collaborating? Am I missing something crucial?
Edit: Just realized that this thread might be more about already accomplished/started things and not made-up ideas. Apologies if this is off-topic for the thread.
It really looks like you’re describing Phabricator here (code reviews, issues, discussions, git hosting, simple design, simple hosting…), you should check that too.
I’m busy with exams right now, but this is something I’d be interested in following & possibly contributing to in the future :)
But I miss issues and PRs
Google has a project where they abuse git-notes as a code review tool. Maybe something similar can be done for issues?
Or maybe we should all just realise that what we really should be using is fossil-scm.
That is really clever. It seems like making a new GitHub repo for it will be a necessary evil though ;)
Thanks. Maybe it’s clever observation rather than a clever idea in itself, since git was meant to be used in exactly that way.
Why would this require a GitHub repo?
I feel like that would be a better place than Lobsters to pool ideas and find out who wants to participate.
This looks like something I’d find pretty useful for https://eigenstate.org and https://myrlang.org.
At the moment, I mirror my code on github, and use their bugs/issues.
Trying to dig into Hypervisor.framework by hacking on xhyve.
Currently, I’ve written a utility (okay, C was overkill, I admit it) to manage xhyve virtual machines.
Could have been accomplished with bash scripts, and an xhyvectl already exists, but I’m tired of existing projects deterring me from learning by doing my own thing, you know?
I was inspired to start working on a GUI for xhyve that’ll be much like virt-manager on Linux.
Docker for Mac is indeed a game-changer. I was using dlite before, and while it was fine, having this integration directly in Docker is great. I dual-boot Ubuntu and OS X, and using Docker for Mac has brought a similar experience to Docker on Linux. Really excited for its development!
Would you say it’s a game changer because you can run Docker stuff on localhost, or because you have servers that are running OS X?
Docker for Mac doesn’t actually run Docker on a Mac, as far as I can tell. Not sure what the point is - if you’re writing Linux software, why not run Linux?
It’s substantially more convenient than previous workflows for local development of Docker containers.
More convenient than just using a normal Linux box?
That… depends on personal preference? I am not in the habit of telling other people what OS to use, any more than I am in the habit of suggesting underwear brands to them. I don’t mean that you were doing that, but … The OS that one’s server-side components run on has pretty much nothing to do with the OS that one personally enjoys working directly on.
Absolutely. I code on the go a lot. My development machine needs to be a rock solid laptop with great power management, hibernation, and 10+ hours of battery life. I’m beyond done trying to fight with Linux and Lenovos of varying quality when OS X delivers out of the box. Life’s too short.
I think that’s where the trouble lies.
What does convenient mean here, and what is a normal Linux box?
I’ve used Linux since 1997 and the Slackware days, and I still prefer OS X for a workstation. To be honest, I’d rather be running a BSD box than Linux at this point.