The whole damn thing.
Instead of having this Frankenstein’s monster of different OSs and different programming languages and browsers that are OSs and OSs that are browsers, just have one thing.
There is one language. There is one modular OS written in this language. You can hot-fix the code. Bits and pieces are stripped out for lower powered machines. Someone who knows security has designed this thing to be secure.
The same code can run on your local machine, or on someone else’s machine. A website is just a document on someone else’s machine. It can run scripts on their machine or yours. Except on your machine they can’t run unless you let them and they can’t do I/O unless you let them.
There is one email protocol. Email addresses can’t be spoofed. If someone doesn’t like getting an email from you, they can charge you a dollar for it.
There is one IM protocol. It’s used by computers including cellphones.
There is one teleconferencing protocol.
There is one document format. Plain text with simple markup for formatting, alignment, links and images. It looks a lot like Markdown, probably.
Every GUI program is a CLI program underneath and can be scripted.
(Some of this was inspired by legends of what LISP can do.)
Goodness, no - are you INSANE? Technological monocultures are one of the greatest non-ecological threats to the human race!
I need some elaboration here. Why would it be a threat to have everyone use the same OS and the same programming language and the same communications protocols?
One vulnerability to rule them all.
Pithy as that sounds, it is not convincing to me.
Having many different systems and languages so that there are many different vulnerabilities sounds like security by obscurity, and that does not sound like a good idea.
I would hope a proper inclusion of security principles while designing an OS/language would be a better way to go.
It is not security through obscurity, it is security through diversity, which is a very different thing. Security through obscurity says that you may have vulnerabilities but you’ve tried to hide them so an attacker can’t exploit them because they don’t know about them. This works as well as your secrecy mechanism. It is generally considered bad because information disclosure vulnerabilities are the hardest to fix and they are the root of your security in a system that depends on obscurity.
Security through diversity, in contrast, says that you may have vulnerabilities but they won’t affect your entire fleet. You can build reliable systems on top of this. For example, the Verisign-run DNS roots use a mixture of FreeBSD and Linux and a mixture of bind, unbound, and their own in-house DNS server. If you find a Linux vulnerability, you can take out half of the machines, but the other half will still work (just slower). Similarly, a FreeBSD vulnerability can take out half of them. A bind or unbound vulnerability will take out a third of them. A bind vulnerability that depends on something OS-specific will take out about a sixth.
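Those fractions can be sketched as a quick back-of-the-envelope model (a hypothetical, equally weighted fleet, not Verisign’s actual machine counts):

```python
from itertools import product

# Hypothetical fleet: every OS/DNS-server combination, equally weighted
# (six groups of machines).
oses = ["FreeBSD", "Linux"]
servers = ["bind", "unbound", "in-house"]
fleet = list(product(oses, servers))

def affected(hit_by_vuln):
    """Fraction of the fleet taken out by a given vulnerability."""
    return sum(1 for m in fleet if hit_by_vuln(m)) / len(fleet)

print(affected(lambda m: m[0] == "Linux"))         # a Linux vuln: 0.5
print(affected(lambda m: m[1] == "bind"))          # a bind vuln: ~1/3
print(affected(lambda m: m == ("Linux", "bind")))  # OS-specific bind vuln: ~1/6
```

The point of the model is only that no single bug predicate covers the whole fleet; a monoculture makes every predicate return 1.0.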
This is really important when it comes to self-propagating malware. Back in the XP days, there were several worms that would compromise every Windows machine on the local network. I recall doing a fresh install of Windows XP and connecting it to the university network to install Windows update: it was compromised before it was able to download the fix for the vulnerability that the worm was exploiting. If we’d only had XP machines on the network, getting out of that would have been very difficult. Because we had a load of Linux machines and Macs, we were able to download the latest roll-up fix for Windows, burn it to a CD, redo the install, and then do an offline update.
Looking at the growing Linux / Docker monoculture today, I wonder how much damage a motivated individual with a Linux remote arbitrary-code execution vulnerability could do.
Sure, but is this an intentional strategy? Did we set out to have Windows and Mac and Linux in order that we could prevent viruses from spreading? It’s an accidental observation and not a really compelling one.
I’ve pointed out my thinking in this part of the thread https://lobste.rs/s/sdum3p/if_you_could_rewrite_anything_from#c_ennbfs
In short, there must be more principled ways of securing our computers than hoping multiple green field implementations of the same application have different sets of bugs.
A few examples come to mind, though: Heartbleed (which affected anyone using OpenSSL) and Spectre (anyone using the x86 platform). Also, Microsoft Windows for years had plenty of critical exploits because it had well over 90% of the desktop market.
You might also want to look up the impending doom of bananas, because over 90% of bananas sold today are genetic clones (it’s basically one plant) and there’s a fungus threatening to kill the banana market. A monoculture is a bad idea.
Yes, for humans (and other living things) the idea of immunity through obscurity (to coin a phrase) is evolutionarily advantageous. Our varied responses to COVID are one such immediate example. It does have the drawback that it makes it harder to develop therapies, since we see population specificity in responses.
I don’t buy that we need to employ the same idea in an engineered system. It’s a convenient back-ported bullet-list advantage of having a chaotic mess of OSes and programming languages, but it certainly wasn’t intentional.
I’d rather have an engineered, intentional robustness to the systems we build.
To go in a slightly different direction—building codes. The farther north you go, the steeper roofs tend to get. In Sweden, one needs a steep roof to shed snow buildup, but where I live (South Florida, just north of Cuba) building such a roof would be a waste of resources because we don’t have snow—we just need a shallow angle to shed rain water. Conversely, we don’t need codes to deal with earthquakes, nor does California need to deal with hurricanes. Yet it would be so much simpler to have a single building code in the US. I’m sure there are plenty of people who would love to force such a thing everywhere if only to make their lives easier (or for rent-seeking purposes).
We have different houses for different environments, and we have different programs for different use cases. This does not mean we need different programming languages.
I would hope a proper inclusion of security principles while designing an OS/language would be a better way to go.
In principle, yeah. But even the best security engineers are human and prone to fail.
If every deployment was the same version of the same software, then attackers could find an exploitable bug and exploit it across every single system.
Would you like to drive a car where every single engine blows up, killing everyone inside? If all cars are the same, they’ll all explode. We’d eventually move back to horse and buggy. ;-) Having a variety of cars helps mitigate issues other cars have, while still having problems of their own.
In this heterogeneous system we have more bugs (assuming the same rate of bugs everywhere) and fewer reports (since there are fewer users per system) and a more drawn out deployment of fixes. I don’t think this is better.
Sure, you’d have more bugs. But the bugs would (hopefully) be in different, distinct places. One car might blow up, another might just blow a tire.
From an attacker’s perspective, if everyone drives the same car and the attacker knows that a flaw in one car is reproducible with a 100% success rate, then the attacker doesn’t need to spend time or resources on other cars. The attacker can just rinse, reuse, recycle. All are vulnerable to the same bug. All can be exploited in the same manner, reliably, time after time.
To go by the car analogy, the bugs that would be uncovered by drivers rather than during the testing process would be rare ones, like, if I hit the gas pedal and brake at the same time it exposes a bug in the ECU that leads to total loss of power at any speed.
I’d rather drive a car a million other drivers have been driving than drive a car that’s driven by 100 people. Because over a million drivers it’s much more likely someone hits the gas and brake at the same time and uncovers the bug which can then be fixed in one go.
Sounds a lot like https://en.wikipedia.org/wiki/Genera_(operating_system)
Yes, that’s probably the LISP thing I was thinking of, thanks!
I agree completely!
We would need to put some safety measures in place, and there would have to be processes defined for how you go about suggesting/approving/adding/changing designs (that anyone can be a part of), but otherwise, it would be a boon for the human race. In two generations, we would all be experts in our computers and systems would interoperate with everything!
There would be no need to learn new tools every X months. The UI would be familiar to everyone, and any improvements would be forced to go through human testing/trials before being accepted, since it would be used by everyone! There would be continual advancements in every area of life. Time would be spent on improving the existing experience/tool, instead of recreating or fixing things.
I would also like to rewrite most stuff from the ground up. But monocultures aren’t good. Orthogonality in basic building blocks is very important. And picking the right abstractions to avoid footguns. Some ideas, not necessarily the best ones:
proven correct microkernel written in rust (or similar borrow-checked language), something like L4
A solved problem. seL4, including support for capabilities.
seL4 is proven correct by treating a lot of things as axioms and by presenting a programmer model that punts all of the difficult bits to get correct to application developers, making it almost impossible to write correct code on top of. It’s a fantastic demonstration of the state of modern proof tools, it’s a terrible example of a microkernel.
FUD unless proven otherwise.
Counter-examples exist; seL4 can definitely be used, as demonstrated by many successful uses.
The seL4 foundation is getting a lot of high profile members.
Furthermore, Genode, which is relatively easy to use, supports seL4 as a kernel.
Someone wrote a detailed vision of rebuilding everything from scratch, if you’re interested.
I never understood this thing.
I think that is deliberate.
And one leader to rule them all. No, thanks.
Well, I was thinking of something even worse - design by committee, like for electrical stuff, but your idea sounds better.
We already have this, dozens of them. All you need to do is point guns at everybody and make them use your favourite. What a terrible idea.
Location: Remote, or Seattle, WA, USA
Type of Work: Full Stack, Backend
Hours: Full Time
I’m looking for a long term, full time job with a competitive salary and good benefits. I have been working remotely for 13 years now, and I would prefer to stay that way. However, I am open to local jobs in Seattle, WA if there is the possibility of some remote days.
Most of my LAMP work is proprietary and behind corporate logins, but I can provide samples from personal projects if requested (these are not on my GH account). I have an interest in programming languages and have a variety of side projects in Python, Rebol and Pony. I am more than willing to put in the time to learn a new language and/or framework if the job requires it.
I support replacing downvotes with a report option for spam, trolls, etc. An upvote-only system already puts most of the good comments above the others. The rest remain so they can be reviewed by anyone with an open mind. Most of us Internet users will effortlessly skip past trolling or spam, too. The main effect I see downvotes have, both here and on Hacker News, is simply censoring dissent. I’ve found some accurate, but unpopular, comments close to the bottom of threads that were rated at 1, 0, or greyed out if on HN. I usually bumped them back up with supporting evidence in my own comments. However, they could’ve disappeared entirely if the censorship had been stronger.
My old government teacher taught a lesson I’ve seen repeatedly since in voting or reputation systems: Tyranny of the Majority. The tyranny can only reach its highest damage when the majority can make dissent disappear. Like with downvotes.
Yes, I agree with this too. Trolls and spam are best dealt with by a human moderator (that can remove the spam, lock the thread from further replies, contact the user or add an informed opinion) and a reporting option would be the best way to deal with this. The reporting option would do nothing more than alert a moderator, who can then take the appropriate action. Dealing with trolls/spam does not work well in an automated system or any kind of system that encourages regular users to respond further, thus enabling the troll/spam to continue. All other downvote options can be removed and the upvote button should work well enough for keeping the good comments above the others.
Hey guys, this is a personal project of mine that I’ve been working on for a while. It’s a place where you can create your own online community, similar to Reddit, but with more of a focus on content rather than links. One of the things I do differently is make search the primary way to find content. As a board owner you are able to create your own custom categories and a tag-filtering system that users can use to search on.
Some additional highlights …
Is the source available anywhere?
No, it’s not open source right now. I would like to do this eventually, but not at this early stage.
Nice work - I’ve only had a quick browse around but it looks like a lot of work has gone into it.
One thing I miss in all forums is a bidirectional email gateway - I’d love to be able to subscribe to a forum (or subforums within that forum) and be able to interact with it via email. This means receiving an email for each post (with proper headers so that my MUA can thread) and being able to post by replying to emails. Yahoo Groups and Google Groups do that, but none of the open source forum software I’ve used does. Which sucks. But maybe it’s just me that wants that kinda thing? I just find interaction by email so much faster than having to log in to 20+ forums and read the threads I’m interested in. Yes, I do miss the heyday of Usenet, damnit.
Hehe, I do miss Usenet too, damnit :) Yea, I’ve heard this idea before on a few mailing lists I used to subscribe to, so I don’t think you’re alone in wanting it. But unfortunately it would be quite a bit of work to implement. Integrating with an email server is not easy, but definitely going to note this down as something to scope out in the future. Thanks!
We’ve talked about our dissatisfaction with the traditional web forum on… this forum.
Interesting, thanks for the link!
I downloaded your notes from that thread, and I would like to address some of the things you mentioned and explain how they are done in GrokBB. I will read the rest tonight and provide more feedback, but here are some of the points …
Conversations are often lost, as they get buried in Page 12/60 of the thread or forum. “Thread necromancy” is discouraged, but this only encourages more posts on the same topic to fill up the forum.
In GrokBB, topics display the top-level comments only by default; everything else is considered a side conversation and automatically hidden. If you want to follow a side conversation, you can expand it and respond there, and those replies get turned into a flat model (but you retain the ability to quote users so you can follow the responses more easily). Replies can also be sorted differently if you want (oldest, newest, by most responses, by the number of times someone saved them to read later - i.e. a possible indicator of quality).
This is my attempt at keeping topics On Topic and making the most relevant replies the most visible; well, that’s the theory anyway. There haven’t been enough users on the site yet to fully test this model. Also, a few months back I had a feature in place that would automatically split topics if they got too many levels deep, but it started to feel forced and was making it hard to have conversations, so it was disabled. It may be something to revisit in the future, possibly as something the topic creator or moderator controls.
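As a rough sketch of that display model (the field names here are my own assumptions, not GrokBB’s actual schema):

```python
# Minimal sketch: top-level replies are shown by default; expanding one
# flattens its whole subtree into a single time-ordered list.
# Assumes every reply is posted after its parent (true in a live forum).

def top_level(replies):
    """Default view: only replies made directly to the topic."""
    return [r for r in replies if r["parent"] is None]

def side_conversation(replies, root_id):
    """Flatten the subtree rooted at root_id, ordered by post time.
    Quoting, rather than nesting, is what links responses together."""
    ids = {root_id}
    flat = []
    for r in sorted(replies, key=lambda r: r["posted"]):
        if r["parent"] in ids:
            ids.add(r["id"])
            flat.append(r)
    return flat
```

A single time-sorted pass is enough here precisely because a reply can never precede its parent.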
What about tags? They’re often present, but used wrong, or not used at all. Tags are useful for organization, but enforcement is rare and users often ignore them.
In GrokBB, I have created a system that lets the board owner create the tags relevant to their community; moderators can then assign them to topics as needed, and those tags can’t be removed. This way you can be sure the tag is appropriate and not someone trying to spam their own topic with irrelevant tags. Users can also tag topics themselves, but those tags are visible only to them, so they can use them later to search on. This gives them the ability to create their own system for what they think is Insightful, Funny, etc. without affecting other users. I would like to expand on this and possibly see what other users are tagging topics with, so maybe you can find topics that other users find funny, but this still needs more thought.
Organization is often poor. Threads are often placed in the wrong forum, don’t fit one particular subforum, or fit several.
In GrokBB, there is a category system implemented that users have to assign their topic to. It is more high level than tags and really a replacement for the sub-forum concept.
Content is frequently full of infighting, or community in-jokes useless to a visitor.
This is a hard one to address. I have been thinking about a system where the topic creator can mark the replies they feel are most appropriate to the topic, and then users could have a way to filter and see only those. Then the most useful answers/replies are displayed at a glance. This hasn’t been implemented yet, but it is something I have started to think about.
Users have little voice for personalization and self-expression.
In GrokBB, you have the ability to specify an avatar and to create a profile page where users can enter any content they want, but that’s fairly standard. I would be interested in hearing what kind of self-expression you think users should be able to do. I am all for giving users more tools, though I’m not sure what those could be.
Many forums have cruft and fluff, such as post counts and join dates, that only serve to create an imaginary number people must increment, and to breed elitism. This is not useful to the community nor to visitors.
I think this information is useful for moderators, but yes, other information is more important for determining quality and the user’s contribution to the board. In GrokBB, I have created a badge system that allows moderators to award users SVG badges for their contributions. They can also award the user moderator points, which increase their reputation for that board, so visitors can see, at a glance, who the best contributors are.
These things are strictly controlled by moderators only, because they are the best indicators of what is quality content for their community. There is no up/down voting system because, in my opinion, it is a completely useless system that usually represents a person’s biases more than quality content. Instead, a user’s reputation is based on the actions they actually take (what content you create and whether moderators feel your content is of good quality). This could be expanded on some more, and I would be interested if anyone has additional ideas around this.
Having said all that, this site is Beta right now and probably will be for some time. I’m very open to adding additional features if anyone is wanting to work with me on brainstorming and testing different concepts out.
I read the rest of your PDF and the other posts, and I have some more input if anyone is willing to discuss it further …
Touching on this issue again, I noticed one user mentioned an idea about “every thread gets a block of wiki content on the top to aggregate knowledge” … and this could be another way of dealing with that issue. It’s basically a TLDR of all the replies at the top, so new visitors could see the important points that have already been covered.
On a larger timescale, I feel like all of those—HN/lobsters/Facebook (and Reddit, Slashdot, etc.)—are very oriented towards the “hiding” part, with any content older than a week or so virtually invisible (often a day or two). So discussions rarely last long.
I think this is a major problem with the current sites out there now. Discussions don’t get into any kind of depth because they get hidden after a few days by newer posts, even when the replies are recent. In GrokBB, I bump the topic back to the top when a new reply is added, similar to how the older-style forum software does it… but I’d be interested in hearing anyone else’s ideas on fixing this.
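The bump-on-reply rule is simple enough to sketch (the topic fields here are hypothetical, not GrokBB’s actual schema):

```python
# "Bump on reply": order topics by the time of their most recent reply,
# so active discussions stay visible regardless of when they started.

def bump_order(topics):
    return sorted(topics, key=lambda t: t["last_reply"], reverse=True)

def on_reply(topic, now):
    # A new reply just refreshes the timestamp, which moves the topic
    # back to the top of the listing on the next sort.
    topic["last_reply"] = now
```

Unlike the front-page ranking the thread criticizes, nothing here decays with the topic’s age, so a week-old discussion with fresh replies stays visible.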
What this ultimately means is that forums, while useful for inquiries and community building, are a poor fit for actual resources - the stuff a user might want. The communities are insular and resources poorly placed. They deserve more than stickies. Having both discussion services (a forum, IRC) and some form of resource service (a wiki, content management) would be nice; perhaps they could even be combined. Another problem is that resources are rarely fixed by the community/visitors/administration unless that is the site’s primary focus, often making the discussion service the primary resource location, which is not ideal.
In GrokBB, I have implemented a live chat system right into the site, and eventually each board will have its own private chatroom too, no IRC needed (while I love IRC, it’s like pulling teeth to get anyone to use it in 2016). I think that helps with moderation (allows moderators to discuss issues in real time), with new users (who want to ask questions or are unsure about rules), and with getting a quick answer on a community-related topic without having to create a whole new topic… but this only addresses part of the resource issue. Forums can create a lot of useful content, but it’s scattered in bits and pieces throughout different topics. I can’t say I have a solution to this, but I have some possible ideas …
maybe the forum can be integrated with GitBook in some way, and the useful information can be compiled and written up in a standard, more professional format and these books can be provided as official resources (but the issue here is who is going to do that work)
maybe topic creators become responsible for their content and they get reminded to select the replies that are most relevant to the discussion and these automatically get transferred to some kind of wiki where it can be edited and refined further by the community
the issue of handling file resources is a hard one too: download links go dead all the time, so ideally files should be uploaded to the forum and maybe versioned in some way (or maybe an integration with a third-party, distributed file system might work). There should also be a central “gallery” to download them from (so you don’t have to scour different topics to find the files), and for that they need to be automatically organized in some way. One idea might be to inherit the category and tags assigned to their topic, which automatically creates a “directory structure” for them.
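That last idea could be sketched like this (the slug rules and function names are my own assumptions for illustration):

```python
import re

def slug(text):
    """Lowercase, keep alphanumeric runs, join words with dashes."""
    return "-".join(re.findall(r"[a-z0-9]+", text.lower()))

def gallery_path(category, tags, filename):
    """Derive a gallery 'directory' for an uploaded file from its
    topic's category and tags: category first, then sorted tags."""
    parts = [slug(category)] + sorted(slug(t) for t in tags)
    return "/".join(parts + [filename])

print(gallery_path("Game Mods", ["Skyrim", "Textures"], "pack-v2.zip"))
# game-mods/skyrim/textures/pack-v2.zip
```

Sorting the tags keeps the derived path stable no matter what order the tags were applied in.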
I thought it might be a bit of a stretch for Lobsters, but the article was pretty interesting and you did have a book tag ;) so I thought I’d give it a try. Will keep it tech focused going forward.
I attempted to write a command-line SQL client with it a few years back and was quite impressed with how easy it was. The hardest part was integrating ncurses, but that’s probably true in any language that uses that library, lol. It is up there as one of my favorite languages.