If I see too many people adopting a certain software tool, I’d probably assume it’s bad. It would take an in-depth analysis to convince me otherwise, because the mechanics at work are well established.
The mechanics the author described, of a tool becoming popular just because of a feedback loop, don’t suggest that popular software is usually bad. Rather, they suggest merely that popular software is no better than randomly-picked software. If many people are adopting a certain software tool, that doesn’t mean it’s bad; it means it’s probably average.
In fact, it’s probably better than average. The article suggests that a tool’s popularity is purely a feedback loop, regardless of the tool’s quality. But when a tool grows in popularity, that can actually improve its quality. The project may draw more core contributors, who catch more bugs, or who bring expertise in competing software and apply it when designing changes to the tool.
The quality intrinsic to a tool is also not the only thing that matters (though of course it does matter). If two tools are basically identical, but one of them has a bigger community that can answer my questions better and that has already published plugins to integrate with the libraries I need, then the project with the bigger community is a better choice.
If two tools are basically identical, but one of them has a bigger community that can answer my questions better and that has already published plugins to integrate with the libraries I need, then the project with the bigger community is a better choice.
Wholeheartedly agree. Not to mention, some projects are started with the intent to collaborate (think open source, or even growing startups). If you want to get the most reach, you need to think about what languages/frameworks are popular so that you can maximize your chances for acquiring talent.
While I do agree that all languages and frameworks have their strengths and you can’t label one as genuinely superior to another, making decisions based on community size isn’t a bad idea.
But when a tool grows in popularity, that can actually improve its quality. The project may draw more core contributors, who catch more bugs, or who bring expertise in competing software and apply it when designing changes to the tool.
It’s hard to tell if popularity improves the quality or not. I’ve been thinking a lot about it, and I hope I can write an analysis soon. You mentioned why popularity may increase the quality, but think about the inverse: what aspects of popularity can decrease the software quality? Sorry for the short reply and the lack of clarifications, I still need more time to wrap up my ideas on this subject.
This is not news at all, sadly. It’s been known and covered since at least 2013.
The only thing I wonder about is why people still get surprised about this. It’s facebook. It’s the company that’s tracking you on basically every website you visit. Even when you’re not logged in.
I think it’s wise to try to be aware of and not reinforce “facebook is creepy and everything they do is evil by default” bias. Any key presses, scrolling, or mouse movements on a website can be sent to the server. This is the nature of the browser. If you don’t want a website to have access to something, don’t type it into the website!
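To illustrate what "can be sent to the server" means in practice, here is a rough, hypothetical sketch (my own, not Facebook's actual code) of how a page's script could buffer input events and ship them off before the user ever clicks "send". The `InputTracker` name and the event wiring are illustrative assumptions:

```javascript
// Hypothetical sketch: any script a page loads can buffer input events
// and transmit them to a server, whether or not the user "submits" anything.
class InputTracker {
  constructor(send) {
    this.buffer = [];
    this.send = send; // e.g. (data) => fetch('/track', { method: 'POST', body: data })
  }
  record(eventType, detail) {
    // Store each observed event with a timestamp.
    this.buffer.push({ eventType, detail, at: Date.now() });
  }
  flush() {
    // Serialize and transmit everything buffered so far, then reset.
    const payload = JSON.stringify(this.buffer);
    this.buffer = [];
    this.send(payload);
    return payload;
  }
}

// In a real browser this would be wired to DOM events, roughly:
//   document.addEventListener('keydown', e => tracker.record('keydown', e.key));
//   document.addEventListener('mousemove', e => tracker.record('move', `${e.clientX},${e.clientY}`));
//   setInterval(() => tracker.flush(), 5000);
```

Nothing here is exotic: it's ordinary event listeners plus an ordinary network request, which is exactly why "don't type it into the website" is the only reliable defense.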
[Comment removed by author]
Don’t we already have them? You can send messages to someone with email (which, if you’re all old school like me and use a local client, doesn’t transmit anything until you send it). You can at least choose to build websites that don’t have any facebook integration, like lobsters. You have perhaps somewhat less choice in the sites you visit, though.
I’m not entirely sure what you mean by “this sort of thing” or how one could build systems where it’s not possible. Do you want to disable javascript? Cross domain links? img tags?
Actually, I do block 3rd party js by default, and facebook and google are on a special blacklist for images and css. I love uBlock.
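For anyone who wants to replicate a setup like this, it can be expressed in uBlock Origin's static filter syntax. These example rules are my own sketch of the described setup, not the commenter's actual configuration:

```
! Block all third-party scripts (uBlock Origin static filter syntax)
*$script,third-party
! Blacklist Facebook and Google resources, including images and CSS
||facebook.com^$script,image,stylesheet
||google.com^$script,image,stylesheet
```

Rules like these go in uBlock Origin's "My filters" pane; a blanket third-party-script block will break some sites, so per-site exceptions are usually needed.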
It’s the company that’s tracking you on basically every website you visit. Even when you’re not logged in.
Curious what you mean by this, the-kenny?
Thanks for bringing light to this. I had no idea such a thing existed. I wish the article went into more detail: in particular, how does this remain safe, and how do people get access to it?
Their motivation for increased privacy became acute after it was revealed that the National Security Agency was collecting individual data through backdoors in traditional cloud services and ISPs. The Meta Mesh project requires that all traffic is encrypted.
I am not a networking professional, but only one node within these mesh networks is hardwired and the rest are wireless. Curious how that makes it more secure? I am hard pressed to believe that the NSA cannot get access to the information flowing through these networks.
I’m glad you liked it - I shared it because I was pleasantly surprised to see that the concern and confusion I’d heard around the topic was growing into real working systems. I figured it would at least be a nice reference of groups to look up.
I suppose their hopes for security against the global passive adversary are in promoting encrypted transport layers, avoiding compromised backbones for peer-to-peer services, and growing to the point that useful services are run on servers directly connected to mesh networks to similarly avoid backbones.
I am playing with React Native. It’s a framework that lets you build native iOS applications with JavaScript. I don’t have any particular projects in mind; I'm just exploring. Always be learning, as I always say.
If anyone is interested, this is a pretty good introductory tutorial. It was sent out and recommended via JSweekly.
Really enjoying these. Thanks!