1. 3

    This will make it incompatible with GPL-licensed projects – right? Since the GPL does not allow any additional restrictions?

    Reminds me of the classic JSLint license: https://en.wikipedia.org/wiki/JSLint

    That license had “The Software shall be used for Good, not Evil.” in it – which caused quite a few problems.

    1. 6

      Not just GPL; it violates the FSF’s definition of Free Software:

      The freedom to run the program as you wish, for any purpose (freedom 0).

      It violates the Open Source Initiative’s definition of open source:

      1. No Discrimination Against Persons or Groups

      The license must not discriminate against any person or group of persons.

      1. No Discrimination Against Fields of Endeavor

      The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.

      It violates the Debian Free Software Guidelines:

      1. No Discrimination Against Persons or Groups

      The license must not discriminate against any person or group of persons.

      1. No Discrimination Against Fields of Endeavor

      The license must not restrict anyone from making use of the program in a specific field of endeavor. For example, it may not restrict the program from being used in a business, or from being used for genetic research.

      In other words, it’s proprietary software (with source available).

      1. 1

        Reminds me of the need for features like what I proposed in this RFC to Yarn: https://github.com/yarnpkg/rfcs/pull/76 Will try to find time soon to take up work on that one again.

        Also: Here is the thread from the previous time this happened: https://lobste.rs/s/eyyiav/npm_package_is_stealing_env_variables_on

        1. 4

          Many of the features they claim IRC doesn’t support, it does in fact support nowadays: IRCCloud and others are cooperating on modern IRC specifications that bring IRC much closer to the Slack experience than classic IRC. Link: https://ircv3.net/

          1. 2

            Unfortunately, I don’t have much faith in IRCv3 – it’s barely been implemented, and one of the IRCv3 people I talked to gave up and left; he’s now supporting Matrix, since it’s basically an independent JSON reimplementation of a proposed binary replacement for IRCv3 that resolved many of its fundamental problems, and is free of IRC “culture.”

            1. 1

              A pity if so :/ Progressively enhancing IRC was an approach I found promising.

          1. 26

            Wow. Just wow. Selected citations from comments:

            This destroyed 3 production server after a single deploy!

            Make a pull request and help out!

            Not a single pull request was merged in the last 2 months that came from an outside contributor. There are currently over 70 PRs open and none of them have any activity from the npm team.

            How about we give the two person team more than 24 hours to run npm unpublish npm@5.7.0?

            I’m not sure if you’re joking, but that command only allows unpublishing versions published within 24 hours, and not older.

            1. 5

              They were not kidding about these PRs and no activity from the npm team.

              If you look at the last 2 years’ commit chart, it really shows a huge disparity.

              How long can a project like this go without accepting, or even commenting on, others’ attempts at contributing to their software?

              1. 2

                They’re not a lot better about issues either. I submitted an issue about NPM 5 breaking stuff 8 months ago. Nobody ever responded, and the problem persists.

                (In case anyone from NPM is listening: https://github.com/npm/npm/issues/17391)

                1. 3

                  My experience is that yarn is very good at responding to issues and accepting PRs – so it’s probably better to go there if one wants to fix or improve some aspect of an npm cli client.

                  1. 0

                    I’m sure Yarn is a lot better on a technical level… but I’m really not comfortable using a Facebook product.

                    1. 3

                      Introducing Yarn: a new package manager for JavaScript from @fbOpenSource, @tildeio, @googledevs & @exponentjs.

                      https://twitter.com/yarnpkg/status/785857780838232064

                      So it’s much more of a community project than, say, React.

            1. [Comment removed by author]

              1. 4

                This has nothing to do with the npm registry? It has everything to do with the implementation of a specific client of that registry? Switching to a different registry would have no impact whatsoever?

              1. 32

                In the Hacker News thread about the new Go package manager, people were angry about Go, since the npm package manager was obviously superior. I can see the quality of that now.

                There’s another Lobsters thread right now about how distributions like Debian are obsolete. The idea being that people use stuff like npm now, instead of apt, because apt can’t keep up with modern software development.

                Kubernetes’ official installer is some curl | sudo bash thing instead of any kind of proper package.

                In the meantime I will keep using only FreeBSD/OpenBSD/RHEL packages and avoid all these nightmares. Sometimes the old ways are the right ways.

                1. 7

                  “In the Hacker News thread about the new Go package manager people were angry about go, since the npm package manager was obviously superior. I can see the quality of that now.”

                  I think this misses the point. The relevant claim was that npm has a good general approach to packaging, not that npm is perfectly written. You can be solving the right problem, but writing terribly buggy code, and you can write bulletproof code that solves the wrong problem.

                  1. 5

                    npm has a good general approach to packaging

                    The thing is, their general approach isn’t good.

                    They only relatively recently decided locking down versions is the Correct Thing to Do. They then screwed this up more than once.

                    They only relatively recently decided that having a flattened module structure was a good idea (because presumably they never tested in production settings on Windows!).

                    They decided that letting people do weird things with their package registry is the Correct Thing to Do.

                    They took on VC funding without actually having a clear business plan (which is probably going to end in tears later, for the whole node community).

                    On and on and on…

                    1. 2

                      Go and the soon-to-be-official dep dependency management tool manage dependencies just fine.

                      The Go language has several compilers available. Traditional Linux distro packages together with gcc-go is also an acceptable solution.

                      1. 4

                        It seems the soon-to-be-official dep tool is going to be replaced by another approach (currently named vgo).

                      2. 1

                        I believe there’s a high correlation between the quality of the software and the quality of the solution. Others might disagree, but that’s been pretty accurate in my experience. I can’t say why, but I suspect it has to do with the same level of care put into both the implementation and in understanding the problem in the first place. I cannot prove any of this, this is just my heuristic.

                        1. 8

                          You’re not even responding to their argument.

                          1. 2

                            There’s npm registry/ecosystem and then there’s the npm cli tool. The npm registry/ecosystem can be used with other clients than the npm cli client and when discussing npm in general people usually refer to the ecosystem rather than the specific implementation of the npm cli client.

                            I think npm is good but I’m also skeptical about the npm cli tool. One doesn’t exclude the other. Good thing there’s yarn.

                            1. 1

                              I think you’re probably right that there is a correlation. But it would have to be an extremely strong correlation to justify what you’re saying.

                              In addition, NPM isn’t the only package manager built on similar principles. Cargo takes heavy inspiration from NPM, and I haven’t heard about it having a history of show-stopping bugs. Perhaps I’ve missed the news.

                          2. 8

                            The thing to keep in mind is that all of these were (hopefully) done with best intentions. Pretty much all of these had a specific use case… there’s outrage, sure… but they all seem to have a reason for their trade offs.

                            • People are angry about a proposed go package manager because it throws out a ton of the work that’s been done by the community over the past year… even though it’s fairly well thought out and aims to solve a lot of problems. It’s no secret that package management in go is lacking at best.
                            • Distributions like Debian are outdated, at least for software dev, but their advantage is that they generally provide a rock solid base to build off of. I don’t want to have to use a version of a python library from years ago because it’s the only version provided by the operating system.
                            • While I don’t trust curl | sh it is convenient… and it’s hard to argue that point. Providing packages should be better, but then you have to deal with bug reports where people didn’t install the package repositories correctly… and differences in builds between distros… and… and…

                            It’s easy to look at the entire ecosystem and say “everything is terrible” but when you sit back, we’re still at a pretty good place… there are plenty of good, solid options for development and we’re moving (however slowly) towards safer, more efficient build/dev environments.

                            But maybe I’m just telling myself all this so I don’t go crazy… jury’s still out on that.

                            1. 4

                              Distributions like Debian are outdated, at least for software dev,

                              That is the sentiment that seems to drive the programming-language-specific package managers. I think what drives it is that software often has far too many unnecessary dependencies, which makes setting up the environment to build the software hard and time-consuming.

                              I don’t want to have to use a version of a python library from years ago because it’s the only version provided by the operating system.

                              Often it is possible to install libraries at another location and redirect your software to use that though.

                              It’s easy to look at the entire ecosystem and say “everything is terrible” but when you sit back, we’re still at a pretty good place…

                              I’m not so sure. I foresee an environment where actually building software is a lost art. Where people directly edit interpreted files in place inside a virtual machine image/flatpak/whatever because they no longer know how to build the software and set up the environment it needs. And then some language-specific package manager for distributing these images.

                              I’m growing more disillusioned the more I read Hacker News and lobste.rs… Help me be happy. :)

                              1. 1

                                So like Squeak/Smalltalk images then? What’s old is new again, I suppose.

                                http://squeak.org

                                1. 1

                                  I’m not so sure. I foresee an environment where actually building software is a lost art. Where people directly edit interpreted files in place inside a virtual machine image/flatpak/whatever because they no longer know how to build the software and set up the environment it needs. And then some language-specific package manager for distributing these images.

                                  You could say the same thing about Docker. I think package managers and tools like Docker are a net win for the community. They make it faster for experienced practitioners to set up environments and they make it easier for inexperienced ones as well. Sure, there is a lot you’ve gotta learn to use either responsibly. But I remember having to build redis every time I needed it because it wasn’t in Ubuntu’s official package repositories when I started using it. And while I certainly appreciate that experience, I love that I can just install it with apt now.

                                2. 2

                                  I don’t want to have to use a version of a python library from years ago because it’s the only version provided by the operating system.

                                  Speaking of Python specifically, it’s not a big problem there because everyone is expected to work within virtual environments and nobody runs pip install with sudo. And when libraries require building something binary, people do rely on system-provided stable toolchains (compilers and -dev packages for C libraries). And it all kinda works :-)
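
                                  A minimal sketch of that per-project workflow – create an isolated environment and install into it, no sudo needed (the directory name demo-venv is just illustrative):

                                  ```shell
                                  # Create an isolated environment; everything lives under ./demo-venv
                                  python3 -m venv demo-venv

                                  # The venv ships its own pip, separate from the system one
                                  demo-venv/bin/pip --version

                                  # Installs go into demo-venv, never into system site-packages, e.g.:
                                  # demo-venv/bin/pip install requests
                                  ```

                                  Activating the venv (`. demo-venv/bin/activate`) just puts its bin/ directory first on PATH so plain `pip` and `python` resolve there.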

                                  1. 4

                                    I think virtual environments are a best practice that unfortunately isn’t followed everywhere. You definitely shouldn’t run pip install with sudo, but I know of a number of companies where part of their deployment is to build a VM image and sudo pip install the dependencies. However, it’s the same thing with npm. In theory you should just run as a normal user and have everything installed to node_modules, but this clearly isn’t the case, as shown by this issue.

                                    1. 5

                                      nobody runs pip install with sudo

                                      I’m pretty sure there are quite a few devs doing just that.

                                      1. 2

                                        Sure, I didn’t count :-) The important point is they have a viable option not to.

                                      2. 2

                                        npm works locally by default, without even doing anything to make a virtual environment. Bundler, Cargo, Stack etc. are similar.

                                        People just do sudo because Reasons™ :(

                                    2. 4

                                      It’s worth noting that many of the “curl | bash” installers actually add a package repository and then install the software package. They contain some glue code like automatic OS/distribution detection.
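
                                      As a hedged sketch (the distribution names and package are illustrative, not any real vendor’s script), that glue code often looks something like:

                                      ```shell
                                      # Detect the distribution, then hand off to its native package manager.
                                      detect_distro() {
                                        if [ -r /etc/os-release ]; then
                                          # /etc/os-release is present on most modern Linux distributions
                                          . /etc/os-release
                                          echo "${ID:-unknown}"
                                        else
                                          echo "unknown"
                                        fi
                                      }

                                      # Echoed rather than executed, to keep the sketch side-effect free:
                                      case "$(detect_distro)" in
                                        debian|ubuntu)
                                          echo "would add an apt repo, then: apt-get install example-pkg" ;;
                                        fedora|centos|rhel)
                                          echo "would add a yum/dnf repo, then: dnf install example-pkg" ;;
                                        *)
                                          echo "unsupported distribution" ;;
                                      esac
                                      ```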

                                      1. 2

                                        I’d never known true pain in software development until I tried to make my own .debs and .rpms. Consider that some of these newer packaging systems might have been built because Linux packaging is an ongoing tire fire.

                                        1. 3

                                          With fpm (https://github.com/jordansissel/fpm) it’s not that hard. But yes, using the Debian- or Red Hat-blessed way to package stuff and getting it into the official repos is definitely painful.
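
                                          For example (the package name, version and paths are made up – a sketch of a typical fpm invocation, not a drop-in command):

                                          ```shell
                                          # Stage the files as they should land on the target system…
                                          mkdir -p staging/usr/local/bin
                                          printf '#!/bin/sh\necho hello\n' > staging/usr/local/bin/myapp
                                          chmod +x staging/usr/local/bin/myapp

                                          # …then one fpm call turns the tree into a .deb
                                          # (swap "-t deb" for "-t rpm" to get an .rpm from the same tree).
                                          # Shown with echo so the sketch runs even without fpm installed:
                                          echo fpm -s dir -t deb -n myapp -v 1.0.0 -C staging usr/local/bin/myapp
                                          ```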

                                          1. 1

                                            I used the gradle plugins with success in the past, but yeah, writing spec files by hand is something else. I am surprised nobody has invented a more user friendly DSL for that yet.

                                            1. 1

                                              A lot of the difficulty in building Debian packages comes from policy. For your own packages (not targeted at being uploaded into Debian), it’s far easier to build packages if you don’t follow the rules. I won’t pretend this is as easy as with fpm, but you get some bonuses from it (building in a clean chroot, automatic dependencies, service management like the other packages). I describe this in more detail here: https://vincent.bernat.im/en/blog/2016-pragmatic-debian-packaging

                                            2. 2

                                              It sucks that you come away from this thinking that all of these alternatives don’t provide benefits.

                                              I know there’s a huge part of the community that just wants things to work. You don’t write npm for fun, you end up writing stuff like it because you can’t get current tools to work with your workflow.

                                              I totally agree that there’s a lot of messiness in this newer stuff that people in older structures handle well. So…. we can knowledge share and actually make tools on both ends of the spectrum better! Nothing about Kubernetes requires a curl’d installer, after all.

                                            1. 3

                                              I feel much more confident with https://yarnpkg.com/ – more confident with its code and more confident that its maintainers will listen to suggestions and PRs.

                                              1. 4

                                                I’m probably way too optimistic but I see the natural outcome of this as open source hardware/software tractors :)

                                                1. 2

                                                  Some such initiatives seem to exist: http://opensourceecology.org/

                                                1. 3

                                                  So, please forgive my ignorance but reading all the negative responses here - isn’t the fact that we now have a protocol standard for distributed social media an all around good thing?

                                                  1. 9

                                                    The lack of standards has never been an issue – the lack of deployments, independent implementations, momentum and actual interoperability has always been an issue.

                                                    I remember implementing OStatus back in 2012 or so at Flattr, only to find that no client actually implemented the spec well enough to be interoperable with us, and that people, rather than spending time on trying to fix that, instead wanted to convert all standards from XML to JSON – where some, like Pubsubhubbub/WebSub, took longer to convert than others, leaving the entire emergent ecosystem in limbo. And later ActivityStreams converted yet again, from JSON to JSON-LD, but by then I had moved on to the IndieWeb.

                                                    I find the IndieWeb’s approach – document patterns, find common solutions, standardize those common solutions as small, focused, composable standards, and reuse existing web technology as far as possible – much more appealing.

                                                    One highlight with that is that one can compose such services in a way where one’s main site is even a static site (mine is a Jekyll site, for example) but still uses interactive components like WebMentions and Micropub.

                                                    Another highlight is that, as a developer, one can focus one’s time on building a really good service for one of those standards and use the rest from the community. That way I have, for example, provided a hosted WebMention endpoint for users for the last 4 years without having to keep up with every other spec outside that space, and I’m now doing the same with a Micropub endpoint.

                                                    Composability and building on existing web technologies also somewhat sidesteps the entire “let’s convert from XML to JSON” trend – HTML is HTML and will stay HTML, so we can focus on building stuff, gaining momentum and critical mass, and not just convert our implementations from one standard to the next while fragmenting the entire ecosystem in the process. That also means that standards can evolve progressively and that one can approach decentralized social networks as a layer that progressively enhances one’s blog/personal site one service at a time. Maybe first WebMention receiving? Then sending? Then perhaps some Micropub, WebSub or some Microformats markup? Your choice – it doesn’t all have to happen in a day, it can happen over a year, and that’s just okay. Fits well into an open source scene that wants to promote plurality of participants as well as implementations, while also wanting to promote a good work/life balance.

                                                    1. 1

                                                      Unfortunately every time an ActivityPub thread makes it to a news aggregator like this, it always seems like there are some negative comments in the feed from some folks from the indieweb community. It kind of bums me out… part of the goal of the Social Working Group was to try to bridge the historical divide between linked data communities and the indieweb community. While I think we had some success at that within the Social Working Group, clearly divisions remain outside it. Bummer. :(

                                                      1. 1

                                                        Sorry for the negativity – it would help if posts like these presented the larger context, so that people don’t interpret it as “ActivityPub has won”. That, as you say, isn’t at all the case, but this thread has shown it can certainly be read that way, and the title of this submission actually implies it.

                                                        This gets even more important with the huge popularity of Mastodon, as that’s a name many have heard and which they might think represents the entirety of the work of that working group – which isn’t the case, and which everyone has a responsibility to portray adequately.

                                                        So sorry for the negativity, but it’s great that we both feel that it’s important to portray the entirety of the work of that group!

                                                    1. 1

                                                      I’m kind of excited about Payment Request – would that integrate with e.g. Apple Pay? Reducing the overhead of paying sites is one of the things that I think could turn the web around. I have wished that e.g. Firefox would put a “$1” button on its toolbar, which would let you just give the site a dollar. Practical problems aside, it could really improve the best parts of the web.

                                                      1. 1

                                                        PaymentRequest does support Apple Pay and is also supported by Google, Samsung and Microsoft, at least – so building a PWA with in-app purchases is very much possible now.

                                                        As a side note, I actually built such a browser button that you mention when I was at Flattr, and investigated ways to identify the rightful owner of a page so that they could claim the promise of a donation. We never got it to fully work on all sites, but it worked for some of the larger silos, like Twitter and GitHub, and also for those who had added rel-payment links to Flattr. We/I also investigated having it crawl people’s public identity graphs to try and find a connection between the owner of a page (through e.g. a rel-author link) and a verifiable identity – like their Twitter account or maybe some signed thing, Keybase-style. That ended up with the creation of https://github.com/voxpelli/relspider, but the crawler was never fully finished (e.g. smart recrawling was never implemented) and never put into production. I still like the idea though.

                                                      1. 3

                                                        Ugh. ActivityPub makes me sad – we have so many good, deployed solutions to 80%+ of the social networking stuff, and ActivityPub just ignores all prior art (including prior art by its creators) and does everything from scratch.

                                                        1. 3

                                                          Why in your opinion did ActivityPub “make it” while others have failed?

                                                          Disclosure: I contributed to Rstat.us for a while.

                                                          1. 2

                                                            How do you mean “make it”? You mean mastodon? Because mastodon got popular before it had implemented any ActivityPub, so that’s unrelated :)

                                                            OStatus and IndieWeb tech are still the most widely deployed non-mastodon options (and are partially supported by mastodon as well)

                                                            1. 1

                                                              Bah, I apologize for not being clear. By “make it”, I mean, why has ActivityPub been promoted as a standard instead of OStatus or IndieWeb or another attempt at a protocol for the same space?

                                                              1. 3

                                                                OStatus mostly described a best practice for using other standards in a way that created a decentralized social network – so it never really needed standardization of its own. That, plus the people behind it moved on to next-generation standards instead, e.g. identi.ca moving to pump.io.

                                                                IndieWeb, though, is being standardized by the very same group that published this recommendation, and e.g. WebMention and Micropub have been recommendations even longer than this one.

                                                                1. 3

                                                                  Atom, PubSubHubbub (now WebSub), and Webmention are all standards under various bodies

                                                                  1. 1

                                                                    PubSubHubbub

                                                                    Seeing some silly things they did with regard to best practices I can’t really say I feel bad about this. Things like using GETs instead of POSTs (if memory serves correctly) because of legacy stupid decisions.

                                                                    1. 1

                                                                      Yeah, Webmention was a W3C Recommendation for quite a while now even. I still don’t like how W3C standardized two ways of doing roughly the same thing…

                                                              2. 2

                                                                I think AP is an okay standard (although it, again, underspecifies a lot), but it doesn’t make anything possible that wasn’t already possible with OStatus, or some very simple extensions to it.

                                                                1. 1

                                                                  In what way did you think that ActivityPub did not learn from OStatus?

                                                                  1. 1

                                                                    so many good, deployed solutions to 80%+ of the social networking stuff

                                                                    For example?

                                                                    1. 3

                                                                      friendica, hubzilla, gnu social, pleroma

                                                                      1. 4

                                                                        pleroma

                                                                        Pleroma either currently supports or is very close to fully supporting AP, and was a pretty important goal from the outset.

                                                                        1. 4

                                                                          I know, I wrote it :)

                                                                          1. 2

                                                                            I think I follow you then :) Thanks for writing Pleroma <3

                                                                  1. 5

                                                                    Are there any lightweight ActivityPub implementations that aren’t Mastodon/GNU Social/et al? Every time I try to read the standard, it feels very heavy. I hope it’s not like WebRTC :(

                                                                    1. 10

                                                                      Bridgy Fed is an ActivityPub implementation that translates between Webmention and ActivityPub :)

                                                                      1. 3

                                                                        There’s an implementation report that includes lots of tools: https://activitypub.rocks/implementation-report/

                                                                      1. 2

                                                                        Let’s bring some context:

                                                                        This is a recommendation of the Social Web Working Group, a group that’s behind many specifications like this: https://www.w3.org/Social/WG#Specifications

                                                                        There are standards from both the IndieWeb side (WebMention, Micropub, WebSub) and the ActivityStreams side.

                                                                        The standards may overlap each other but it should be possible for sites and services to support both.

                                                                        1. 5

                                                                          I’m considering paying Pinboard for their web archiving feature, but so far it’s not been a huge pain point.

                                                                          1. 6

                                                                              I use Pinboard’s archiving, but just for articles I’ve read and other things where I’d only be mildly annoyed to lose them; it’s a bit too unreliable for anything else. The archiving time is sporadic – some things get archived in a couple of hours, others can take weeks – and many of my bookmarks say they’re archived but trying to open the archived page just causes an error.

                                                                            I still use it because it’s the only one I’ve found that will archive PDFs and direct links to images. Well, that, and because I paid 5 years in advance.

                                                                            1. 1

                                                                              Thanks for the review. It’s sad they don’t do the archiving at the moment of bookmarking. That feels like the best approach, but maybe they have so many users that reaching the front of the queue takes a week or so?

                                                                              Considering that you don’t think that highly of Pinboard, I’m wondering why you bought five years of service up front.

                                                                              1. 2

                                                                                I already had a standard Pinboard account, grandfathered in from when it was a one-off fee, and had been happy enough with it when I upgraded to an archiving account. My thought process was that I’d pay in advance, have everything archived, and not have to worry about it again for 5 years; I didn’t consider that it would turn out to be less reliable than I’d like.

                                                                            2. 2

                                                                              I pay for it and use it – my only regret is activating it so late, after having added bookmarks for years – by then many, many bookmarks had already vanished. (Thankfully Pinboard lists all such errors and the specific HTTP code that caused them.)

                                                                              1. 1

                                                                                I like that they provide all error and HTTP codes. Are there logs too, so you can actually tell when the page stopped being reachable?

                                                                                1. 2

                                                                                  No, just the error and an option to manually trigger a retry.

                                                                                  It’s added as a machine tag like code:403

                                                                              2. 2

                                                                                I joined Pinboard almost exactly 7 years ago and it has already saved my butt a bunch of times. According to my profile page, about 5% of my bookmarks are dead links at this point.

                                                                                1. 1

                                                                                  That has to be reassuring. They’re not just providing fun statistics, they’re proving their value to you. I hadn’t heard about Pinboard until today. If there were a local client for syncing the archived content locally, I could consider buying the service, but first I’d need to restore my habit of bookmarking, which I somehow lost many years ago.

                                                                                2. 1

                                                                                  Interesting. I guess a bookmark-like service could be built on top of archive.is / web.archive.org. Or maybe such a thing already exists for free.

                                                                                1. 1

                                                                                  Some other examples of the OWFa license:

                                                                                  1. 1

                                                                                    Great to see the work done on the OWFa 7+ years ago, to enable open, reusable specifications, being picked up and used to make things like GraphQL available to all without risking patent-infringement claims from the people behind it.

                                                                                    1. 2

                                                                                      I use https://soverin.net/ – nice to have an email provider within Europe, and one that focuses on privacy and the core feature rather than a million other things.

                                                                                      1. 5

                                                                                        The fact that things like this can happen has long been acknowledged by npm, but not much has happened. See this post from March 2016: http://blog.npmjs.org/post/141702881055/package-install-scripts-vulnerability

                                                                                        Did an RFC for Yarn myself a week ago to try to address these very concerns, by letting one opt in only the modules that actually need to run scripts and ignoring the scripts of the rest: https://github.com/yarnpkg/rfcs/pull/76
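
                                                                                        (For context: npm’s own blunt mitigation today is to disable lifecycle scripts entirely via its `ignore-scripts` setting; the RFC aims to make this opt-in per module instead of all-or-nothing. A minimal `.npmrc` fragment:)

```ini
; .npmrc – disable all install/postinstall lifecycle scripts.
; Blocks this attack vector entirely, at the cost of breaking
; packages that legitimately need a build step (e.g. native addons).
ignore-scripts=true
```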

                                                                                        1. 1

                                                                                          I like this idea of a common interface for tracing – that way, libraries etc. that want to integrate with it don’t have to integrate with every tracing system separately, but can instead just implement the standard interface and leave it to each tracing system to provide compliant libraries.

                                                                                          Seems really easy to get started with something like Jaeger, and it should be possible to write adapters for e.g. AWS X-Ray as well.
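
                                                                                          The pattern is easy to sketch. Below is an illustrative Python sketch with hypothetical names – not the actual OpenTracing API: a library codes against one small tracer interface, and each backend (Jaeger, X-Ray, …) supplies its own implementation.

```python
from abc import ABC, abstractmethod
from contextlib import contextmanager


class Span(ABC):
    """A single timed operation, as seen by the instrumented library."""

    @abstractmethod
    def set_tag(self, key, value): ...

    @abstractmethod
    def finish(self): ...


class Tracer(ABC):
    """The one interface a library needs; backends implement start_span."""

    @abstractmethod
    def start_span(self, operation_name) -> Span: ...

    @contextmanager
    def span(self, operation_name):
        # Convenience wrapper so callers can use `with tracer.span(...)`.
        s = self.start_span(operation_name)
        try:
            yield s
        finally:
            s.finish()


# One possible backend: records finished spans in memory
# (a stand-in for a real exporter like Jaeger or an X-Ray adapter).
class RecordingSpan(Span):
    def __init__(self, name, sink):
        self.name, self.tags, self._sink = name, {}, sink

    def set_tag(self, key, value):
        self.tags[key] = value

    def finish(self):
        self._sink.append((self.name, self.tags))


class RecordingTracer(Tracer):
    def __init__(self):
        self.finished = []

    def start_span(self, operation_name):
        return RecordingSpan(operation_name, self.finished)


# A library only ever sees the Tracer interface, never the backend:
def fetch_user(tracer, user_id):
    with tracer.span("fetch_user") as span:
        span.set_tag("user.id", user_id)
        return {"id": user_id}


tracer = RecordingTracer()
fetch_user(tracer, 42)
print(tracer.finished)  # [('fetch_user', {'user.id': 42})]
```

                                                                                          Swapping backends then means changing only the `Tracer` passed in, not the instrumented code – which is the whole appeal of a common interface.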