1. 31

    All this talk about ethics, open, and free brings another angle to mind: people pushing no-cost licenses are themselves misrepresenting what they’re achieving, at least in the U.S. I used to push for OSS/FOSS in the past. Now I’m switching to hybrids. The reason is that encouraging people to play “give it all away” or “use low-revenue models” in a capitalist country, where opponents of freedom make billions of dollars from their software, shifted all the money (and therefore power) to the latter. They then paid off politicians and used pricey lawyers to win more power against OSS/FOSS in ways OSS/FOSS couldn’t fight without piles of money. This includes the ability to patent/copyright-troll users of open/free software, and especially Oracle’s API ruling, which jeopardizes OSS/FOSS, backwards-compatible implementations of anything that had a proprietary API.

    From what I see, OSS/FOSS have done great things but are a fundamentally flawed model in a capitalist country where money wins. As many people as possible need to be charging by default, both to support contributors and to send money (and therefore power) the other way. They, and FOSS-using companies that don’t depend on patent/copyright money, need to pool money together to fight the legal advances of patent/copyright-trolling companies that want lock-in. Otherwise, in a game where only one side is playing for keeps, the OSS/FOSS groups keep losing, by default, both software freedoms and the ability to enforce their licenses, all while preaching that they’re maintaining them. Seems dishonest. It’s also strange that I almost never read about these issues in FOSS writers’ articles about business models and licensing recommendations.

    As far as hybrids go, I can’t give you the answer yet since it’s too soon. For FOSS, I’m looking at Open Core and dual-licensing with the strongest copyleft available. For non-FOSS, source-available licenses from public-benefit companies and nonprofits chartered to implement most software freedoms for customers, on top of free for non-commercial or certain other uses. These freedoms and their justifications would also be written into licenses and contracts, with huge penalties for non-compliance, as extra layers of defense. Maybe the code expires into FOSS after a certain time passes or a revenue threshold is hit. We need more experimentation that lets companies currently supplying FOSS, or releasing it later, collectively earn millions to hundreds of millions in revenue to fight this battle. Again, it’s not optional: the other side is actively fighting to remove software freedom inch by inch. And mostly winning, despite FOSS organizations’ little victories.

    1. 5

      Apologies if this is a threadjack, but I’m wrestling with these kinds of questions. I’ve been doing open source more or less my whole career, and usually in some form of hybrid.

      Now I’m going out on my own and am searching for a model that makes sense. I like the collaboration of open source, but I also very much want to make money, and don’t want to do that with hallucinogenic business models.

      I’m building a game that has a music synthesizer in it as a core mechanic. What I’m building has basically 3 layers - infrastructure for building such things in Rust, the synthesizer itself (with GUI), and the game logic on top. What I’m converging on is doing the first two layers as very much community open source with permissive licenses, and the third layer as just straight up proprietary software, no pretending to be anything else. There’s stuff to fine-tune around the edges, for example somebody brought up a delayed open release of the game source, after the monetization has run its course, but I don’t want to commit to that right now because it might constrain working with a commercial publisher. If I end up self-publishing, I’ll strongly consider that though, especially if people tell me it helps motivate their contribution.

      1. 3

        I think you should flip your business-model plans for this code around: the infrastructure and the synthesizer are the things that could have enough value to a business that you could do well selling them. The median game does not make a profit, and, as entertainment, games are hit-driven, (usually) one-time purchases not bought based on predictable need.

        1. 4

          I’ve certainly thought about it. But here’s my thinking. First, there’s currently no business for Rust infrastructure, the community is very much organized around permissive licenses. Second, the market for synthesizers and music plugins is pretty crowded, while games in this particular genre are, as far as I can tell, underserved. Third, I think the Switch is a promising platform, and it doesn’t really do free games. Fourth, if the game is a dud but the free music tools catch on, I can always do a pro version, and I get free marketing and market research. Lastly, the game is definitely riskier, but I’m at a point where I’m ok with that; if this stuff doesn’t monetize, I just go back to a corporate job.

        2. 1

          I think it’s pretty obvious that the ‘correct’ way of doing free software games without violating the freedom of users by making anything proprietary is to make all the code free software but not make the art/music/etc. free. After all, software freedom is about software, not about art or music.

          People can modify the software, they can use it as they see fit, but they can’t redistribute it along with the art and music. They can either come up with their own art and music or they can redistribute it without the art and music and it’ll be useful to others that already have the game (because they already have the art and music).

          1. 1

            Your model (bottom two free, top not) was exactly what first popped into my mind when I read the first couple of sentences of your comment, so you’re not the only one who thinks it makes sense, fwiw :-)

            By the way (off-topic question), as someone who has recently bought a midi controller (MPK261) and started playing around with some of the synths that I got free (Hybrid Air, Sonivox), and has a decent mathy ability to understand any given synthesis concept, but absolutely no intuition for what changes will sound like… is your game aimed at me? :-)

            1. 2

              Yes, it’s made for exactly you :) I’ll put you on my list for beta testing.

              1. 1

                OMG that’s fantastic!

            2. -1

              why don’t you want to deal LSD

              1. 1

                Beg your pardon?

                1. 3

                  hallucinogenic business models

                  1. 4

                    Ah, right, right. Basically I want to create value honestly and do a reasonable job of recovering revenue from the value I create, rather than playing these games that seem to increasingly substitute for that these days.

                    1. 1

                      what games are you referring to?

                      1. 2

                        Financial engineering in general, more specifically the kinds of things that startups do when they’re looking for an exit or when their purpose is to burn through VC money rather than make a business. MoviePass, Juicero, that kind of thing.

                        1. 1

                          ah okay. in reading your original comment i thought you were saying it’s hallucinogenic to think you could make a living writing free software.

            3. -5

              This would make sense if these ‘attacks on free software’ actually existed, but they don’t. They literally don’t exist.

              Source available is a violation of user freedom. It’s unacceptable. That’s all there is to it, if you care about user freedom. If you don’t then I feel sorry for you.

              FLOSS doesn’t need ‘hundreds of millions in revenue’ to fight any battle because there is no battle. I don’t know what it is, but there seems to be a recurring thread I see on forums a lot recently: everything is framed as a battle. For an unrelated example, if a game developer makes an unpopular change to their game? It’s a WAR to get them to fix it. No it isn’t. I see the term ‘culture war’ thrown around too. There’s no such thing. Not everything in life is a war or a battle.

              People and groups of people that produce non-free software aren’t at war with people that do.

              1. 7

                “This would make sense if these ‘attacks on free software’ actually existed, but they don’t. They literally don’t exist.”

                “FLOSS doesn’t need ‘hundreds of millions in revenue’ to fight any battle because there is no battle.”

                You missed the war on open source software by Microsoft et al… thwarted largely by IBM saying it would drop a fortune defending Linux, which fits my comment… the DMCA attacks, the patent trolling (Android vendors alone pay billions), the copyright ruling from an expensive case that applies to APIs FOSS often depends on, and so on. Hell, a patent defense alone against one of these big companies gets quoted at about $200,000 on average from expensive law firms. You bet FOSS folks need a fortune if one of these companies wants to use a legal team to destroy them.

                You really don’t see such things much, though. You wonder why, if there’s a threat as I claim. They mostly ignore FOSS developers since they’re (a) free labor that the big companies are monetizing or (b) penniless opponents with weak or non-existent marketing to big spenders. For (b), the common MO is to hit companies building on the product, FOSS or not, for royalties once they’re financially successful. They parasitize them instead of destroying them, investing a slice of that money in lobbyists and courts to ensure they can continue. Alternatively, they use a combination of the lure of money and the threat of market destruction (patent or otherwise) to pressure them into selling the company. Microsoft and IBM have taken entire markets using their dominant positions, patent portfolios, and large[-for-small-player] offers to get acquisitions. It’s rare for someone to stare down both that kind of money and a corporate threat telling them to get lost. So…

                “People and groups of people that produce non-free software aren’t at war with people that do.”

                …the big players wanting to dominate markets, maximize profits from locked-in customers, and eliminate disruptive competition are always at war with folks producing anything that threatens that. Always. It’s never changed, since they have people on top whose bonuses depend on this shit. They’ll also remember companies that were sent to non-existence or limbo by new stuff they didn’t stamp out when they had the chance. Just because you or many FOSS folks aren’t playing doesn’t mean they aren’t. They certainly are. They even have people working 24/7 on Capitol Hill to screw people over. And not just here in the states: they’re represented in international treaty negotiations as well under the I.P. agreements. If you think copyright enforcement isn’t war, or that this is just the U.S., look at the nice, unarmed, law-abiding people who came after Kim Dotcom over U.S.-generated complaints. Those situations illustrate the heights things can go to when it matters to those in power. Better to be the ones with power.

                1. -5

                  You missed the war on open source software by Microsoft et al…

                  Yes, we get it. This happened. A long time ago, now, but it happened. Okay, move on. It’s no longer relevant. The corporate world has long since embraced free software.

                  the DMCA attacks, the patent trolling (Android vendors alone pay billions), the copyright ruling from an expensive case that applies to API’s FOSS often depends on, and so on.

                  Patent trolling is something very specific. Not all companies that own patents and enforce them are automatically patent trolls.

                  Also patent trolls target companies that produce proprietary software just as much as they target companies that produce free software. It has nothing to do with free or proprietary software. They target both, because they target software in general.

                  …the big players wanting to dominate markets, maximize profits from locked-in customers, and eliminate disruptive competition are always at war with folks producing anything that threatens that.

                  No they aren’t.

                  just look at the nice, unarmed, law-abiding people that came after Kim Dotcom

                  If you think Kim Dotcom was law-abiding you have another thing coming mate.

                  1. 7

                    You are very wrong.

                    I am trying to bring some more floss to the public sector and the resistance from the entrenched vendors is a thing.

                    Straight-up bribes are manageable, but lobbying and regulatory capture are the true evil. We actually need floss lobbyists of our own.

                    1. 6

                      You are twisting @nickpsecurity’s words and I do not understand why you are doing this.

                      1. 1

                        No I am not.

                  2. 3

                    You’re right that there isn’t a war as such, but there is definitely a certain kind of dynamic. I think @nickpsecurity is also pointing out (quite correctly) the wider implications of the wealth concentration resulting from the free work that goes into producing free software.

                    One of the issues is that the companies that use free software don’t always comply with the terms of the licence; there are many examples.

                    The problems are obvious unless you stick to a very narrow and dogmatic view. Consequences matter. Just like the ill-considered, short-sighted, damn-the-consequences technological “disruption”, free software is an idea that produced a lot of unintended side effects (and in fact, contributed to the aforementioned “disruption”).

                1. 6

                  GNU Autotools: just kill this horrific pile of garbage with fire. Especially terrible when libtool is used. Related: classic PHK rant.

                  CMake: slightly weird language (at least a real language which is miles ahead of autocraptools), bad documentation.

                  Meson: somewhat inflexible (you can’t even set global options like b_lundef conditionally in the script!) but mostly great.

                  GYP: JSON files with conditions as strings?! Are you serious?

                  Gradle: rather slow and heavy, and the structure/API seems pretty complex.

                  Bazel/Buck/Pants (nearly the same thing): huge mega build systems for multiple languages that take over everything, often with little respect for these languages’ build/package ecosystems. Does anyone outside Googlefacetwitter care about this?

                  Grunt, Rake, many others: good task runners, but they’re not build systems. Do not use them to build.

                  1. 6

                    Related: classic PHK rant.

                    This one is even better since its observations apply to even more FOSS than libtool. It also has some laughable details on that along with the person who wrote libtool apologizing in the comments IIRC.

                    1. 3

                      I recalled that too, but it was David McKenzie of Autoconf who popped up to apologize.

                      1. 1

                        Oh OK. Thanks for the correction. At least one owned up to their mess. :)

                    2. 3

                      FWIW: bazelbuckpants seem to be written for the proprietary software world: a place where people are hesitant to depend on open-source dependencies in general, and people have a real fear (maybe fear is strong, but still) of their dependencies and environment breaking their build. I use them when I’m consulting, because I can be relatively certain that the build will be exactly the same in a year or so and I don’t like having to fix compilation errors in software I wrote a year ago.

                      1. 2

                        I’m with you on Grunt, but Rake is actually a build tool, with Make-style rules and recipes for building and rebuilding files when their dependencies change. There’s a case to be made that Rake is just Make ported to Ruby syntax. It’s just more commonly used as a basic task runner.

                        https://ruby.github.io/rake/doc/rakefile_rdoc.html

                        1. 1

                          I think Make is also somewhat close to a task runner. It has dependencies, but not much else. You write compiler invocations manually…

                          1. 1

                            It sort of has default rules for building a number of languages, though these aren’t terribly helpful anymore.

                            I also use Make as task runner. Mostly to execute the actual build system, because everybody knows how to run make and most relevant systems probably have Make installed in one form or another.
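                            To make that concrete, here is a minimal sketch of the task-runner pattern; the target names and echoed messages are invented for the example, and a throwaway directory stands in for a real project:

                            ```shell
                            # Write a tiny Makefile whose phony targets just delegate to other
                            # commands; plain echo stands in for the real build system here.
                            tmp=$(mktemp -d) && cd "$tmp"
                            printf '.PHONY: build test\nbuild:\n\t@echo delegating to the real build system\ntest: build\n\t@echo running the tests\n' > Makefile
                            make test   # runs build first, then test, since test depends on build
                            ```

                            Because test lists build as a prerequisite, make test prints the build message before the test message, which is most of what a task runner needs.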

                        2. 1

                          We use Pants here at Square, in our Java monorepo. It works quite nicely, actually. For our Go monorepo, we just use standard Go tooling, but I’ve volunteered to convert to Pants if anyone can get everyone to move to a single monorepo. They won’t, because every Rails project has its own repo, and the Rails folks like it that way.

                        1. 10

                          For the same observation but accompanied by investigation instead of ranting, watch George Tankersley’s GopherCon 2017 lightning talk: https://www.youtube.com/watch?v=7y2LhWm04FU&list=PL2ntRZ1ySWBfhRZj3BDOrKdHzoafHsKHU&index=11

                          1. 4

                            To be fair, the article on lemire.me did not seem like ranting. And I’m a plush-gopher-on-my-desk fan of Go :-)

                          1. 2

                            Anyone know how this differs from jsonnet? http://jsonnet.org/

                            1. 1

                              For an organized list of links to videos and slides, see https://github.com/gophercon/2017-talks

                              1. 5

                                I posted a long response as a comment on the blog. The tl;dr is that this is a very naive use of a monorepo.

                                1. 6

                                  Right, you can’t just check in all your code into a monorepo without any tooling or processes and expect things to turn out well. Just like how teams can’t each check their code into individual microrepos without any tooling or processes and expect things to turn out well.

                                  For some reason people always think version control is a silver bullet that can fix complexities of scale.

                                  1. 12

                                    You wouldn’t believe the kinds of things you get to see as a version control consultant. For instance: 50 branches, one per dev, kept open and unmerged for a year “because people kept breaking the build on trunk and got in each other’s way”. Then it was time for a release. Then they called us in. Not much to salvage…

                                    1. 2

                                      Oh dear.

                                      I had a boss (the shop was using SVN at the time) who was amused at my eagerness to go branch off.

                                      “The branching is easy”, he’d said, “Tell me what you think when it’s time to merge.”

                                      git, for all of its faults, is pretty good at that.

                                      SVN is still nice for certain scenarios that aren’t heavily branched, though.

                                      1. 2

                                        It’s certainly possible with SVN, I’ve seen it done. Will for quality and discipline can go a long way.

                                        1. 2

                                          My recollection of my experience with svn merge was that the problem with it was just that it was arcane; buggy (e.g., IIRC, if you try creating one branch in which there is a file called ‘foo’ and another in which there is a directory called ‘foo’, then misery happens); and super violent, because it immediately does whatever idiot thing I asked for (*) on the shared repo instead of in some kind of safe sandbox like a local working copy or local clone of the repo.

                                          (* assuming that some non-zero fraction of the time I get the revision range or something else in the merge command wrong on the first attempt)

                                          I have had easy, painless branch merges with SVN. “Will for quality and discipline” had nothing to do with it; “using svn diff and patch to apply and edit patches manually instead” had everything to do with it.

                                          git-svn, despite being a bit evil, was actually really useful because it let you try your merges locally and then examine the result before committing them.

                                          1. 2

                                            (e.g. IIRC if you try creating a branch in which there is a file called ‘foo’ and another in which there is a directory called ‘foo’ then misery happens)

                                            Well, what do other systems do? Merging such nonsense is always a pain especially if it happens at scale of dozens of items, even in git/hg.

                                            SVN might get better at this, eventually. At the moment, this is still being worked on. Subversion 1.10 will better handle cases like this where the node kind remains the same (file vs file, dir vs dir). Your scenario is still out of scope for that release, unfortunately – pay me good money and eventually I might fix that, too. But I have done more than enough work on this complex problem in my spare time while pretty much everyone else on the planet is just complaining from the peanut gallery.

                                            1. 1

                                              But I have done more than enough work on this complex problem in my spare time while pretty much everyone else on the planet is just complaining from the peanut gallery.

                                              I don’t want to discourage or criticise you. I’m relating things that have happened to me personally, not calling your considerable abilities and hard work into question. My honest appraisal at this point is that:

                                              • svn is a very high quality centralised VCS. It is enormously better than the competition that existed at the time it was created (though I’ve never had an opportunity to try p4 for comparison).
                                              • I would still strongly recommend svn over any DVCS for doing version control of large binary files like film or game assets such as footage, meshes and textures.
                                              • DVCS systems are, for source code, just a better model up until you hit scaling limits (which only mega-rich companies like Google, Facebook and Microsoft realistically ever do).
                                              • svn now has way better merging than it used to. svn version 1.6 era merging was so bad that it caused an entire generation of developers to be scared of branching.

                                              Well, what do other systems do? Merging such nonsense is always a pain especially if it happens at scale of dozens of items, even in git/hg.

                                              They certainly don’t succeed at doing a merge when you do this; as you rightly point out there exists no sensible merge for this.

                                              What all the DVCSes I’ve used did was hand me a working tree with both sides written into different files and then ask me to resolve the conflict.

                                              IME svn used to, upon attempting this merge, put my checkout into a completely broken state, in which none of the usual operations worked any more, and which the available documentation didn’t explain very well how to get out of. For example, svn revert used to not work any more in this situation.

                                              OTOH if I did the same with git or darcs, it:

                                              • told me that the merge failed
                                              • put the conflicting files and directories side by side in my working copy, keeping one with the name foo and giving the other a name like foo~HEAD or foo.~0~.
                                              • asked me to resolve this by deciding what I want to do, editing files to get the working tree into the shape I want to end up with, then git add-ing and git commit-ing to complete the merge. Note that git add and git commit are just the ordinary verbs for writing commits in git; this isn’t some kind of unusual weird situation where all the usual tools suddenly stop working and I have to use completely different verbs from a whole different section of the manual to get out of it.
                                              • if I really decided this was a terrible idea and want to back out, I could run git merge --abort to ditch this whole can of worms and put the repo back to the state it was in before attempting the impossible merge. This works reliably. In a rather atypical feat of decent UX, git even tells me about the possibility of doing this at the time the merge failed.
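                                              That workflow can be reproduced end to end in a throwaway repository; the branch names, file contents, and commit messages below are invented for the demo:

                                              ```shell
                                              set -e
                                              tmp=$(mktemp -d) && cd "$tmp"
                                              git init -q
                                              git config user.email demo@example.com
                                              git config user.name demo
                                              base=$(git symbolic-ref --short HEAD)   # default branch name varies
                                              git commit -q --allow-empty -m 'base'

                                              # Branch where foo is a plain file.
                                              git checkout -q -b file-side
                                              echo data > foo
                                              git add foo && git commit -q -m 'foo as file'

                                              # Branch where foo is a directory.
                                              git checkout -q "$base"
                                              git checkout -q -b dir-side
                                              mkdir foo && echo data > foo/bar
                                              git add foo/bar && git commit -q -m 'foo as directory'

                                              # The file/directory clash cannot be merged; git reports a conflict
                                              # and leaves both sides in the working tree.
                                              git merge file-side || echo 'merge failed, as expected'

                                              # Back out and return to the clean pre-merge state.
                                              git merge --abort
                                              git status --porcelain   # prints nothing: the working tree is clean again
                                              ```

                                              If the failed merge is left in place instead of aborted, git status shows the conflicted paths, with the file side kept under a name like foo~file-side.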
                                              1. 1

                                                For example, svn revert used to not work any more in this situation.

                                                This must have been many many years ago. Such basic problems have long been fixed, mostly with the working copy redesign and rewrite that happened with Subversion 1.7 (which is old by now, first released in 2011).

                                                I suppose many people saw some buggy behaviour from the SVN 1.5 days, then switched to something else, and still believe that what they have is actual working knowledge of SVN. It’s not, unless you have been using 1.8 or 1.9.

                                                Granted, there are still many implementation bugs being found (here is a fun and recent one). But it’s not anywhere as horrible as it used to be. Thankfully :)

                                                1. 1

                                                  Subversion 1.7

                                                  That didn’t exist when I started using subversion. I mentioned version 1.6 explicitly by name for this reason. ☺ I think, though I’m not sure, that I might even have started out with some version as early as 1.4.

                                                  I remember having to svn upgrade all the working copies when 1.7 came out! There were a few messes where behaviour differed, plus maybe a couple cases where the upgrade didn’t go smoothly for some reason.

                                                  I do not recall svn merge being noticeably more robust with 1.7 than it was with 1.6.

                                                  I think the company I was working for started moving to git in earnest around the time 1.8 was released.

                                            2. 1

                                              We’re talking organisation-wide level here, not about an assessment of how good svn merge is in detail, or for the individual. It’s not great, that’s known.

                                              git-svn can be a tool for solving that, but that still needs adoption in your org.

                                              1. 2

                                                git-svn can be a tool for solving that, but that still needs adoption in your org.

                                                I have no idea what you mean by this. One of git-svn’s historical advantages was that it could be used by a single person without anyone else knowing or needing to know that someone was using it.

                                                Are we just talking about completely different things? Like you’re talking about the need for an organisation to avoid having long-running branches without at least merging trunk into them every time trunk is committed to, and I’m just talking about the much narrower problem that doing so with only svn merge is painful because svn merge is very imperfect?

                                  1. 1

                                    6 years old already? I remember first reading this in 2013. Did we learn anything?

                                    1. 2

                                      It would be interesting to suggest to Norvig that he write a current update…

                                      1. 1

                                        Sorry. I forgot to add the year it was published.

                                      1. 4

                                        I’m bemused – post Snowden – by the commenters that think this is benign or unlikely to be exploited. I’m pretty sure everything theoretically capable of being exploited has a post-it for each manufacturer slowly moving across a Kanban board somewhere inside the NSA.

                                          1. 5

                                            I was at the first day of this sprint. It was really interesting seeing so many engineers excited about mercurial development. Git may have the mindshare but mercurial is definitely not dead.

                                            1. 2

                                              I would like to strongly encourage y'all to do everything possible to make Hg as easy to use as possible. In particular, the floundering discussion on that etherpad about “friendly Hg” was discouraging. Right now git is (a) the de facto winner, (b) a tire-fire of a codebase, and (c) widely agreed to have terrible porcelain, but nobody has the ability to change it.

                                              Hg is currently “waiting in the wings”, so to speak, and many of us are following closely as Facebook and Google remove obstacles to us switching over to it as a sane, monorepo-scale-capable substitute. Please, please, please keep in mind that the potential future users of Hg vastly outnumber current users. While it’s important not to lose community goodwill by churning things and giving people an expectation of version pain, it’s more important to get things right, and to get good defaults. I’ve watched Emacs users struggle with pain points that weren’t changed 25 years ago “so as not to hurt existing users”… the future is long! :-)

                                              Also, fantastic work - I am waiting for Hg to reach the point where I can just stand it up, and have it work at company-wide-monorepo scale, and then I look forward to subjecting all my coworkers to the Glorious Monorepo :-)

                                              1. 1

                                                The hg community is quite friendly. If you’d be interested in working on friendlyhg and know a bit of Python you should try sending in some patches.

                                              2. 2

                                                On the contrary, mercurial seems to be the darling on the rise, doesn’t it?

                                                1. 2

                                                  In this particular community, yes. Most of us favor simplicity, for our rather suckless-branded version of that, more than we care about being mainstream, and that means Mercurial has an outsized amount of support here. And Google and Facebook, for obvious reasons, are probably seeing Mercurial on the rise. But in general? Nah. Git won. (For now, at least.)

                                                  1. 2

                                                    I think in business around the world too.

                                                    • Academia is one of the big users of Mercurial
                                                    • There are still companies migrating from SVN, and they actually prefer to switch to Mercurial because of its CLI UX and its large-file support, which is far superior to Git's.
                                                  2. 1

                                                    Git probably has less mindshare among the people who care about the design decisions of the VCS they use. While this is a minority niche, it includes (although it is not limited to) most of the people who contribute to VCS development, and Mercurial does well enough in that niche to grow.

                                                  1. 2

                                                    When Facebook first changed their license and patents information on all their projects, they were troublesome. But a lot of people worked to point out the problems, and they very quickly updated them to a new version which was acceptable to most companies.

                                                    1. 1

                                                      I’d be curious to see the graphs if you were actually trying to cool something: like half-fill the cooler with room-temperature bottles of water or cans of soda/beer. Then add (wet/dry) ice, and see what happens.

                                                      1. 4

                                                        Most of the answers seem to be about personal secret management, rather than server secret management.

                                                        I work at Square, so naturally we use KeyWhiz :-) One nice property of KeyWhiz is that the secrets mount with FUSE and look just like files, so in our local dev environments, we can easily set up dev secrets in the same location using real files.
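  The secrets-as-files property is what makes the dev/prod story so clean: application code just reads a path, and only the directory behind that path changes. A minimal sketch of the pattern (the `/secrets` mount point and `db_password` name are hypothetical, not KeyWhiz's actual layout):

  ```python
  import tempfile
  from pathlib import Path

  def read_secret(name: str, base_dir: str = "/secrets") -> str:
      """Read a secret by name. The code is identical whether base_dir is
      a FUSE mount backed by a secret server or a plain local directory
      of dev secrets."""
      return (Path(base_dir) / name).read_text().strip()

  # Dev-environment simulation: a temp directory standing in for the mount.
  with tempfile.TemporaryDirectory() as d:
      (Path(d) / "db_password").write_text("hunter2\n")
      secret = read_secret("db_password", base_dir=d)
  print(secret)  # prints hunter2
  ```

  The payoff is that no secret-management client library leaks into application code, so tests and local dev need nothing but ordinary files.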

                                                        1. 9

                                                          Whenever I post elm articles I usually just tag them with haskell.

                                                          1. 4

                                                            Seems like an argument for a separate tag! :-)

                                                            1. 1

                                                              could argue either way. :)

                                                          1. 4

                                                            How can the code “just happen to be owned by Google”?

                                                            1. 8

                                                              Author works at Google and is using his work computer to work on this project?

                                                              1. 2

                                                                He wouldn’t necessarily have to be using his work computer :(

                                                                1. 1

                                                                  Google claims ownership of work done on personal time with personal resources?

                                                                  That’s incredibly shitty of them, if so.

                                                                  1. 10

                                                                    It’s being done on 20% time, from what I understand.

                                                                    1. 4

                                                                      There’s a process to get the company to formally disclaim ownership of things, but then you’re pretty heavily restricted in terms of when you can work on it. If you don’t care about ownership, just getting an OSS license on something is the simpler path by a wide margin.

                                                                      1. 1

                                                                        If it’s useless enough then the process is easy :-)

                                                                      2. 1

                                                                        Shitty, perhaps, but also not uncommon.

                                                                        1. 2

                                                                          Not uncommon, but I normally associate the practice with companies that don’t “get” Open Source, or why devs might pursue side-projects and what their personal IP means for their careers in general.

                                                                          I wouldn’t normally associate those attitudes with Google. And since a lot of developers refuse to sign agreements signing personal IP over to their employer, I’m surprised to hear Google requires it, given how popular they have been among developers as a “good” employer.

                                                                1. 4

                                                                  This zero-copy approach to serialization is similar to Cap'n Proto, although the latter includes a nice RPC framework. A related comparison blog entry is here: https://capnproto.org/news/2014-06-17-capnproto-flatbuffers-sbe.html

                                                                  I’d be interested if anyone has tried them as well as protobuf3. Does the new protobuf3 more efficiently transfer large byte arrays? I’d like to use something like gRPC for syncing servers but assume protobuf3 will add too much overhead for my messages (large % is just binary blob data).
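  The core difference for large blobs is whether parsing copies the payload or just points into the received buffer. A toy illustration of that distinction with a made-up length-prefixed wire format (this is not actual FlatBuffers or Cap'n Proto encoding, just the zero-copy idea in stdlib Python):

  ```python
  import struct

  # Toy wire format: 4-byte little-endian length prefix, then the blob.
  def encode(blob: bytes) -> bytes:
      return struct.pack("<I", len(blob)) + blob

  def decode_copy(msg: bytes) -> bytes:
      # Copy-based parse: slicing a bytes object allocates a fresh copy
      # of the blob, which is the overhead you pay per message.
      (n,) = struct.unpack_from("<I", msg)
      return msg[4:4 + n]

  def decode_zero_copy(msg: bytes) -> memoryview:
      # Zero-copy access: a memoryview into the original buffer.
      # No bytes are copied until/unless the caller asks for them.
      (n,) = struct.unpack_from("<I", msg)
      return memoryview(msg)[4:4 + n]

  msg = encode(b"x" * (10 * 1024 * 1024))  # 10 MiB blob
  copied = decode_copy(msg)
  view = decode_zero_copy(msg)
  assert copied == view.tobytes()
  ```

  For messages that are mostly one big binary field, the copy in `decode_copy` roughly doubles memory traffic per message, which is the overhead concern with a protobuf-style `bytes` field.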

                                                                  1. 2

                                                                    Agreed. I would love to see a nice up-to-date comparison of flatbuffers, protobuf3, and capnproto.

                                                                    1. 1

                                                                      proto3 should have roughly the same size as proto2. But you can always compress… although, looking through the grpc site, it’s surprisingly difficult to find a succinct description that includes whether compression is available/on-by-default/not-implemented-yet. Anyone know?
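  Even if transport-level compression turns out to be unavailable or off by default, compressing the serialized payload yourself is always an option. A minimal stdlib sketch (the all-zeros payload is an artificially compressible stand-in; real blob data may compress far less well):

  ```python
  import gzip

  # Stand-in for a serialized message dominated by a large binary field.
  payload = b"\x00" * 1_000_000

  compressed = gzip.compress(payload)
  restored = gzip.decompress(compressed)
  assert restored == payload
  # For incompressible data (already-compressed media, ciphertext),
  # this step is wasted CPU, so measure on representative blobs first.
  ```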

                                                                    1. 22

                                                                      I was hoping for a more cogent critique of React: I think an informed discussion of its weaknesses would be useful. This article failed to deliver: The main criticism, as already mentioned, falls down: there’s nothing to stop you rolling up chunks of DOM into custom Web Components, and then still using React. Other than that, it mostly says React’s design is “bad” and that the event model is nothing new. Hmmm.

                                                                      1. 1

                                                                        Behind this boring-sounding title was a surprisingly good article: no new information, but collected details of just how advanced the NSA’s capabilities are.

                                                                        1. 3

                                                                          I can’t stand articles like this that paraphrase a source without linking to it: http://arxiv.org/abs/1502.03182