1. 5

    What I don’t really understand is how Andrew has a comfortable standard of living in NYC on $600 per month.


    I’m guessing that there must be another source of Zig donations aside from Patreon?

    1. 7


      1. 2

        Oh whoops, I misread the first paragraph; I thought it said that Zig was supporting him entirely, when it’s actually about his programming supporting him.

        1. 3

          Note that this isn’t his first attempt at doing this, but the project he was working on before, Genesis, didn’t find the same traction that Zig has. If I recall correctly, he also didn’t live in NYC the last time… Anyway, he’s got experience with living frugally, so I’m sure he knows what he’s doing here.

          1. 2

            He extrapolated the donation growth against his savings.

        2. 2

          What I don’t understand is: if he’s not working in NYC anymore, and is only working on his own projects and getting donations, why doesn’t he move somewhere other than NYC to minimise his personal expenses?

          I’m sure there are cities in the US with 80% of the fun of NYC at lower than 80% of the cost.

          1. 17

            I work remote, and there are places I could move that are < 20% of the cost.

            My friends aren’t going to move with me, and I have enough money to live where I am. Why be wealthy and lonely?

            1. -10

              Didn’t know your city is the only source of friends in the world. That must be good for the economy.

              1. 32

                I know that this is very hard for some people to believe (seems to be harder the more western the society is), but some people don’t consider their friends a replaceable commodity. Not that I don’t want to make new friends, but these are my friends right now and I am more loyal to them than I am to a meaningless job or to money.

                1. 4

                  Maybe because your partner has a job he/she really enjoys in this city? I mean, we’re lucky in our field to have a lot of different options, remote or not, mostly well paid. Let’s not forget that it’s a privilege and not something everybody has.

              2. 2

                The usual reason is the significant other.

                1. 1

                  There’s a shit-ton of them. Even Memphis, TN, which is close to me, has a low cost of living with all kinds of fun stuff to do, despite all its problems. Just don’t live in or shop around the hood; that solves most of the problems if you don’t have kids going to school or college.

                  There are plenty of cities in the US that similarly have a low cost of living with plenty going on. One can also live 30-40 minutes outside a city to substantially reduce rent. The fun stuff still isn’t that far away; the slight inconvenience just knocks quite a bit off the price.

                  1. 4

                    I don’t remember the details, and I can’t find the link, but a few years ago someone did some research here in Berlin where they compared the cost of rent in more-or-less the city proper, and the cost of rent + public transportation tickets when you lived in the outskirts. It ended up being not much of a difference.

                    1. 2

                      Well, if you don’t work in the city and don’t need to commute, then you spend even less. Though OTOH, you get tax deductions for commutes in Germany, so the commute is probably not that expensive to begin with.

                      1. 2

                        Berlin is currently the city with the highest rent increases world-wide, and a few years ago rents there were unusually low.

                        Also, Berlin is hard to compare in many respects, possibly because of its very unique city history.

                1. 2

                  Anyone got any idea how financially stable GitLab is?

                  1. 3

                    My research found it’s a YC-backed company aiming for a 2020 IPO. It took tens of millions in VC funding, probably because they were spending more than they were earning. They’re probably not stable, specifically because the strategy requires them to adapt and take whatever chances they need to reach that IPO. If they IPO, they’ll probably stabilize a bit, focusing on recurring profit.

                    Although their numbers are private, they did have this neat page where they describe goals and results of various positions over time. Might have some educational value for how these jobs work in a fast-moving startup with products like this.

                    1. 2

                      Interesting document, thanks. And I didn’t know they entered the YC circus (in 2015).

                  1. 13

                    I really hate browser notifications. I never click yes, ever. It feels like preventing browsers from going down this hole is just yet another hack. The spammers and the CAPTCHAers are fighting a continuous war, all because of the 2% of people who actually click on spam.

                    1. 7

                      I’m amazed there is no “deny all” setting for this.

                      1. 5

                        My Firefox has that in the settings somewhere:

                        [X] Block new requests asking to allow notifications

                        This will prevent any websites not listed above from requesting permission to send notifications. Blocking notifications may break some website features.

                        help links here: https://support.mozilla.org/en-US/kb/push-notifications-firefox?as=u&utm_source=inproduct

                        1. 2

                          Did anyone find the about:config setting for this, to put in one’s user.js? I am aware of dom.webnotifications.enabled, but I don’t want to disable notifications completely because there are 3 websites whose notifications I want.

                          1. 3

                            permissions.default.desktop-notification = 2
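
                            If I read Mozilla’s support notes right, the same value goes into user.js like this (0 = always ask, 1 = allow, 2 = block):

                                user_pref("permissions.default.desktop-notification", 2);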

                        2. 1

                          There has always been one in Chrome and Safari, and since very recently there’s also one in Firefox. It’s the first thing I turn off whenever I configure a new browser. I can’t possibly think of anybody actually actively wanting notifications to be delivered to them.

                          Sure, there are some web apps like Gmail, but even there I’d rather use a native app for this.

                          1. 3

                            I can’t possibly think of anybody actually actively wanting notifications to be delivered to them.

                            Users of web-based chat software. I primarily use native apps for that, but occasionally I need to use a chat system that I don’t want to bother installing locally. And it’s nice to have a web backup for when the native app breaks. (I’m looking at you, HipChat for Windows.)

                        3. 5

                          There is a default deny option in Chrome; it takes a little digging to find, though. But I agree that it’s crazy how widespread sites trying to use notifications are. There are like 1 or 2 sites I actually want them from, but it seems like every single news site and random blog wants to be able to send notifications. And they usually ask immediately upon loading the page, before you’ve even read the article, much less clicked something about wanting to be notified of future posts.

                          1. 1

                            The only time I have clicked “yes” for notifications is for forums (Discourse only, at this point) that offer notifications of replies and DMs. I don’t see a need for any other websites to notify me.

                          1. 4

                            Oh man, the memories. Great video, thanks for sharing – I doubt I would have run across it otherwise.

                            1. 2

                              tl;dw, what is it about? A documentary about the game?

                              1. 6

                                It’s really an overview of adventure games during the late 80s through the 90s all the way to today - with a focus on the Monkey Island games and SCUMM-built games (as well as their competitors). If you love that style of adventure game you’d probably get a lot out of this video.

                                1. 2

                                  Ah, alright! Thanks! I’ll take a look later when I have time.

                                  1. 1

                                    I watched the one about Quake from the same channel. It was nicely done, but a bit long and slightly repetitive. I’ll keep this one for later, since I still want to play Monkey Island unspoiled one day :)

                              1. 1

                                Nice. Guess you need quite a bit of traffic before using this becomes necessary.

                                1. 10

                                  A gpg-encrypted file somewhere, with a simple grep script for when I need a password, and a vim plugin for editing gpg files when I need to add/update something.

                                  1. 1

                                    I do something similar, but I couldn’t wrap my head around gpg yet: I have a 2GB pendrive that I always mount to /mnt/key and have /mnt/key/ssh, /mnt/key/pop3… files encrypted with enchive. It has an --agent flag to make it act as an ssh-agent.

                                  1. 5

                                    Bonus points for using Python to get fields out of a Perl file :)

                                    1. 3

                                      btw, the perl equivalent of the python (kinda):

                                      grep -i apl /usr/share/perl/5.26.2/unicore/Name.pl | \
                                         perl -C -ne 'print chr(hex((split " ", $_, 2)[0])), " "'

                                      (since you obviously have perl installed already)

                                    1. 3

                                      None. I find the talks usually too slow, and I’m rubbish at the networking part.

                                      Hmmm, maybe those two things are related…

                                      1. 3

                                        The real crisis will come when they decide to rewrite it in a “modern” language. Programming languages are overrated.

                                        1. 4

                                          Do you mean that in the sense that all languages are equal in terms of their ability to solve a problem, so the rewrite is unnecessary?

                                          I agree that it would be a catastrophe if people decided to just rewrite whole systems, but not because all languages are equal; rather because rewrites are technically very challenging (recent examples include outages at RBS and TSB when they did big-bang rewrites and deployments of major systems).

                                          1. 1

                                            COBOL is probably a bad language by most criteria you can come up with. And yet… has that mattered that much?

                                            1. 2

                                              Has that mattered that much?

                                              This question is really hard to answer and probably doesn’t have a single answer. I’ve worked on enough projects that were objective failures, but human nature makes us turn them into a success in some way. Maybe if the software had been kept more up to date, my bank wouldn’t have maintenance windows on Sundays where I cannot do any transactions? Or maybe if the software had been implemented in a more expressive language, it would be less code and so easier to maintain. But then you can always argue that when this software was written, COBOL was more or less the only game in town, so would you rather not have the banking software at all?

                                              So, I don’t know if it mattered that much, and I don’t know if anyone does. That’s why I don’t think it’s that interesting a statement to make. I do think advising caution if people want to migrate away from it is an interesting discussion to have, though.

                                          2. 1

                                            Do you think runtime environments are also overrated?

                                            1. 1

                                              I have no idea what a “COBOL runtime environment” looks like, to be honest.

                                          1. 5

                                            A few months ago, in a crowded subway car on the T in Boston, I had my laptop open, working on some hobby C++ code. Someone sat next to me, noticed my code, and exclaimed “Is that C? I didn’t know anyone used C any more”.

                                            1. 4

                                              What did you answer? It’s a tricky question.

                                              1. 3

                                                I think I said “Yes, some people still do” and left it at that.

                                                1. 3

                                                  That was nice of you. I’d say, “Well, it depends on what they want to accomplish. What language were you using most?”

                                                  (Mentions one written in C/C++.)

                                                  (Uses that as example to explain why people use it.)

                                                  I like to have just a little fun with those situations.

                                            1. 16


                                              • in 2004, Apple, Mozilla and Opera were becoming increasingly concerned about the W3C’s direction with XHTML, its lack of interest in HTML, and its apparent disregard for the needs of real-world web developers, and created the WHATWG as a way to get control over web standards
                                              • they threw away a whole stack of powerful web technologies (XHTML, XSLT…) whose purpose was to make the web both machine-readable and useful to humans
                                              • they invented Living Standards, a sort of ex-post standard: always-evolving documents, unstable by design, designed by their hands-on committee, that no one else can really implement fully, to establish a dynamic oligopoly
                                              • in 2017, Google and Microsoft joined the WHATWG to form a Steering Group for “improving web standards”
                                              • meanwhile, the W3C realized that its core business is not to help lobbies spread broken DRM technologies, and started working on a new version of the DOM API
                                              • in 2018, after months of political negotiations, they proposed to move the working draft to a recommendation
                                              • in 2018, Google, Microsoft, Apple and Mozilla felt offended by this lack of lip service.

                                              It’s worth noting that both these groups have their center in the USA, but their decisions affect the whole world.

                                              So we could further summarize that we have two groups, one controlled by US lobbies and the other controlled by the most powerful companies in the world, fighting for control of the most important infrastructure on the planet.

                                              Under Trump’s Presidency.

                                              Take this, science fiction! :-D

                                              1. 27

                                                This is somewhat disingenuous. A web browser’s HTML parser needs to be compatible with the existing web, but the W3C’s HTML4 specification couldn’t be used to build a web-compatible HTML parser, so reverse engineering was required for an independent implementation. With the WHATWG’s HTML5 specification, for the first time in history, web-compatible HTML parsing got specified, with its adoption agency algorithm and all. This was a great achievement in standard writing.

                                                Servo is a beneficiary of this work. Servo’s HTML parser was written directly from the specification without any reverse engineering, and it worked! Contrary to your implication, the WHATWG lowered the barrier to entry for independent implementations of the web. Servo is struggling with CSS because CSS is still ill-specified in the manner of HTML4. For example, the only reasonable specification of table layout is an unofficial draft: https://dbaron.org/css/intrinsic/ For a laugh, count the number of times “does not specify” appears in CSS2’s table chapter.

                                                1. 4

                                                  You say backwards compatibility is necessary, and yet Google managed to get all major sites to adopt AMP in a matter of months. AMP has even stricter validation rules than XHTML.

                                                  XHTML could have easily been successful if it hadn’t been torpedoed by the WHATWG.

                                                  1. 15

                                                    That has nothing to do with the AMP technology, but with Google providing a CDN and preloading (i.e., IMHO abusing their market position)

                                                    1. -1

                                                      abusing their market position

                                                      Who? Google? The web AI champion?

                                                      No… they do no evil… they just want to protect their web!

                                                  2. 2

                                                    Disingenuous? Me? Really? :-D

                                                    Who was in the working group that wrote CSS2 specification?

                                                    I bet a coffee that each of those “does not specify” was the outcome of a political compromise.

                                                     But again, beyond the technical stuff, don’t you see a huge geopolitical issue?

                                                  3. 15

                                                    This is an interesting interpretation, but I’d call it incorrect.

                                                    • the reason to create whatwg wasn’t about control
                                                    • XHTML had little traction, because of developers
                                                    • html5 (a whatwg standard fwiw) was the first meaningful HTML spec because it actually finally explained how to parse it
                                                    • w3c didn’t “start working on a new Dom”. They copy/backport changes from whatwg hoping to provide stable releases for living standards
                                                    • this has nothing to do with DRM (or EME). Those are completely different people!
                                                    • this isn’t about lobby groups, nor is this about influencing politics in the US or anywhere.

                                                    I’m not speaking on behalf of my function in the w3c working group I’m in, nor for Mozilla. But those positions provided me with the understanding and background information to post this comment.

                                                    1. 8

                                                      XHTML had little traction, because of developers

                                                      I remember that in the early 2000s everyone started to write <br/> instead of <br>, and it was considered cool and modern. There were 80x15 badges everywhere saying the website was in XHTML. My Motorola C380 phone supported WAP and some XHTML websites, but not regular HTML, in its built-in browser. So I had the impression that XHTML was very popular.

                                                      1. 6

                                                        XHTML made testing much easier. For me it changed many tests from using regexps (qr#<title>foo</title>#) to using any old XML parser and XPath.
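
                                                        As a sketch of that approach (the sample page and element names are invented for illustration), Python’s stdlib XML parser is all you need once the markup is well-formed:

```python
import xml.etree.ElementTree as ET

# A well-formed (X)HTML page; with tag soup, this parse would simply fail.
page = "<html><head><title>foo</title></head><body><p>hi</p></body></html>"

root = ET.fromstring(page)
# ElementTree supports a useful subset of XPath, so no regexes are needed.
assert root.findtext("./head/title") == "foo"
assert root.findtext(".//p") == "hi"
```

                                                        The same queries keep working however the source formatting changes, which is exactly what made these tests more robust than regexps.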

                                                        1. 3

                                                          Agreed. Worth noting that, after the html5 parsing algorithm was fully specified and libraries like html5lib became available, it became possible to apply exactly the same approach, with html5 parsers outputting a DOM structure that is then queried with XPath expressions.

                                                      2. -1

                                                        This is an interesting interpretation, but I’d call it incorrect.

                                                        You’re welcome to. But given your arguments, I still stand by my political interpretation.

                                                        the reason to create whatwg wasn’t about control

                                                        I was 24 back then, and my reaction was “What? Why?”.

                                                        My boss commented: “wrong question. You should ask: who?”

                                                        XHTML had little traction, because of developers

                                                        Are you sure?

                                                        I wrote several web sites back then using XML, XSLT and XInclude server-side to produce XHTML and CSS.

                                                        It was a great technological stack for distributing content over the web.

                                                        w3c didn’t “start working on a new Dom”. They copy/backport changes from whatwg hoping to provide stable releases for living standards

                                                        Well, had I written a technical document about an alternative DOM for the whole planet, without anyone asking me to, I would be glad if the W3C had taken my work into account!

                                                        In what other way could they NOT waste the WHATWG’s hard work?
                                                        Well, except by saying: “guys, from now on do whatever Google, Apple, Microsoft and a few other companies from Silicon Valley tell you to do”.

                                                        But I do not want to take the W3C’s side: to me, they lost their technical authority with EME (different group, but same organisation).

                                                        The technical point is that we need stable, well-thought-out standards. What you call living standards are… working drafts?

                                                        The political point is that no oligopoly should be in condition to dictate the architecture of the web to the world.

                                                        And this, you know, in a state where strong cryptography is classified as munitions and subject to export restrictions.

                                                        I’m not speaking on behalf of my function in the w3c working group I’m in, nor for Mozilla. But those positions provided me with the understanding and background information to post this comment.

                                                        I have no doubt about your good faith.

                                                        But probably your idealism is fooling you.

                                                        If you try to see these facts from a wider perspective, you will see the problem I describe.

                                                      3. 4

                                                        XHTML was fairly clearly a mistake and unworkable in the real world, as shown by how many nominally XHTML sites weren’t, and didn’t validate as XHTML if you forced them to be treated as such. In an ideal world where everyone used tools that always created 100% correct XHTML, maybe it would have worked out, but in this one it didn’t; there are too many people generating too much content in too many sloppy ways for draconian error handling to work well. The whole situation was not helped by the content-type issue, where if you served your ‘XHTML’ as anything other than application/xhtml+xml it wasn’t interpreted as XHTML by browsers (instead it was HTML tag soup). One result was that you could have non-validating ‘XHTML’ that still displayed in browsers because they weren’t interpreting it as XHTML and thus weren’t using strict error handling.

                                                        (This fact is vividly illustrated through syndication feeds and syndication feed handlers. In theory all syndication feed formats are strict and one of them is strongly XML based, so all syndication feeds should validate and you should be able to consume them with a strictly validating parser. In practice plenty of syndication feeds do not validate and anyone who wants to write a widely usable syndication feed parser that people will like cannot insist on strict error handling.)
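
                                                        Both error-handling philosophies are easy to demonstrate with Python’s standard library (a sketch; the sample markup is invented, and real XHTML would also carry a namespace):

```python
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

tag_soup = "<html><body><p>unclosed paragraph<br></body></html>"

# The XHTML model: draconian error handling, so invalid markup is rejected.
try:
    ET.fromstring(tag_soup)
    parsed_as_xml = True
except ET.ParseError:
    parsed_as_xml = False

# The tag-soup model: a lenient parser keeps whatever it can make sense of.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(tag_soup)

print(parsed_as_xml)   # False: the XML parser refuses the page outright
print(collector.tags)  # ['html', 'body', 'p', 'br']
```

                                                        An XHTML browser had to behave like the first parser and show an error page; an HTML browser behaves like the second and renders what it salvaged.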

                                                        1. 2

                                                          there are too many people generating too much content in too many sloppy ways for draconian error handling to work well.

                                                          I do remember this argument being pretty popular back then, but I have never understood why.

                                                          I had no issue generating XHTML Strict pages from user content. This real-world company had a couple hundred customers with pretty varied needs (from e-commerce to online magazines to institutional web sites) and thousands of daily visitors.

                                                          We used XHTML and CSS to distribute highly accessible content, and we had pretty good results with a prototype based on XSL-FO.

                                                          To me, back then, the appeal to real-world issues seemed like a pretext. We literally had no issues. The issues I remember were all from IE.

                                                          You are right that a lot of mediocre software was unable to produce proper XHTML. But is that an argument?

                                                          Do not fix the software, let’s break the specifications!

                                                          It seems a little childish!

                                                          XHTML was not perfect, but it was the right direction.

                                                          Look at what we have now instead: unparsable content, hundreds of incompatible JavaScript frameworks, subtle bugs, Bootstrap everywhere (aka much less creativity) and so on.

                                                          Who gains the most from this unstructured complexity?

                                                          The same people who now propose the ultimate lock-in: WebAssembly.

                                                          Seeing linux running inside the browser is not funny anymore.

                                                          Catering to incompetent developers was not democratization of the web; it was technological populism.

                                                          1. 2

                                                            What is possible does not matter; what matters is what actually happens in the real world. With XHTML, the answer is clear. Quite a lot of people spent years pushing XHTML as the way of the future on the web, enough people listened to them to generate a fair amount of ‘XHTML’, and almost none of it was valid and most of it was not being served as XHTML (which conveniently hid this invalidity).

                                                            Pragmatically, you can still write XHTML today. What you can’t do is force other people to write XHTML. The collective browser world has decided that people won’t be forced into XHTML by freezing the development of all other HTML standards so that XHTML is the only way forward and desirable new features appear only in XHTML. The philosophical reason for this decision is pretty clear; browsers ultimately serve users, and in the real world users are clearly not well served by a focus on fully valid XHTML only.

                                                            (Users don’t care about validation, they care about seeing web pages, because seeing web pages is their goal. Preventing them from seeing web pages is not serving them well, and draconian XHTML error handling was thus always an unstable situation.)

                                                            That the W3C has stopped developing XHTML and related standards is simply acknowledging this reality. There always have been and always will be a great deal of tag soup web pages and far fewer pages that validate, especially reliably (in XHTML or anything else). Handling these tag soup web pages is the reality of the web.

                                                            (HTML5 is a step forward for handling tag soup because for the first time it standardizes how to handle errors, so that browsers will theoretically be consistent in the face of them. XHTML could never be this step forward because its entire premise was that invalid web pages wouldn’t exist and if they did exist, browsers would refuse to show them.)

                                                            1. 0

                                                              Users don’t care about validation, they care about seeing web pages, because seeing web pages is their goal.

                                                              Users do not care about the quality of concrete because having a home is their goal.
                                                              There will always be incompetent architects, thus let them work their way so that people get what they want.

                                                              Users do not care about car safety because what they want is to move from point A to point B.
                                                              There will always be incompetent manufacturers, thus let them work their way so that people get what they want.

                                                              That’s not how engineering (should) work.

                                                              Was XHTML flawless? No.
                                                              Was it properly understood by the average web developer that most companies like to hire? No.

                                                              Was it possible to improve it? Yes. Was it better than the current JavaScript-driven mess? Yes!

                                                              The collective browser world has decided…

                                                              Collective browser world? ROTFL!

                                                              There’s a huge number of browser implementors that nobody consulted.

                                                              Among others, in 2004 the most widely used browser, IE, did not join the WHATWG.

                                                              Why did WHATWG not use the IE design, if the goal was to free developers from the burden of well-designed tools?

                                                              Why have we faced incompatibilities between browsers for years?

                                                              WHATWG was turned into one of the weapons in a commercial war for the control of the web.

                                                              Microsoft lost that war.

                                                              As always, the winners write the history that everybody knows and celebrates.

                                                              But anyone old enough to remember the facts can see the hypocrisy of these manoeuvres pretty well.

                                                              There was no technical reason to throw away XHTML. The reasons were political and economical.

                                                              How can you sell Ads if a tool can easily remove them from the XHTML code? How can you sell API access to data, if a program can easily consume the same XHTML that users consume? How can you lock users, if they can consume the web without a browser? Or with a custom one?

                                                              The WHATWG did not serve users’ interests, whatever Mozilla’s intentions were in 2004.

                                                              They served some businesses at the expense of the users and of all the high-quality web companies that didn’t have many issues with XHTML.

                                                              Back then it was possible to disable JavaScript without losing access to the web’s functionality.

                                                              Try it now.

                                                              Back then people were exploring the concept of the semantic web with the passion people now reserve for the latest JS framework.

                                                              I remember experiments with web readers for blind people that could never work with the modern JS-polluted web.

                                                              You are right, W3C abandoned its leadership in the engineering of the web back then.

                                                              But you can’t sell to a web developer bullshit about HTML5.

                                                              Beyond a few new elements and a slightly more structured page (which could have been done in XHTML too), all its exciting innovations were… more JavaScript.

                                                              Users did not gain anything good from this, just less control over contents, more ads, and a huge security hole worldwide.

                                                              Because, you know, when you run JavaScript in Spain that was served to you from a server in the USA, who is responsible for that JavaScript running on your computer? Under which law?

                                                              Do you really think that such legal issues were not taken into account by the browser vendors that fueled this involution of the web?

                                                              I cannot believe they were so incompetent.

                                                              They knew what they were doing, and did it on purpose.

                                                              Not to serve their users. To use those who trusted them.

                                                        2. 0

                                                          The mention of Trump is pure trolling—as you yourself point out, the dispute predates Trump.

                                                          1. 6

                                                            I think it’s more about all of this sounding like a science fiction plot than just taking a jab at the Trump presidency; just a few years ago nobody would have predicted that would have happened. So, no, not pure trolling.

                                                            1. 2

                                                              Fair enough. I’m sorry for the accusation.

                                                              Since the author is critical of Apple/Google/Mozilla here, I took it as a sort of guilt by association attack on them (I don’t mind jabs at Trump), but I see that it probably wasn’t that.

                                                              1. 2

                                                                No problem.

                                                                I didn’t see that possible interpretation, or I wouldn’t have written that line. Sorry.

                                                            2. 3

                                                              After 20 years of Berlusconi, and with our current impasse with the Government, no Italian could ever troll an American about his current President.

                                                              It was not my intention in any way.

                                                              As @olivier said, I was pointing to this surreal situation from an international perspective.

                                                              The USA controls most of the internet: most root DNS servers, the most powerful web companies, the standards of the web, and so on.

                                                              Whatever effect Cambridge Analytica had on the election of Trump, it has shown the world that the internet is a common infrastructure that we have to control and protect together. Just like we should control the production of oxygen and global warming.

                                                              If Cambridge Analytica was able to manipulate US elections (by manipulating Americans), what could Facebook itself do in Italy? Or in Germany?
                                                              Or what could Google do in France?

                                                              The Internet was a DARPA project. We can see it is a military success beyond any expectation.

                                                              I tried to summarize the debacle between W3C and WHATWG with a bit of irony because, in itself, it shows a pretty scary aspect of this infrastructure.

                                                              The fact that a group of companies dares to challenge the W3C (which, at least in theory, is an international organisation) is evidence that they do not feel the need to pretend they are working for everybody.

                                                              They have too much power to care.

                                                              1. 4

                                                                The last point is the crux of the issue: are technologists willing to do the leg work of decentralizing power?

                                                                Because regular people won’t do this. They don’t care. Thus, they should have less say in the issue, though still some, as they are deeply affected by it too.

                                                                1. 0

                                                                  No. Most won’t.

                                                                  Technologists are a wide category that etymologically includes everyone who feels entitled to speak about how to do things.

                                                                  So we have technologists that mislead people into investing in the “blockchain revolution”, technologists that mislead politicians into allowing barely tested AI to kill people on the roads, technologists teaching in universities that neural network computations cannot be explained and thus must be trusted as superhuman oracles… and technologists that classify any criticism of mainstream wisdom as trolling.

                                                                  My hope is in hackers: all over the world they have a better understanding of their political role.

                                                                2. 2

                                                                  If anyone wonders about Berlusconi, Cracked has a great article on him that had me calling Trump a pale imitation of Berlusconi and his exploits. Well, until Trump won the US Presidency, which is a bigger achievement than Berlusconi’s. He did that somewhat by accident, though, and he can’t last 20 years either. I still think Berlusconi has him beat as the biggest scumbag of that type.

                                                                  1. 2

                                                                    Yeah, the article is funny, but Berlusconi was not. Not for Italians.

                                                                    His problems with women did not much impress us, until it became clear most of them were underage.

                                                                    But the damage he did to our laws and (worse) to our public ethics will last for decades.
                                                                    He did not just change the law to help himself: he destroyed most of the legal tools for fighting organized crime, bribery, and corruption.
                                                                    Worse, he helped a whole generation of younger people like him become bold about their cleverness with legal workarounds.

                                                                    I pray for the US and the whole world that Trump is not like him.

                                                            1. 5

                                                              w3c uses github to discuss things?

                                                              1. 3

                                                                Mailing lists, github, meetings, bugzilla.

                                                              1. 2

                                                                I want a nix + openbsd mashup OS so badly. I’m getting more and more paranoid about software, but I can’t leave the nix package manager anymore. Excuse me while I go check the code of patch.

                                                                1. 2

                                                                  I wish FreeBSD had chosen to adopt nix rather than building pkg. For most ports, I can’t imagine mechanically turning them into nix expressions is that hard. Unfortunately, last I played with it, the Nix codebase used a lot of linuxisms. I tried getting it to run on FreeBSD but it required more time than I could afford to put into it. I’m not sure why it requires any linuxisms at all, since it’s just building software.

                                                                  1. 2

                                                                    it’s just building software.

                                                                    … which is also how we ended up with autoconf.

                                                                  2. 2

                                                                    Maybe Nix + ZFS (or similarly-reliable/manageable) + OpenBSD + RAID and ECC + local and remote backups. That should give extra comfort.

                                                                  1. 6

                                                                    That looks fun! Any particular unexpected NES hardware quirks you found?

                                                                    1. 10

                                                                      Definitely! I think it is particularly interesting to see some of the tricks used to work around having so few resources. For instance, there is only 2 KB of CPU RAM and 2 KB of video RAM, with ROM usually on the cartridges. One of the things I found especially clever is the use of “mappers” on the cartridges. Mappers allow memory banks to be switched by the program, so that different memory banks can be accessed at the same memory address by both the CPU and PPU. This allows cartridges to pack more data than would otherwise be addressable by the console, and it is one of the reasons the NES was able to handle more complex and prettier games. Take, for instance, Super Mario Bros. 1 (an older game, with very straightforward cartridge wiring) and Super Mario Bros. 3. They feel like different platforms, partly because SMB3 got away with using a more complex mapper.
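                                                                      A toy model of what bank switching buys you (purely illustrative, not any real mapper chip, though the switchable-window-plus-fixed-last-bank layout resembles UxROM-style boards; all names here are made up):

```python
PRG_BANK_SIZE = 16 * 1024  # 16 KB PRG-ROM banks, common on simple boards


class BankedPrgRom:
    """Toy model of mapper-style bank switching.

    CPU window $8000-$BFFF maps to a switchable bank; $C000-$FFFF is
    fixed to the last bank, so reset/interrupt vectors stay reachable.
    """

    def __init__(self, rom_bytes):
        self.rom = rom_bytes
        self.num_banks = len(rom_bytes) // PRG_BANK_SIZE
        self.selected = 0  # bank currently mapped into the $8000 window

    def write_register(self, value):
        # On real hardware, writing to ROM space latches the bank select.
        self.selected = value % self.num_banks

    def cpu_read(self, addr):
        if 0x8000 <= addr < 0xC000:      # switchable window
            bank, offset = self.selected, addr - 0x8000
        elif 0xC000 <= addr <= 0xFFFF:   # fixed last bank
            bank, offset = self.num_banks - 1, addr - 0xC000
        else:
            raise ValueError("not PRG-ROM space")
        return self.rom[bank * PRG_BANK_SIZE + offset]
```

                                                                      With only 15 addressable bits of ROM space, swapping which bank sits behind the $8000 window is how a cartridge exposes far more data than the CPU could otherwise reach.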

                                                                    1. 2

                                                                      I find CockroachDB very interesting, but haven’t used it yet. Any good/bad/meh experiences here? Does it really distribute as nicely as they claim?

                                                                      1. 5

                                                                        I’ve only tested it experimentally but with some load. The database behaves very nicely once you distribute it over some nodes. The only downside is that it has some minimum latency (around 100 to 200ms but that was a long while ago) so it won’t match postgresql on that (but it can do some good throughput instead).

                                                                        1. 3

                                                                          Was the minimum latency observed in the case of writes? I’d be very curious about the nature of the experimental setup, if you still happen to have the notes from it.

                                                                          1. 2

                                                                            I’m unsure on that but due to the design the latency of reads shouldn’t be as bad as writes (though IIRC from some benchmark numbers, it’s still not low latency). I don’t have the notes anymore, sorry, only tidbits I recall.

                                                                        2. 2

                                                                          Here at LUSH on the Digital team, we are undergoing a global migration to Cockroach, so we can move towards our services being globally available. From what we’ve seen, it does distribute rather nicely - but of course has its caveats.

                                                                          Bit of a plug, but at some point in the future we’ll likely release some posts on our findings with a mass migration to CRDB. \o/

                                                                        1. 5

                                                                          I’ve been doing some go/sqlite insert performance testing with different journaling modes. For grins I’m benchmarking against postgres. I started by testing serial inserts (no concurrency); sqlite can easily beat postgres in this case (with the right settings). I then set up a simple http server to test concurrent inserts; sqlite suffers in this case, as you might expect. I welcome critique of my methodology if anyone wants to take a look:


                                                                          1. 6

                                                                            Increasing the page cache size can help with bigger transactions. And read workloads of course.

                                                                            -- e.g.
                                                                            PRAGMA cache_size = -131072; -- 128mb
                                                                            -- or
                                                                            PRAGMA cache_size = -1048576; -- 1gb

                                                                            If you’re okay with relaxing durability, running with synchronous in NORMAL mode can help with perf.

                                                                            PRAGMA synchronous = NORMAL;

                                                                            It will run a sync on every checkpoint, which by default is every 1000 pages of writes. You can control this on your own by disabling automatic checkpoints and running a separate thread to periodically checkpoint.

                                                                            PRAGMA wal_autocheckpoint=0;
                                                                            -- sync WAL and write back as much data as possible without blocking
                                                                            PRAGMA wal_checkpoint(PASSIVE);
                                                                            -- sync WAL and write back entire WAL even if writers must be blocked
                                                                            PRAGMA wal_checkpoint(TRUNCATE);

                                                                            That’s what I do. Specifically I run a passive checkpoint every second, and switch to running truncate checkpoints every second if the log file exceeds 1gb. The transactions are all pretty small so that doesn’t really happen, it’s there as a failsafe against excessive disk use.
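                                                                            That checkpoint policy could be sketched like this (shown in Python’s stdlib sqlite3 for brevity rather than Go; the 1 GB threshold and the helper name are illustrative, and a real maintenance thread would call this once per second):

```python
import os
import sqlite3

# Above this WAL size, fall back to a blocking TRUNCATE checkpoint.
WAL_LIMIT = 1 << 30  # 1 GB, illustrative


def periodic_checkpoint(conn: sqlite3.Connection, db_path: str) -> None:
    """Run periodically from a maintenance thread.

    PASSIVE syncs the WAL and writes back what it can without blocking
    writers; TRUNCATE blocks writers but resets the WAL file to zero,
    acting as a failsafe against excessive disk use.
    """
    wal_path = db_path + "-wal"
    wal_size = os.path.getsize(wal_path) if os.path.exists(wal_path) else 0
    mode = "TRUNCATE" if wal_size > WAL_LIMIT else "PASSIVE"
    conn.execute(f"PRAGMA wal_checkpoint({mode})")
```

                                                                            Because each checkpoint syncs the WAL, this bounds data loss under synchronous=NORMAL to roughly one checkpoint interval.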

                                                                            Running a checkpoint every second ensures that the WAL will be fsynced every second, which might not otherwise happen in synchronous=NORMAL mode with pagecount-based automatic checkpoints. If you really want to retain every write then you should run synchronous=FULL, but honestly this is a pipe dream for normal use cases. You can’t rely on that guarantee without undertaking significant effort to ensure your hardware will actually respect the sync in the case of a power loss. That means battery-backed storage, monitoring the status of your storage controllers and disks, and stopping your application at any sign of trouble. I think it’s easier to block at the application level during important transactions until the change is visible on a geo-redundant replica.

                                                                            More application authors need to be aware that their data isn’t likely to be durable unless they take great pains to ensure it actually is. Loads of people assume SQL database = durable, when in reality 1 second of data loss maximum is probably a lot better than most people have. And for most use cases complete durability isn’t even a valuable feature. If a tweet gets lost, or a match win doesn’t increase your ELO, it doesn’t really matter. You probably have worse bugs. As long as your database is consistent after a power loss, you probably won’t even notice 1 second of data loss.

                                                                            So there’s my soapbox rant about why you shouldn’t bother running with synchronous=FULL if it’s going to hurt your perf.

                                                                            1. 2

                                                                              Thanks for the feedback, for concurrent writes, to start I’m just trying to avoid failures due to the database being locked. So I’m trying to “serialize” writes in the http version with a goroutine. I figured putting all writes in a goroutine (blocking, unbuffered) using a single “write” database connection would ensure all writes are serial, but that doesn’t appear to be the case, as I’m still getting database locked errors.

                                                                              I’m new to go, so I’m still trying to wrap my head around what’s going on.

                                                                              1. 2

                                                                                You can’t do that with Go, because database/sql does connection pooling no matter what you try to do.

                                                                                You shouldn’t do that anyway, since SQLite in WAL mode is fully capable of handling concurrent writes. You should handle SQLITE_BUSY with a short sleep and a retry. In WAL mode that should only happen doing tx.Commit().
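                                                                                The sleep-and-retry idea might look like this (illustrated in Python’s stdlib sqlite3 rather than Go; the helper name, attempt count, and backoff values are made up):

```python
import sqlite3
import time


def commit_with_retry(conn: sqlite3.Connection,
                      max_attempts: int = 10,
                      base_delay: float = 0.01) -> None:
    """Commit a transaction, retrying briefly on "database is locked".

    In WAL mode, contention for a serialized writer should only show up
    at commit time, so a short sleep-and-retry loop is usually enough.
    """
    for attempt in range(max_attempts):
        try:
            conn.commit()
            return
        except sqlite3.OperationalError as e:
            if "locked" not in str(e) or attempt == max_attempts - 1:
                raise
            time.sleep(base_delay * (attempt + 1))  # linear backoff
```

                                                                                The Go equivalent would catch SQLITE_BUSY from tx.Commit() and retry the same way.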

                                                                                1. 1

                                                                                  Actually, I’ve been reading the code for mattn’s sqlite wrapper, and it doesn’t do any connection pooling.

                                                                                  1. 2

                                                                                    I know. But the interface provided by the database/sql package does. Those SQLite bindings only implement the low level Driver interface from database/sql/driver, which gets wrapped and managed by sql.DB.

                                                                                    From the database/sql docs:

                                                                                    DB is a database handle representing a pool of zero or more underlying connections. It’s safe for concurrent use by multiple goroutines. […] The sql package creates and frees connections automatically; it also maintains a free pool of idle connections.

                                                                                    But it looks like it’s no longer impossible to get an individual connection, as of ~7 months ago. Go 1.9 added the DB.Conn method to get an individual connection.

                                                                                    I can’t imagine how a serial writer would get contention, even with a pool, but I wouldn’t be surprised to see it happen. Connection pools tend to have pretty mysterious behavior. If your http version is hitting locks but your ordinary version is not, I believe your write serialization code must be wrong. Your latest code on GitHub doesn’t appear to attempt any write serialization so I can’t currently review.

                                                                            2. 2

                                                                              hah, for fun (as always) I started a pure Go sqlite reader over the weekend (https://github.com/alicebob/sqlittle). I don’t plan to add write support, though.

                                                                              For now it’s all low-level routines to read the tables and indexes. I’ll try to add some very basic SQL support.

                                                                            1. 8

                                                                              Yes! I know many people don’t like it, but I’m really happy about wildcard certificates which will solve one of our problems with securing our services.

                                                                              1. 2

                                                                                Why wouldn’t people like it?

                                                                                1. 3

                                                                                   Mostly because it’s notoriously hard to get correct. E.g., at what level do you allow the wildcard? Clearly a *.com certificate is a no-no, but what about *.co.il? After all, *.co.com would be valid. And based on that rule, at what level is a wildcard valid for www.foobar.pvt.k12.in.us? What about Google’s new .dev domain? In all these cases, you can have human-made rules, but it gets complicated and error-prone quickly. Mess anything up and you can suddenly generate valid certs for sites you don’t own.

                                                                                  (These issues are similar to but distinct from cookie sharing rules, incidentally, where AFAIK browsers still just ship with massive lists of what’s legal and what isn’t.)

                                                                                  1. 1

                                                                                    (These issues are similar to but distinct from cookie sharing rules, incidentally, where AFAIK browsers still just ship with massive lists of what’s legal and what isn’t.)

                                                                                     Yup: https://publicsuffix.org/

                                                                              1. 12

                                                                                 His post reminded me how good it was to be “alone”*. The most productive and meaningful work I did in my life was when my internet access and other resources were pretty limited.

                                                                                 During university I remember wget-ing entire documentation sets onto 1.44 MB floppies to read/study during the weekend, because I didn’t have internet access. I’ve also implemented two important projects in clean-room design style, with no references other than the provided ones.

                                                                                It’s on my TODO to rent a hut in the woods without internet or cellphone access and take my concentration flow to the next level.

                                                                                 *By “alone” I mostly mean being offline and not reachable.

                                                                                1. 4

                                                                                  My favorite development time is spent on buses, where I get a few hours of “leave me the fuck alone” and “internet connection too poor for anything but IRC and documentation lookup”.

                                                                                  I kinda want to do a long train ride in the US for similar purposes.

                                                                                  1. 5

                                                                                     My favorite development time is biking through town and extended forest walks. I think I fix most bugs offline.

                                                                                    1. 3

                                                                                      That reminds of the Amtrak writer’s retreat (EDIT - I guess it was called a “residency”) that they ran a while back: http://blog.amtrak.com/amtrak-residency/

                                                                                      1. 2

                                                                                         You can get that benefit in rural areas, too, if you don’t bring a smartphone with you. People often discuss the drawbacks of being isolated from jobs, crowds, good Internet, etc. When you need to focus or relax, stuff being far away can make that easy.

                                                                                        1. 1

                                                                                          My experience wasn’t all that great for writing as I found it difficult to type without mistakes. Other than that, it was not a bad trip.

                                                                                      1. 5

                                                                                        Back to looking at Elm alternatives (I got distracted last week. Purescript looks promising).

                                                                                         Slowly start looking for a proper job again. I find https://angel.co useful for finding interesting companies. Any other tips for that?

                                                                                        1. 1

                                                                                           Don’t forget to look at the Lobsters LinkedIn group ;) You might find nice job descriptions there too!

                                                                                          1. 1

                                                                                            Linkedin? Lobsters? I’ll go search.

                                                                                            1. 2

                                                                                              https://www.linkedin.com/groups/8646069/jobs ;) Just ask to join if you’re not in the group!

                                                                                              1. 1

                                                                                                I signed up a few days ago, but the status is still ‘pending’.

                                                                                                1. 2

                                                                                                  Sorry for the delay, you should now have access :)

                                                                                                  1. 1

                                                                                                    got it, thanks!

                                                                                        1. 2

                                                                                           So that’s a 50 million dollar donation from a co-founder of WhatsApp? Neat.