1. 7

    Now that we’ve passed $1K - can we beat the $5K?

    1. 5

      If we’re trying to shoot for $5K, we should at least let Maine know.

      1. 5

        As a European, I honestly prefer us to have it :)

        1. 4

          I’d say we go for it, and offer to let them take over the gold spot for a $10k donation to Unicode instead. :-)

      1. 0

        A Lightning-to-headphone-jack adapter costs $9. I can’t see how that will stop someone from making their own ECG monitor.

        1. 3

          You have to ensure the software supports that.

          If you plug that into an iPhone with a 3.5mm jack you’ll get an error saying you can’t use it. Given the general feel of this post, and the way Apple has deprecated things in the past, it’s not unfathomable that future iterations of iOS might not allow the use of the Lightning-to-headphone adapter.

          1. 1

            If you plug that into an iPhone with a 3.5mm jack you’ll get an error saying you can’t use it

            That’s not true. It works on various iPads and on an iPhone 6s, all of which predate the existence of the adapter. The only requirement is iOS 10 or later, which, I think, runs on all Lightning-port-equipped devices.

        1. 2

          It sounds to me like they are deprecating all server services and probably preparing to merge macOS Server into macOS so they’ll have just one computer OS. Am I missing anything?

          1. 5

            They stopped shipping a separate macOS Server version back with Lion. That’s when the Server app first appeared, installing what had previously been part of the server OS.

            Over time they removed more and more features from that though, so now all that’s left is OpenDirectory (their LDAP/Kerberos AD equivalent) and their MDM solution.

            1. 2

              It now makes sense why this seems like such a dramatic change to me: the last time I worked with macOS Server was back in the Tiger days. Thank you for clearing things up!

          1. 7

            I’m not convinced that the current trend to put authentication info in local storage is entirely driven by the thought of being able to bypass the EU cookie banner thing. I think it’s more related to the fact that a lot of people are jumping on the JWT bandwagon and that you need to send that JWT over an Authorization header rather than the cookie header.

            Also, often, the domain serving the API isn’t the domain the user connects to (nor even a single service in many cases), so you might not even have access to a cookie to send to the API.

            However, I totally agree with the article that storing security-sensitive things in local storage is a very bad idea and that HttpOnly cookies would be a better idea. But current architectural best practices (stateless JWT tokens, microservices across domains) make them impractical.
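            To make the mechanics concrete, here is a minimal sketch of what “sending a JWT over an Authorization header” looks like (hand-rolled HS256 for illustration only; a real system should use a vetted library):

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    # JWTs use unpadded URL-safe base64
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_jwt(payload: dict, secret: bytes) -> str:
    # HS256: HMAC-SHA256 over "header.payload" with a shared secret
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

token = make_jwt({"sub": "user-42"}, b"server-side-secret")

# The token travels in an Authorization header, not the Cookie header:
headers = {"Authorization": f"Bearer {token}"}
```

            Because the header must be set explicitly by client code, this scheme works across domains where a cookie would never be sent.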

            1. 4

              Hey! You are correct that this isn’t the main reason people are doing this – but I’ve spoken to numerous people who are doing this as a workaround because of the legislation, which is why I wrote the article =/

              I think one way of solving the issue you mention (cross-domain style stuff) is to use redirect-based cookie auth. I’ve recently put together a talk which covers this in more detail, but have yet to write up a proper article about it. It’s on my todo list: https://speakerdeck.com/rdegges/jwts-suck-and-are-stupid

              1. 2

                Ha! I absolutely agree with that slide deck of yours. It’s very hard to convince people though.

                One more for your list: having JWTs that are valid for a relatively short amount of time while also providing a way to refresh them (like what you’d do with an OAuth refresh token) is tricky, and practically requires a blacklist on the server, reintroducing state and defeating the single advantage of JWTs (their statelessness, though of course you can have that with cookies too).

                JWTs to me feel like an overarchitectured solution to an already solved problem.
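                The blacklist point can be sketched in a few lines: the moment you need revocation, every request is back to consulting server-side state (all names here are illustrative):

```python
import time

revoked_jtis = set()  # server-side state, the very thing stateless JWTs promised to avoid

def revoke(jti: str) -> None:
    revoked_jtis.add(jti)

def is_valid(claims: dict, now: float) -> bool:
    # Even a token with a valid signature must be checked against the blacklist
    if claims["jti"] in revoked_jtis:
        return False
    return claims["exp"] > now

claims = {"jti": "abc123", "exp": time.time() + 300}
ok_before = is_valid(claims, time.time())  # not revoked, not expired
revoke("abc123")
ok_after = is_valid(claims, time.time())   # revoked, despite a valid expiry
```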

                1. 1

                  There’s a third use case: services behind an authentication gateway (like Kong), where, whenever a user makes an authenticated request, the JWT is injected by the gateway into the request headers and passed on to the corresponding service.

                  But yes, a lot of people are using $TECHNOLOGY just because it’s the latest trend and discarding “older” approaches just because they are no longer new. Which is quite interesting, because today we see a resurgence of functional languages, which are quite old, but I digress.

                2. 2

                  you need to send that JWT over an Authorization header rather than the cookie header.

                  Well, you don’t need to, but many systems require you to. It’s completely possible to put the token in a cookie, although it breaks certain HTTP expectations; using cookies for auth is, after all, quite an old technique.

                  1. 1

                    This is true – you could definitely store it in a cookie – but there’s basically no incentive to do so. E.g., instead just use a cryptographically signed session ID and get the same benefits with less overhead.

                    The other issue with storing JWTs in cookies is that cookies are limited to 4 KB of data, and JWTs often exceed that due to their stateless nature (shoving as much data into the token as possible to remove server-side state).
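                    For scale, a signed session ID of the kind suggested above stays tiny (a sketch; real frameworks bake this in, and the secret would of course not live in source):

```python
import hashlib
import hmac
import secrets

SECRET = b"server-side-secret"  # illustrative only

def signed_session_id() -> str:
    # Random opaque ID plus an HMAC tag; the server looks the ID up
    # in its session store, so no claims need to ride in the cookie.
    sid = secrets.token_urlsafe(16)
    tag = hmac.new(SECRET, sid.encode(), hashlib.sha256).hexdigest()
    return f"{sid}.{tag}"

cookie_value = signed_session_id()
# Well under the ~4 KB per-cookie limit, unlike a claims-stuffed JWT
```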

                  2. 1

                    Could you point me to some sort of explanation of why using localStorage is bad for security? Last time I looked at it, it seemed that there was no clear advantage to cookie based storage: http://blog.portswigger.net/2016/05/web-storage-lesser-evil-for-session.html

                    1. 2

                      Just as the article says: if you mark the session cookie as HttpOnly, then an XSS vulnerability will not allow the token to be exfiltrated by injected script code.
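                      For reference, marking the cookie HttpOnly is a one-flag change in the Set-Cookie header; with Python’s stdlib, for instance:

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session"] = "opaque-session-id"
cookie["session"]["httponly"] = True  # not readable via document.cookie
cookie["session"]["secure"] = True    # only ever sent over HTTPS

# Renders the full Set-Cookie header line
header = cookie.output(header="Set-Cookie:")
```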

                      1. 1

                        Are we reading the same article? What I see is:

                        • “The HttpOnly flag is an almost useless XSS mitigation.”
                        • “[Web storage] conveys a huge security benefit, because it means the session tokens don’t act as an ambient authority”
                        • “This post is intended to argue that Web Storage is often a viable and secure alternative to cookies”

                        Anyway, I was just wondering if you have another source with a different conclusion, but if not, it’s OK.

                        1. 3

                          I disagree with the author of that article linked above. I’m currently typing out a full article to explain in more depth – far too long for comments.

                          The gist of it is: HttpOnly works fine at preventing token theft via XSS. The risk of storing session data in a cookie is far less than storing it in local storage, where the attack surface is greater. There are a number of smaller reasons as well.

                          1. 1

                            Great, I would appreciate a link (or a Lobsters submission) when you’ve written it.

                  1. 2

                    What do we learn from this?

                    READ THE DOCUMENTATION!!!

                    In a case like this where the framework is practically priming a landmine for you to step on, I would say you’d rather fix the framework than read the docs. If your ORM has, for all intents and purposes, completely broken transactions, you’re not allowed to hide behind the docs.

                    1. 1

                      If you order by an expression like this, the database will have to sequentially scan the whole table in order for OFFSET to work. Maybe add a functional index covering the expression – that will remove the need for the sequential scan.
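                      The same idea can be demonstrated with SQLite’s expression indexes (the comment presumably concerns PostgreSQL, where a functional index is created with the same `CREATE INDEX ... ON items (lower(name))` syntax):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE items (name TEXT)")
# Index the expression itself, not just the column
con.execute("CREATE INDEX idx_items_lower_name ON items (lower(name))")

# With the index in place, ORDER BY lower(name) can walk the index
# instead of sorting the whole table before applying OFFSET.
plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT name FROM items ORDER BY lower(name) LIMIT 10 OFFSET 100"
).fetchall()
```

                      The query plan should show a scan using the index rather than a full-table sort.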

                      1. 31

                        The “identify language by location” issue is honestly worse than anything account-related – I don’t use a Google account, but if someone links me to a gdoc or Google Groups or something, I get Thai “in-page chrome”: all buttons, navigation, etc. are in a language I can’t read and barely speak.

                        Why? Because Google has apparently decided that the Accept-Language header is a filthy communist plot and can’t be trusted.

                        This is more of an issue than the one described in the article, because there is no obvious way on any Google page to change the language. The author at least understands the language shown to them; it’s just not the one they wanted.

                        1. 10

                          Why? Because Google has apparently decided that the Accept-Language header is a filthy communist plot and can’t be trusted.

                          You really couldn’t trust it in the mid-’90s when the internet started taking off (it was normally set to the language spoken by the people who made the browser).

                          Unlike other things from the ’90s (like web-safe colors), this one seems to stick around, possibly because relatively few people are affected by it and because it doesn’t impact how the site looks on the CEO’s machine (unlike web-safe colors).

                          By now Accept-Language is very reliable and should be treated as a strong signal. Much stronger than geolocation or other pieces of magic, which also regularly fail spectacularly in multilingual countries (damn French YouTube ads here in Switzerland).
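                          For reference, the header carries an explicit, ordered preference list, and parsing it is trivial (a sketch that ignores wildcards and malformed input):

```python
def parse_accept_language(header: str) -> list[str]:
    """Return language tags ordered by their q-values, highest first."""
    langs = []
    for part in header.split(","):
        piece = part.strip()
        if ";q=" in piece:
            tag, q = piece.split(";q=", 1)
            langs.append((float(q), tag.strip()))
        else:
            langs.append((1.0, piece))  # no q-value means q=1.0
    langs.sort(key=lambda t: -t[0])
    return [tag for _, tag in langs]

print(parse_accept_language("th;q=0.5, en-GB, en;q=0.8"))
# → ['en-GB', 'en', 'th']
```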

                          1. 3

                            I mean – OK, but that’s 25 years ago. You couldn’t stream video in the mid-’90s, you couldn’t browse on a mobile device; shit, you couldn’t do most of what people do now on the web in the ’90s.

                            Even then it sounds like a bad idea – false positives, where the user’s language isn’t that of Accept-Language, mean that they’re already managing to use a browser in a language other than their own.

                            Ultimately there are numerous better options that allow for potential mismatch of the browser language - but google uses none of them. They just base it on IP country code and fuck anyone who’s disadvantaged by it.

                            1. 0

                              You could definitely stream video for most of the 90s… I certainly did.

                            2. 2

                              Funny that you should mention French ads on YouTube in Switzerland: just yesterday I was complaining on IRC that I only get Swiss German ads on YouTube and Spotify, although I’ve been living in the French-speaking part of Switzerland forever. It’s even more surprising because I don’t speak German or Swiss German, and Google most certainly knows this about me; the locale sent by my browser is fr-CH, etc.

                            3. 10

                              And what a brain-dead decision not to make switching language the easiest possible UI interaction. At least some sites make it easy, e.g. by listing the desired language in that language – because guess what, I don’t know what the word “English” is in Arabic.

                              Not to mention the fact that there are MULTIPLE valid languages for many locations.
                              Worse still, so many other sites have blindly copied the idea. God help you if you travel or spend significant time somewhere you don’t understand the local language and don’t want to be forced to log in (or even have an account).

                              I wrote a Greasemonkey script to add an hl=en parameter across all Google properties, and I dread the day they decide not to respect that.
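                              The trick in such a script is just forcing the hl query parameter onto every URL; the same rewrite, sketched here in Python rather than the userscript’s JavaScript:

```python
from urllib.parse import parse_qsl, urlencode, urlsplit, urlunsplit

def force_hl(url: str, lang: str = "en") -> str:
    # Rewrite the URL so hl=<lang> overrides the geolocation guess.
    parts = urlsplit(url)
    query = dict(parse_qsl(parts.query))
    query["hl"] = lang
    return urlunsplit(parts._replace(query=urlencode(query)))

print(force_hl("https://docs.google.com/document/d/abc?usp=sharing"))
```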

                            1. 27

                              I appreciate the detailed response and am very thankful for the code changes regarding how thread deletion works. I do believe that having the ability to remove pinpointed comments rather than whole threads is a better mechanism.

                              I am concerned how and where moderation is going. I asked yesterday on IRC:

                              21:16 < mulander> pushcx: is this your site or do we curate content as a community?
                              21:16 < mulander> if it's your site and your content than I have nothing more to say than to step down
                              21:16 < mulander> and just shut up.
                              21:17 < mulander> if we are moderating as a community then sometimes the moderation has to accept that a large portion of the userbase wants some content present
                              

                              to which you replied:

                              21:17 < pushcx> mulander: It's both.
                              

                              I don’t think ‘both’ works. With the last mod action we as a community exercised moderation transparency, and I think that worked out fine, as it has many times before. I do believe a site needs moderation, but it should always be the last resort, and I hope this site will not end up with yet another pamphlet that I will have to cross-reference before posting an article or a comment.

                              1. 16

                                I do believe a site needs moderation but it should always be the last resort and I hope this site will not end up with yet another pamphlet that I will have to cross reference before posting an article or a comment.

                                “Moderation as a last resort” is - in my opinion - something that comes from a very narrow view of what moderation is. Moderation that only applies at the very last moment is bad moderation. It’s an ongoing process, which only rarely shows in technical moderator action like deleting complete or parts of posts.

                                It can definitely be both. In the end, pushcx is the person willing to keep the lights on and to deal with all the ramifications of running this site. I’m fine with that person being the final authority. That can definitely be both – there are interactions between the person running the site and those willing to support it, and in that interaction lies a lot of potential.

                                1. 7

                                  “Moderation as a last resort” doesn’t preclude moderator intervention, it’s about having a gradient for how problem behavior is dealt with. I’ve been able to keep the number of people I had to ban in #haskell-beginners on Freenode IRC very low (only a couple occurrences) despite building a constructive, focused, and friendly community because I was present and responded to problems in ways other than banning people. It’s now one of the more peaceful and helpful IRC channels I’ve ever had the pleasure of participating in, and it’s not so because I resorted to the banhammer every time someone said something I felt was out of step with my goals for the community.

                                  1. 3

                                    Sure. The whole point of my post is that moderation is so much more than wielding the banhammer. “Moderation as a last resort” excludes that viewpoint, though, by focusing only on the technical details.

                                2. 17

                                  Thank you for sharing your concerns, and for bringing up this conversation, because I think there’s more worth talking about.

                                  I said “It’s both” because neither perspective can give a full understanding of Lobsters alone (or of any social site). I especially think it’s not only the first, because no one can exercise dictatorial power over a voluntary community; everyone is free to leave at any time. And people have left in Lobsters’ history over the near-absence of moderation, mostly quietly. But there’s some maintenance and moderation that need a single person. I’ve been debugging log rotation and backups for the last week; I can’t post root’s ssh key for the community to take over the task. And less frequently, that means deliberately setting norms and expectations for the community, like that we need to prefer substantive counterpoints to dismissive repartee.

                                  I think aside from the confusing moderation log message I wrote, the biggest problem with that action was that it was unpredictable. Predictability is vital for effective moderation, and it’s clear from the reaction that I wasn’t. I’ve been working on updating the about page that has not seen much updating since Lobsters was much, much smaller and being vague was a virtue that led to the community experimenting and defining itself. I haven’t rushed to finish because I’m trying to take my time and think hard about what Lobsters has grown into the last five and a half years, and the challenges it faces as it continues to grow, rather than slap something up. Which… is what I did with a thoughtless moderation log entry.

                                  I agree and disagree with the idea that moderation should be a last resort. In a large community the little nudges I wrote about in this post are useful for avoiding Eternal September. But the large actions like banning users absolutely need to be a last resort after every other option has failed.

                                  And I share your fear of a site where opinions are so constrained that there might as well be a list of orthodox positions. We’ve had some wonderful conversations where people have disagreed about technical and even political topics, which is rare and valuable on the web. I hope to preserve and expand our ability to have those conversations. A pamphlet of required opinions would be an abject failure.

                                  1. 17

                                    And less frequently, that means deliberately setting norms and expectations for the community, like that we need to prefer substantive counterpoints to dismissive repartee.

                                    With the utmost respect for the work you put in and your contributions to the site, please do not use your moderator powers to try and shape the norms of the community.

                                    1. 11

                                      Could you point to communities where you think this has worked out particularly well? In the last few weeks I’ve been analyzing all the communities I’ve participated in or read substantially, and they’ve either been so small that everyone knows everyone or have moderators doing exactly this. (And it’s certainly been the case on the ones I’ve moderated.) But this could totally be personal style of what I think is a healthy community causing selection bias, and I’d love to break out of it if that’s the case.

                                      1. 12

                                        I could, but you might not agree with what I believe to be successful communities. That is why I ask that you don’t use your moderator powers to try and shape the community.

                                        You are already very influential within the community, so I feel that a lead-by-example approach is more appropriate and effective than trying to tailor the content of the site to what you feel the community wants.

                                        1. 15

                                          So, let’s just state an obvious example of a “successful” community with low moderation: 4chan.

                                          A large part of Internet shibboleths come directly from there. A lot of really influential art and music and even code originated there.

                                          But, is it really worth it to wade through all of the garbage every day? Is it worth skipping every third post from some /pol/ idiot blathering on about white-supremacy? Do you really want to see yet another low-quality bait post about intel and AMD?

                                          Similarly, HN used to be fairly lightly moderated (and the current system has its flaws, Lord knows!). But a lot of just plain spam and garbage became cultural norms there: product posts, trivial tech news, politics, whatever.

                                          When reflecting on this problem, my advice would be: don’t only consider what moderation would censor, consider what no moderation would allow.

                                        2. 5

                                          https://lobste.rs/s/kmpizq/deleting_comments_instead_threads#c_uh4my1

                                          I successfully shaped the norms of #haskell-beginners without wielding a banhammer. Only a couple problem people have had to get banned in the channel since it started in May 2014.

                                          1. 4

                                            One thing to consider is the different audience sizes between a Haskell beginners IRC channel and a general technology link sharing website. What works for one might very well not work for the other.

                                            We’ve witnessed time and time again how the quality of gathering places went to shit as the audience size increased and TBH, I would love to find a place where this doesn’t eventually happen. If heavy moderation is a way to get there (these various other places always felt like places where only little to no moderation was happening), so be it.

                                        3. 1

                                          With the utmost respect for the work you put in and your contributions to the site, please do not use your moderator powers to try and shape the norms of the community.

                                          I don’t see where that was implied.

                                          1. 2

                                            I stated it pretty explicitly:

                                            And less frequently, that means deliberately setting norms and expectations for the community, like that we need to prefer substantive counterpoints to dismissive repartee.

                                            1. 3

                                              What tripped me up was the “using moderator powers” part. There are so many more ways to shape norms.

                                              1. 4

                                                I could’ve been clearer there. I was specifically referring to shaping the norms of the community by means that are not available to every other user - such as removing content.

                                      2. 4

                                        I hope this site will not end up with yet another pamphlet that I will have to cross reference before posting an article or a comment.

                                        I seriously doubt that will happen, and agree that I don’t want it to happen.

                                        And I think ‘both’ does work.

                                      1. 20

                                        Just like IBM and their clients have benefitted from Lenovo.

                                        Have the customers benefitted? The Thinkpads got worse and worse as time progressed after the sale. The old IBM-built machines were absolutely top class in build quality and absence of bloatware (minus some IBM-built updaters, which were bad but could be removed).

                                        As time has progressed, Thinkpads have become notebooks like all other PC notebooks. Nothing special: some with better build quality, some with worse, but definitely no longer top class.

                                        Yes. The machines got cheaper, but sometimes, cheaper isn’t the be-all, end-all. For some types of usage, build-quality matters much more than price.

                                        I really hope that Apple can keep supporting the Mac for a very long time, as I still love their hardware very much, exclusively future-looking ports be damned.

                                        1. 5

                                          My first reaction was very similar. While Lenovo kept the quality of Thinkpads high for a few years following the sale, they’ve now become (as you said) very much average. There may be cases in which this type of thing worked, but Lenovo + IBM is not one of those situations.

                                          1. 4

                                            Yeah, I used to say that thinkpads were the only laptops worth buying, and now I feel none are worth buying. There probably won’t be a laptop I’m excited about for a very long time.

                                            Hopefully someone raises tons of money and is able to make laptops targeting developers, someone who isn’t shy about charging a ton of money for a top-class product for people who use their machines for work all day.

                                            1. 2

                                              Not just the quality of the products, but also the ethical standards and practices of the firm that makes them.

                                              IBM was (and is) a leader in the practice of ethically sourcing minerals, Lenovo has continually purchased conflict minerals from African kleptocracies and genocidal terrorists, particularly in the Kivu province and the Congo at large.

                                              If you are buying a new machine this holiday season, please check out http://zv.github.io/buyers-guide.html

                                            1. 5

                                              Those concerned about their privacy might be alarmed by the arrival of such badges

                                              might?

                                              This article is easily the biggest WTF all month.

                                              The microphones are there. There’s zero accountability for what’s actually recorded and what isn’t. The company’s/boss’s/whoever’s assurance that no actual conversations are recorded means nothing. The capability is there. It will be used.

                                              1. 6

                                                Yes. It will be used unless either the law or aggressive social disapproval prevent it. We can fantasize, but I wouldn’t count on the latter.

                                                This is also going to literally kill people. In the US, it’ll probably be regulated out of existence. In the developing world, we’re going to see this used for unionbusting and, in a lot of these countries, that’s done with bullets rather than blacklists and bad references.

                                                If your product relies on corporations being ethical, the only ethical thing to do is to not launch it.

                                                1. 2

                                                  You mean in Europe maybe. In the U.S., there’s all sorts of monitoring by employers with either no or few regulations going way back. I’m not hopeful on this one.

                                              1. 6

                                                Before we bring out the pitchforks, we have to answer a few questions:

                                                 1. is that password dialog really a dialog painted by Dropbox? Or is it the normal sudo dialog that’s used to authorise Dropbox to install a privileged helper (which is the official method)?

                                                 2. if it’s a Dropbox dialog, the question remains whether they actually store that password or not. If they just use it once to install a privileged helper, it’s OK-ish, even though they could just use the official API.

                                                 3. Why is that SQLite file that defines the accessibility settings not in a location protected by System Integrity Protection? While it’s certainly not nice to hack yourself into that file, there is a technical means to prevent this which Apple has not used

                                                As I don’t have much experience with macOS programming, I can’t say anything about 1), but I did a few experiments with regard to 2): if you remove the privileged helpers installed by Dropbox, it will ask for a password again instead of just installing them. This suggests to me that they’re not storing your password somewhere.

                                                With regard to the helpers that are installed SUID: one is called dbfseventsd. I haven’t reverse-engineered it beyond looking at its name, but I would guess it is a component for learning about changes to watched folders, and as such it pretty much feels like a component needed for Dropbox “to work correctly”, so I would not say that the claim in the sudo dialog is all made up.

                                                However, the issue remains that Dropbox asks for a lot of privileges, even for stuff a user doesn’t need. With regard to the accessibility access, I would say that’s needed for the hacked integration into MS Office to work, and maybe for the automated screenshot upload.

                                                It would be preferable if this was optional, but I really think that their hack to get themselves registered provides a superior UI compared to what, say, Steam does, which makes the user manually open System Preferences and hand out permissions there.

                                                So from an end-user perspective I can see why they are taking this path and once Apple closes that loophole (see 3. above), then the other vocal group will come out of the woodwork and yell about Apple taking away their freedoms to tinker with their systems.

                                                Damned if you do, damned if you don’t.

                                                1. 3

                                                  If someone is a user of Dropbox (I am not), then I think pitchforks are semi-warranted.

                                                  Why is that SQLite file that defines the accessibility settings not in a location protected by system integrity protection? While it’s certainly not nice to hack yourself into that file, there is a technical means to prevent this which Apple has not used.

                                                  Not sure what you are trying to argue here. Should that file be protected, even from sudo/root (which Dropbox asks for)? It seems the answer is “clearly”. Does the fact that it is not make it OK to directly modify it? I posit that it does not.

                                                  Dropping SUID binaries seems a bit fishy too, but without knowing what that file does, I will reserve my judgement on it.

                                                  1. 2

                                                    I don’t know any details personally, but https://news.ycombinator.com/item?id=12464730

                                                    1. 1

                                                      So it’s an OS X dialog? That changes my position slightly.

                                                    2. 1

                                                      Impersonating an OS dialog is shady business. Full stop. Either let the OS present the dialog, or present something that is clearly your own dialog.

                                                    1. 6

                                                      Right then. All of us Mac users need to find a new BitTorrent client :)

                                                      (I don’t personally care THAT much - I use BT once in a blue moon to download Linux distros :)

                                                      1. 2

                                                        Or, perhaps better, get involved and help out with the release process so that it becomes more secure.

                                                        I read on HN that in-app upgrades were not affected, only their website. If that’s true, it sounds like better security around their web/release process is needed.

                                                        1. 2

                                                          Why does this keep happening? Are they storing the key on the web server, then getting hacked?

                                                          1. 4

                                                            Last time this happened, there was never a proper post-mortem and the site was never taken offline. I don’t think they know how the previous compromise happened, nor am I confident they rebuilt the compromised system.

                                                            If I had to guess, I would say that this second compromise is just the first attackers using their previously established foothold.

                                                        2. 1

                                                          Unfortunately Transmission is just dead stupid simple, and I don’t want anything else.

                                                          Who wants to build a new bittorrent client?

                                                          1. 1

                                                            You could create a fork that is just a vetted mirror of the repo: every commit would be reviewed by you or a team before being merged. You would have to build and supply your own releases from that source code, though. There may already be a project that does exactly this for Transmission on the Mac? For Linux distros it is a bit simpler, since you can probably rely on the distro doing the merging, reviewing, and building for you.

                                                          2. 1

                                                            Not a solution for everyone, but: compile the open source client Deluge from source and run it. I use Deluge on my RPi, which is my always-on NAS and which also happens to support torrenting thanks to Deluge :)

                                                          1. 9

                                                            As always their writers are killing it. This is how you do release notes.

                                                            The dolphin project is a shining example to all of the Free Software community in how to handle communication with your users and how to handle release building and management.

                                                            We need projects like this to bring newcomers in and make them feel welcome. A huge thank you to everyone involved, and congratulations on an amazing release.

                                                            1. 4

                                                              Keep in mind, though, that soon you will need a reasonably new OpenSSL if you want Chrome to actually use HTTP/2: Chrome is deprecating NPN in favor of ALPN, which requires OpenSSL 1.0.2. If you are using Debian 8, for example, just bumping nginx won’t be enough; you would have to link it against a newer OpenSSL (or LibreSSL, which is what we have done for our custom packages).
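For reference, enabling HTTP/2 in nginx is just a flag on the `listen` directive; the ALPN requirement falls on the OpenSSL that nginx was *built against*, not on anything in the config. A minimal sketch (certificate paths and server name are placeholders):

```nginx
server {
    # HTTP/2 negotiation happens via ALPN, so this only takes effect
    # if nginx was linked against OpenSSL 1.0.2+ (or LibreSSL)
    listen 443 ssl http2;
    server_name example.com;                    # placeholder

    ssl_certificate     /etc/ssl/example.crt;   # placeholder paths
    ssl_certificate_key /etc/ssl/example.key;
}
```

You can check what nginx was built against with `nginx -V`, which prints the OpenSSL version string among the configure arguments.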

                                                              1. 3

                                                                LibreSSL also ships newer crypto like ChaCha20-Poly1305, which Chrome in particular seems to like so much.

                                                              1. 6

                                                                What an incredible piece of art. So much dedication and such a nice outcome. Brings back memories to when the original innards of this box were actually state of the art.

                                                                It’s not just the floppy, but also the machine in general. Though, I have to say, the floppy does take the cake.

                                                                1. 13

                                                                  Dolphin is one of the most impressive free software projects out there. They get so much right. From programming (just look at this video) to release engineering (their build infrastructure is top notch), to marketing and communication (their monthly progress reports are very well written for both technical and non technical audiences).

                                                                  And to top it all off, it’s one of the more inclusive projects where women aren’t just welcome, but actually celebrated for their skill (have a look at https://dolphin-emu.org/blog/2014/08/31/dolphin-progress-report-august-2014/ for example).

                                                                  Every time I see news about Dolphin I’m nothing but impressed with the project.

                                                                  1. 1

                                                                    Well. I would argue that letting her work on a machine is safer than installing, say, TrendMicro

                                                                    (SCNR)

                                                                    1. 7

                                                                      Why are people so against fast-forward commits? I don’t really get this, and seeing all of these merges with ancestors on the same branch seems confusing to me.

                                                                      1. 3

                                                                        It makes it hard to back out a logical changeset if the history is just one long set of changes. You can, of course, look carefully to find where to delimit a changeset, but why not let the history indicate it for you?

                                                                        1. 2

                                                                          This is why I personally decide on a case-by-case basis. If it’s a big new feature (still consisting of nicely cleaned up, self-contained commits, of course), I will merge it with no-ff, but if it’s general bug fixing or smaller changes, then I prefer fast-forward whenever possible.

                                                                          1. 2

                                                                            Yeah, this tends to be my way of it. If it’s a thing that I could conceivably have developed on master anyway, then it’s probably not worth cluttering the history with the merge commit.

                                                                          2. 1

                                                                            Okay, so the oedipus merges are delimiters in history, right? Why not just use empty commits instead as delimiters? What are you gaining by non-linearising your history?

                                                                            1. 1

                                                                              Because… you still have to backtrack manually? Your graph shows the lines wrong; the branch’s history should be on the right, whereas the mainline can be stepped through easily by going up a single merge commit at a time (master^, master^^, master^^^). An empty commit stores no metadata about what was merged; a merge commit does. Seems straightforward?

                                                                              (And it’s still effectively linear, there’s just extra metadata. If you want a purely linear view without jumps just step through mergecommit^2 every time you hit one.)
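The stepping described above can be sketched with a throwaway repository (branch and file names below are made up for illustration):

```shell
set -e
cd "$(mktemp -d)"
git init -q repo && cd repo
git config user.email you@example.com
git config user.name "You"
main=$(git symbolic-ref --short HEAD)   # default branch name varies by git version

echo base > notes.txt; git add notes.txt; git commit -qm "base"
git checkout -qb feature
echo one >> notes.txt; git commit -qam "feature: step 1"
echo two >> notes.txt; git commit -qam "feature: step 2"
git checkout -q "$main"
git merge -q --no-ff -m "merge feature" feature

# Walking the mainline one merge commit at a time skips the branch's
# internal commits entirely:
git log --first-parent --oneline

# The merge commit records the merged branch tip as its second parent:
test "$(git rev-parse HEAD^2)" = "$(git rev-parse feature)" && echo "second parent is the branch tip"
```

So `--first-parent` gives you the "effectively linear" view, and `^2` is the jump into the branch when you want it.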

                                                                          3. 3

                                                                            It destroys working product. Every merge has the potential to introduce errors. You start with a working X. Now you forward it and do a merge, introducing a bug. Later the bug is found and you want to go back to a working version. But you can’t. The pre merge version of X no longer exists. Now you’re stuck trying to find a merge bug hiding in the commit adding the feature. The forward ported version of X was not how it was originally developed, nor tested. It never existed in the wild.

                                                                            1. 3

                                                                              One reason I love changeset evolution in Hg: while the main history can be linear, the full history of a given patch is also preserved (at least locally, and optionally server-side) for this use case.

                                                                              1. 1

                                                                                Oh, awesome. I’ve been thinking about this recently. What I wanted is “shadow” branches that are usually invisible, but discoverable when necessary. These would contain the original development work, but when a merge is performed the diffs of the branch would be reapplied on top of the other head, creating the all important pristine linear history. This sounds like exactly that.

                                                                                1. 3

                                                                                  If you want to play with this, you’ll probably want to grab the evolve extension (which hasn’t yet been merged into Mercurial core), and will also likely want to enable histedit (which is built-in). That’s it; you don’t need anything else. (In fact, evolve is the only third-party extension I have; the rest are the core ones that amount to conveniences. You can see my hgrc if it’s helpful.) Note that evolve is even used by the core Mercurial team, so if you enable it and grab the main repo, you can actually see/play with obsmarkers to get a feel for how this all works.
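For anyone following along, the setup described amounts to a couple of lines of hgrc (the clone path for evolve is an assumption; point it at wherever you checked the extension out):

```ini
# ~/.hgrc -- minimal sketch
[extensions]
# third-party: clone the evolve repo and adjust this path to your checkout
evolve = ~/src/evolve/hgext3rd/evolve
# built-in: an empty value enables it
histedit =
```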

                                                                              2. 1

                                                                                The pre merge version of X no longer exists.

                                                                                What? We’re talking about fast-forward merges vs oedipus merges. Neither of them changes anything in the working directory. The only difference is whether you want an extra commit that does a no-op merge, or not. Or if you want the oedipus merges as delimiters, why not use empty commits as delimiters instead, and keep your DAG linear?

                                                                                The forward ported version of X

                                                                                I surmise that you’re using the “official” git terminology for “rebase”. I’m not talking about rebasing here. I’m just talking about, once you have rebased (and tested and made sure your rebased version works and everything), what are you gaining by having an oedipus merge?

                                                                                1. 4

                                                                                  This obviously requires a car analogy.

                                                                                  On your branch, you add a red spoiler to your red Camaro. On the main branch, somebody repaints the car black with flames. You merge your work which applies cleanly because you changed different things, but which also results in a hideous mess, so you probably add another commit to repaint the spoiler with flames. So far, so good.

                                                                                  Your boss calls you into the office. “It’s a bit much, don’t you think? A spoiler and flames?” You explain how totally sweet it looked with just the red spoiler added. “Show me.” uh oh…

                                                                                  There is no command you can run to recreate a red car with a red spoiler. It does not exist at any point in the repository because all of your pre-merge work is now post-merge, after the repaint. The only way to get it back is to manually take your commits, cherry pick them and now backport them to some previous branch point, hoping nothing goes wrong or gets lost in the shuffle.

                                                                                  1. 1

                                                                                    This obviously requires a car analogy.

                                                                                    No, it requires a diagram.

                                                                                    http://jordi.platinum.linux.pl/piccies/oedipus-merge.png

                                                                                    These are the three possible situations I am talking about, after rebasing and testing. There was a base state in red, two green commits for some new feature. In all three cases, the contents of the working directory at the master branch (i.e. the commit being pointed to by “master”) are identical.

                                                                                    Now, a number of people seem to think that the no-op gold commits in the empty commit and oedipus merge situation are useful as delimiters for features. Okay, I can see that there is some utility in having delimiters for your features. What I don’t get is what utility are you getting by making those delimiters merges (and thus non-linearising your history) instead of just making them empty commits.

                                                                                    1. 1

                                                                                      a) To “just” use empty commits, wouldn’t you have to know ahead of time whether a given branch was going to be a fast-forward merge or not? b) If you want to delimit your features then all those delimiters should look the same. Why would I want to use one command to look at features which happened to have nothing committed to master while they were being developed, and another for features which happened to have another feature land while they were developed? (Also the concept of a “merge commit” is built into the tooling, making it much easier to use as a delimiter)

                                                                              3. 1

                                                                                I always use fast-forward commits but I’ve heard that merge commits make it easier to roll back when you have automated deploys.

                                                                                1. 1

                                                                                  --no-ff means it’s always possible to revert a merge. That’s why I use it.

                                                                                1. 5

                                                                                  I must(?) be misunderstanding something, but:

                                                                                  confirms next Android version won’t implement Oracle’s proprietary Java APIs

                                                                                  and

                                                                                  is replacing its implementation of the Java application programming interfaces (APIs) in Android with OpenJDK, the open source version of Oracle’s Java Development Kit (JDK)

                                                                                  feel like two contradictory statements.

                                                                                  1. 11

                                                                                    Crappy reporting indeed. Google is moving from its own open-source implementation of the Java APIs to Oracle’s open-source one (OpenJDK). When I read badly reported stories like this I always wonder whether stories about things I have less insight into are just as inaccurate.

                                                                                    1. 1

                                                                                      That’s definitely something to wonder, yeah :|

                                                                                      1. 1

                                                                                        Michael Crichton came up with a name for this: the Gell-Mann Amnesia effect.

                                                                                        http://www.goodreads.com/quotes/65213-briefly-stated-the-gell-mann-amnesia-effect-is-as-follows-you

                                                                                        1. 3

                                                                                          It really depends on the journalist. Some reporting doesn’t suck, although a lot of it does. Make friends with smart people with domain experience and they can usually recommend good articles. Would be great if there were a better way, but people pushing biased narratives seems to be the norm.

                                                                                    1. [Comment removed by author]

                                                                                      1. 11

                                                                                        The assumption that people do not come to harm because of tech is precisely the crux of the matter. It is a discipline where sooner or later, your work has the potential to cause harm, and so if you’re not learning about the ethics of what you’re doing from very early on, you can easily do harm.

                                                                                        1. 9

                                                                                          bridges are stable because no one cares about driving over the latest and greatest bridge.

                                                                                          And nobody ever wants stuff added to bridges once they are already built. Nor does gravity suddenly change because it was deprecated and replaced by gravity 2.0, which is much better. As a matter of fact, many buildings fail to stay upright when their operating conditions suddenly change (earthquakes or fires, say).

                                                                                          The problem behind the instability isn’t (just) budget or bad engineers. It’s the constant demand for change either to the solution itself or to the foundation required by the solution.

                                                                                          1. 5

                                                                                            I think that the title can and should work for different types of software developers, and that it should require a PE-style certification. Software Engineers should be the people designing and implementing software that runs cars, elevators, healthcare systems or really any shit that can kill people. These people can’t move fast and break things. They can’t ship bugs. That kind of development should absolutely require the rigor that a PE is supposed to guarantee.

                                                                                            Other software developers, like the ones that make up many companies in the Valley, don’t need to meet such requirements. People’s lives (thankfully) don’t depend on any code that I, or most others in SV, write. That’s fine. There shouldn’t be any shame in not being a software engineer, and SV companies shouldn’t require a PE to do work for them.

                                                                                            Both titles can exist, but they should absolutely imply different qualifications. Ideally, the term engineer should become less ambiguous than we’ve made it.

                                                                                            1. 6

                                                                                              I looked around, but can’t find the position paper the ACM published about fifteen years ago, explicitly refusing to participate in creating a certification process for software engineers. It had some very strong words about how no such certification would be able to guarantee or even measure the things that certifications do in other engineering fields. Given that it would be a liability-shunting fiasco that made everything worse by creating a false sense of safety, they felt they shouldn’t be involved.

                                                                                              I notice that they have a professional code of ethics, today, but that’s it.

                                                                                              1. 1

                                                                                                I’d love to read that.

                                                                                                  1. 8

                                                                                                    As background, this article is largely the ACM side of a 1999 disagreement between IEEE and ACM over Texas’s move to create a category of licensed software engineers.

                                                                                                    In 1997, the IEEE and ACM had formed a joint working group to better define the body of knowledge making up the field “software engineering”, with the main agreed goals of improving standards in the field, providing guidance to educators, and better establishing software engineering as its own discipline, rather than just a fuzzily defined variant of “computer science”. SWEBOK (SoftWare Engineering Body Of Knowledge) is the acronym for that effort, and both groups initially thought it was a good idea. Where they diverged was over the political question in 1999, when that Texas move unexpectedly arose: IEEE supported engaging with Texas’s process and thought the SWEBOK effort could be used to positively influence it, avoiding a negative outcome of Texas making up its own standards, while ACM was strongly opposed to professional licensing, and pulled out of the SWEBOK effort entirely out of fear that it might be used that way, both in Texas and elsewhere.

                                                                                                    IEEE went on to approve and publish the document in 2004. The ACM and IEEE also somewhat reconciled over part of the original agenda, and formed a different group in 2001 to develop a set of guidelines specifically for software-engineering curricula, the “Curriculum Guidelines for Undergraduate Degree Programs in Software Engineering”, which were also released in 2004. That side of the agenda has been fairly successful: there are now a lot of software-engineering degree programs, distinct from computer-science programs.

                                                                                                    1. 2

                                                                                                      Saw that last night…it’s rather telling that you can’t simply just download the damned thing.

                                                                                                      1. 1

                                                                                                        Thanks. I was young enough at the time that that background went over my head; it’s nice to close the loop on it.

                                                                                              2. 3

                                                                                                Side note: Fortran and Ada are fine languages for their domains. In fact, had Twitter used Ada instead of Ruby, we wouldn’t have seen the fail whale. It is a fallacy that new is better or that old is worse.