1. 5

      I like (and agree with) the sentiment, but the argument as presented is not convincing. I suspect that’s because it’s trying too hard to push the SCM product as opposed to talking about writing commits/checkins/whatever with the reviewers in mind.

      The case presented is not compelling because it’s just as plausible and possible to do in Git (and probably Mercurial too). Maybe PlasticSCM makes it easier? I’m not sure. Regardless, the point about squashing commits is weak, since you could just as easily squash the entire series of commits down to the smaller series presented. Furthermore, there’s no reason the commit message on the single commit that touches over 100 files can’t be as descriptive as a small series of commits to help guide the reviewers.

      1. 3

        This is how the core Mercurial team works, btw. The unit of review is the commit, not the PR (which the core hg team doesn’t even really do).

        It produces commits that are each individually understandable, which is great because your log is actually readable and contains useful information:

        https://www.mercurial-scm.org/repo/hg/

        Look at how small commits tend to be, and look at how commit messages tend to explain just what this one change is doing. This also means that your commit history is now source-level documentation thanks to hg annotate/blame. The commit message is the one place your tools force you to write something about your code, so you should take the opportunity to actually write something meaningful.

        A history that nobody takes time to write is one that nobody takes time to read either, and at that point, what you really wanted was an ftp server to host your code with the occasional rollback mechanism to undo bad uploads.

        1. 1

          Except for the advertising section, it’s pretty similar to what I ask of my team: that they commit per component or logical unit (although they clearly aren’t listening; maybe I need to be stricter).

          They could also propose using rebasing to transform the checkpoint form into the reviewer form; I understand it could be used for that.

        1. 9

          pushcx desktop

          Basically the same as 2015: I no longer have the bookmarks bar turned on in Firefox; I added caffeine-indicator to the systray for disabling the screensaver; I toned down my vim colorscheme to one based on jcs’s but with the LightItalic from Operator Mono for constants and comments. Still thinking of moving to xmonad, especially since July when awesomewm made some breaking changes to config files.

          1. 3

            DWM is the bestest… Awesome is a fork of it, but breaks things :D

            1. 2

              I recommend Xmonad; I’ve been using it for many years and it never lets me down. Make sure to add xmobar as well for a nice info bar.

              I’ll post my setup later today.

              1. 2

                What’s the extension on the left of firefox?

                1. 1

                  Tree-Style Tab. I find the grouping of tabs by parent and being able to open and close folders very useful.

                  1. 1

                    How do you shrink down the main horizontal firefox tab bar?

                    1. 1

                      Either Tree-Style Tab removes it or I dragged it around in the usual Firefox appearance customizer.

                      1. 1

                        Hmm. Tree-Style Tabs doesn’t seem to change the main tab bar for me, and I can’t figure out a way to change it in the customizer.

                2. 2

                  We had a nice chat about fonts in #lobsters in response to this thread and there were a bunch of mono italics I hadn’t heard of before: @zpojqwfejwfhiunz shared Dank Mono, @flyingfisch shared Plex, and @zdsmith shared PragmataPro. None cursive like Operator, but very nice fonts (except for the f in Dank mono).

                  1. 1

                    I can’t find any way to view the full size image of any of these imgur posts… and the page is full of random memes.

                    1. 1

                      If you load them in the browser, open the link twice (the first request redirects to the image page, but once you’ve loaded the image page, further requests for the image itself return the image directly, regardless of the Referer: header). I use wget and Geeqie, so for me they load fine the first time.

                  1. 1

                    It looks really interesting. I’ll make a note of it somewhere to try next time I play on HackerRank. Do you have plans to try to make it work on bigger projects? It would be nice to use it with pygame projects, for example.

                    1. 30

                      First time poster here 👋

                      What most closely resembles a hobby is that I’m sourcing old Romanian books online, collecting them, and will hopefully be digitizing them in the near future. I’m now focused on books that have fallen into the public domain, and XIXth to XXth century cookbooks. I’ve also been collecting found photography for a few years now, mainly from flea markets.

                      Otherwise, things that might be considered — if I were more inclined for self-reflection — extra jobs, but which I do (mostly) pro bono and/or for the fun of it:

                      I’ve been involved in a yearly event that tries to raise awareness around reintegrating the river passing through the city into the urban fabric, and we’ve just wrapped up our fourth edition.

                      I help run a tiny art gallery where we host established and upcoming artists. I occasionally chip in with web development things for local cultural institutions.

                      (I have found that these types of organizations can always use a bit of help, and just making yourself available will set you up for rewarding work.)

                      Finally, I can’t get enough of thinking about code and systems, so I’m always tinkering with some stupid little open-source project.

                      I’d like to get good at writing, archiving, and making books but I tend to start too many things at once so I’m delaying these indefinitely :-)

                      1. 3

                        I think digitizing books is an incredible hobby! You might be interested in this project: Memory of the World.

                        1. 2

                          Wow, I love it. Thanks for sharing!

                          I’m still tweaking my toolchain for digitizing the books. I have most of the parts for my guerrilla kit: Scannable on iOS + Google Vision API (because other OCR solutions are just terrible at Romanian text, especially with the older glyphs with which the language was written at the time most books I want to parse were written). Now I just need to wrap up the glue that sends a batch of images to the API and collates the results. (WIP here: https://github.com/llll-org/vizor)
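Collation glue like that can be sketched independently of the OCR backend; here `ocr` is a hypothetical callable standing in for a Vision API wrapper (the names are mine, not the vizor project’s):

```python
from concurrent.futures import ThreadPoolExecutor

def collate_batch(image_paths, ocr, max_workers=4):
    """Send a batch of page images to an OCR callable and collate the
    results in page order. `ocr` is any str -> str callable, e.g. a
    wrapper around the Vision API (hypothetical here)."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        texts = list(pool.map(ocr, image_paths))
    return "\n\n".join(texts)
```

pool.map preserves input order, so pages come back collated correctly even when requests finish out of order.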

                          P.S. If anyone has any references for open-source crowdsourced annotation (e.g. highlighting blocks on an image and manually typing in what you see), I’d appreciate the links. I know I have some buried in my bookmarks, but lately I’ve found that my searching / digging skills fail me more consistently than I remember — and switching to DuckDuckGo doesn’t quite help :-)

                          1. 2

                            The people at Zooniverse have a couple of projects around highlighting and transcribing texts, but I don’t know the requirements for adding a new project.

                            1. 1

                              Ah yes, this sounds like the one I was thinking of, but I also know there’s a similar open-source platform. I’ll report back if I find it.

                            2. 1

                              This is a semi-pro hardware setup: https://diybookscanner.org/

                              Jonathan is a great resource.

                          2. 2

                            I love old cookbooks. They are amazing documents. When you do digitize them and get them up, do post them here!

                            When you make books, do you hand bind them?

                            1. 2

                              Right now I think I can produce a decent digital book and work with the printer to get it published. (Did a couple of small-run ones). But yes, the dream would be to learn how to hand bind them! As for the cookbooks, I’ll let everyone know when I have something public to show, but bear in mind these are in the (old) Romanian language :P

                          1. 4

                            Hopefully they only hide www. when it is exactly at the start of the domain name, leaving duplicates and domains in the middle (like notriddle.www.github.io and www.www.lobste.rs) alone.
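The “only when it’s exactly at the start” rule being hoped for might look like this sketch (the function and its rules are hypothetical, not Chrome’s actual logic):

```python
def display_host(host: str) -> str:
    """Elide a leading "www." label only when it's unambiguous:
    the label appears once, at the very start, and nowhere else."""
    labels = host.split(".")
    if labels[0] == "www" and "www" not in labels[1:]:
        return ".".join(labels[1:])
    return host
```

With these rules, www.example.com displays as example.com, while notriddle.www.github.io and www.www.lobste.rs are left untouched.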

                            1. 43

                              How about just leaving the whole thing alone? URI/URLs are external identifiers. You don’t change someone’s name because it’s confusing. Such an arrogant move from Google.

                              1. 11

                                Because we’re Google. We don’t ~~have to care~~ know better than you.

                                1. 3

                                  Eventually the URL bar will be so confusing and arbitrary users will just have to search google for everything.

                                  1. 5

                                    Which is, of course, Google’s plan and intent all along. Wouldn’t surprise me if they are aiming to remove URLs from the omnibar completely at some point.

                                2. 3

                                  It’s the same with Safari on Mac - not only do they hide the subdomain but everything else from the URL root onwards too. Dreadful, and the single worst (/only really bad) thing about Safari’s UI.

                                  1. 3

                                    You don’t change someone’s name because it’s confusing

                                    That’s why they’re going to try to make it a standard.
                                    They will probably also want to limit the ports that you can use with the www subdomain, or at least propose that some be hidden, like 8080

                                    1. 2

                                      Perhaps everyone should now move to w3.* or web.* names just to push back! Serious suggestion.

                                    2. 1

                                      Indeed, but I still think it is completely unnecessary and I don’t get how this “simplifies” anything

                                    1. 3

                                      I’m going to look at prices for the textbook for the Japanese classes I started last week :) Books are pricey over here lately, but there are a couple of places around that have it.

                                        1. 2

                                          The whole thing is great, but one idea that seems particularly useful in arbitrary languages without regard to how it fits with other features is to specify the list of globals used by a function. In Python-like syntax, imagine this:

                                          def draw_quad(origin, left, up) [m]:
                                              m.Position3fv(origin - left - up)
                                              m.Position3fv(origin + left - up)
                                              m.Position3fv(origin + left + up)
                                              m.Position3fv(origin - left + up)
                                          

                                          Now you get a guarantee that the function uses zero globals besides m.
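Python doesn’t have that syntax, but a decorator can approximate the same guarantee today by scanning the compiled function for global loads (a sketch: the allowed-list idea is from the comment above, the bytecode check is an assumption of mine):

```python
import builtins
import dis

def uses_globals(*allowed):
    """Fail at definition time if the function reads any global
    besides the declared ones (builtins are always permitted)."""
    def check(func):
        # collect every name the function loads from module/global scope
        loaded = {ins.argval for ins in dis.get_instructions(func)
                  if ins.opname == "LOAD_GLOBAL"}
        extra = loaded - set(allowed) - set(dir(builtins))
        if extra:
            raise NameError(f"{func.__name__} uses undeclared globals: {sorted(extra)}")
        return func
    return check

@uses_globals("m")
def draw_quad(origin, left, up):
    m.Position3fv(origin - left - up)
    m.Position3fv(origin + left - up)
    m.Position3fv(origin + left + up)
    m.Position3fv(origin - left + up)
```

The check runs once at definition time, so it behaves like checkable documentation rather than a runtime cost.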

                                          1. 4

                                            In PHP you have something like that: global variables are not accessible from inside functions unless you specifically declare each one you want.

                                            $m = new M();
                                            function draw_quad($origin, $left, $up){
                                                global $m; // or $m = $GLOBALS['m'];
                                                $m->Position3fv($origin - $left - $up);
                                                $m->Position3fv($origin + $left - $up);
                                                $m->Position3fv($origin + $left + $up);
                                                $m->Position3fv($origin - $left + $up);
                                            }

                                            In practice, I haven’t found global variables useful other than the contextual ones ($_POST, $_GET, $_SESSION), which are superglobal and always defined.

                                            1. 2

                                              I’d like to see something similar but generalised to “contextual” environmental bindings, rather than traditional global vars. And a compiler that ensures that somewhere in all call chains the binding exists. But you might want a global in some cases, or a “threadlocal”, or an “import” of some sort, or something like the context in react, etc.

                                              Some mechanism in which the compiler makes sure the environmental dependencies are fulfilled, without necessarily requiring that value be propagated explicitly through each owner/container between provider and consumer.
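Python’s contextvars module gets part of the way there at runtime, though without the compile-time guarantee (all names below are invented for illustration):

```python
from contextvars import ContextVar, copy_context

# A "contextual" binding: consumers read it; some caller up the stack
# provides it; nothing in between passes it along explicitly.
current_m = ContextVar("current_m")

def draw_quad(origin, left, up):
    m = current_m.get()  # raises LookupError if no provider ever bound it
    m.append(("quad", origin, left, up))  # stand-in for m.Position3fv(...)

def render_scene():
    # note: no m parameter anywhere in this call chain
    draw_quad(0, 1, 2)

sink = []
def provide_and_render():
    current_m.set(sink)  # the "provider" end of the binding
    render_scene()

copy_context().run(provide_and_render)
```

The provider binds current_m once near the top of the call chain; everything below reads it without threading m through each signature, and unfulfilled bindings fail loudly instead of silently.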

                                              1. 4

                                                I can’t find it but Scala has an extra context var that is passed with function invocation.

                                                And early Lisps had dynamic scope, meaning that a var bound to the next occurrence up the call stack.

                                                Both of these mechanisms supply the building blocks for AoP, so that a programmer can mixin orthogonal properties.

                                                1. 3

                                                  And early Lisps had dynamic scope, meaning that a var bound to the next occurrence up the call stack.

                                                  Today they still have it - see DEFVAR and DEFPARAMETER in Common Lisp.

                                                  1. 2

                                                    I can’t find it but Scala has an extra context var that is passed with function invocation.

                                                    In Scala you can use implicit parameters for this:

                                                    def foo(origin: Vec3, left: Vec3, up: Vec3)(implicit m: Context) {
                                                        m.Position3fv(origin - left - up)
                                                        m.Position3fv(origin + left - up)
                                                        m.Position3fv(origin + left + up)
                                                        m.Position3fv(origin - left + up)
                                                    }
                                                    

                                                    In Haskell you could use a reader/writer/state monad thingy. In Koka or Eff you could use effects.

                                                    1. 2

                                                      Yeah, Scala’s context is probably closest to what I’m thinking of, from what I know of it.

                                                    2. 4

                                                      You can kinda get this with effect types. Effect types let you label certain functions as using resource A or B, and then you can have a documentation mechanism for what dependencies are used, without passing stuff around.

                                                      It can still get a bit heavy (at least it is in Purescript), but less so than dependency injection

                                                    3. 1

                                                      A compiler or analyzer should be able to tell you that just from what variables or expressions go into the function. A previously-declared global would be one of the arguments. Why do we need to declare it again in the function definition?

                                                      1. 1

                                                        See my final sentence. “Now you get a guarantee that the function uses zero globals besides m.” The program documents/communicates what it uses. The compiler ensures the documentation is always up to date.

                                                        In my example, m is not one of the arguments. Because m is a global, lexically accessible anyway in the body of the function. There’s no redundancy here.

                                                        1. 0

                                                          I’m saying a compiler pass should be able to do the same thing without a language feature. I think it won’t be important to that many people. So, it will probably be optional. If optional, better as a compiler pass or static analysis than a language feature. It might also be an easy analysis.

                                                          1. 3

                                                            You’re missing the point. It’s a list of variables that the code can access anyway. What would a compiler gain by analyzing what globals the function accesses?

                                                            There are many language features that help the compiler do its job. This isn’t one of them. The whole point is documentation.

                                                            (Yes, you could support disabling the feature, using say syntax like [*]. C++ has this. Just bear in mind that compilers also don’t require code comments, and yet comments are often a good idea. Similarly, even languages with type inference encourage documenting types in function headers.)

                                                      2. 1

                                                        What if the Position3fv method uses global variable n? You also need to specify that for draw_quad. This quickly blows up like checked exceptions in Java and people want shortcuts.

                                                        1. 2

                                                          The problem with checked exceptions is that they discourage people from using exceptions. But there’s no problem in discouraging people from using global variables.

                                                          I don’t mean to imply that I want all code to state what it needs all the time. In general I’m a pretty liberal programmer. It just seems like a good idea to give people the option to create “checkable documentation” about what globals a piece of code requires.

                                                      1. 1

                                                        If I understand the post correctly, this would be too big and obvious a failure. I kind of can’t believe Debian and Ubuntu never thought about that.

                                                        Did someone try injecting a manipulated package? I’d assume that at least the signed manifest contains not only URLs and package version but also some kind of shasum at least?

                                                        1. 2

                                                          Looks like that’s exactly what apt is doing; it verifies the checksum served in the signed manifest: https://wiki.debian.org/SecureApt#How_to_manually_check_for_package.27s_integrity

                                                          The document mentions it uses MD5 though - maybe there’s a vector for collisions there - but it’s not as trivial as the post indicates, I’d say.

                                                          Maybe there’s marketing behind it? Packagecloud offers repositories with TLS transport…

                                                          1. 2

                                                            Modern apt repos contain SHA256 sums of all the metadata files, signed by the Debian gpg key & each individual package metadata contains that package’s SHA256 sum.

                                                            That said, they’re not wrong that serving apt repos over anything but https is inexcusable in the modern world.

                                                            1. 2

                                                              You must live on a planet where there are no users who live behind bad firewalls and MITM proxies that break HTTPS, because that’s why FreeBSD still doesn’t use HTTPS for … anything? I guess we have it for the website and SVN, but not for packages or portsnap.

                                                              1. 1

                                                                There’s nothing wrong with being able to use http if you have to: https should be the default however.

                                                                1. 1

                                                                  https is very inconvenient to do on community run mirrors

                                                                  See also: clamav antivirus

                                                                  1. 1

                                                                    In the modern world with letsencrypt it’s nowhere near as bad as it used to be, though.

                                                                    1. 1

                                                                      I don’t think I would trust third parties to be able to issue certificates under my domain.

                                                                      It is even more complicated for clamav where servers may be responding to many different domain names based on which pools they are in. You would need multiple wildcards.

                                                              2. 1

                                                                each individual package metadata contains that package’s SHA256 sum

                                                                Is the shasum of every individual package not included in the (verified) manifest? That would be a major issue then, as it can be forged alongside the package.

                                                                But if it is, then forging packages would require SHA256 collisions, which should be safe, and package integrity would be verified.

                                                                Obviously, serving via TLS won’t hurt security, but (given that letsencrypt is fairly young) it depends on a centralized CA structure and adds costs - and it arguably adds a little more privacy around which packages you install.

                                                                1. 3

                                                                  A few days ago I was searching about this same topic after seeing the apt update log, and found this site with some ideas about it, https://whydoesaptnotusehttps.com, including the point about privacy. I think the cost of intermediate cache proxies and bandwidth for the distribution servers probably adds up to more than the cost of a TLS certificate (many distros offer alternative torrent files for the live CD to offload this cost).

                                                                  Also, the packagecloud article implies that serving over TLS removes the risk of MitM, but it just makes it harder - and without certificate pinning, only a little. I’d mostly chalk this article up to marketing; there are calls-to-action sprinkled through the text.

                                                                  1. 1

                                                                    https://whydoesaptnotusehttps.com

                                                                    Good resource, sums it up pretty well!

                                                                    Edit: It doesn’t answer the question about whether SHA256 sums for each individual package are included in the manifest. But if not, all of this would make no sense, so I assume and hope so.

                                                                    1. 2

                                                                      Hi. I’m the author of the post – I strongly encourage everyone to use TLS.

                                                                      SHA256 sums of the packages are included in the metadata, but this does nothing to prevent downgrade attacks, replay attacks, or freeze attacks.

                                                                      I’ve submitted a pull request to the source of “whydoesaptnotusehttps” to correct the content of the website, as it implies several incorrect things about the APT security model.

                                                                      Please re-read my article and the linked academic paper. The solution to the bugs presented is to simply use TLS, always. There is no excuse not to.

                                                                      1. 2

                                                                        TLS is a good idea, but it’s not sufficient (I work on TUF). TUF is the consequence of this research, you can find other papers about repository security (as well as current integrations of TUF) on the website.

                                                                        1. 1

                                                                          Yep, TUF is great – I’ve read quite a bit about it. Is there an APT TUF transport? If not, it seems like the best APT users can do is use TLS and hope someone will write apt-transport-tuf for now :)

                                                                        2. 1

                                                                          Thanks for the post and the research!

                                                                          It’s not that easy to switch to https: a lot of repositories (including the official Ubuntu ones) do not support https. Furthermore, most cloud providers provide their own mirrors and caches. There’s no way to verify whether the whole “apt chain” of package uploads, mirrors, and caches is using https. Even if you enforce HTTPS, the described vectors (if I understood correctly) remain an issue in the mirror/cache scenario.

                                                                          You may be right that the current mitigations for the said vectors are not sufficient, but I feel like a security model in package management that relies on TLS is simply not sufficient either - the mitigation of the attack vectors you’ve found needs to be something else, e.g. signing and verifying the packages upon installation.

                                                                    2. 2

                                                                      Is the shasum of every individual package not included in the (verified) manifest? That would be a major issue then, as it can be forged alongside the package.

                                                                      Yes, there’s a chain of trust: the SHA256 sum of each package is contained within the repo manifest file, which is ultimately signed by the Debian archive key. It’s a bit like a git repository - a chain of SHA256 sums of which only the final one needs to be signed to trust the whole.

                                                                      There are issues with http downloads - eg it reveals which packages you download, so by inspecting the data flow an attacker could find out which packages you’ve downloaded and know which attacks would be likely to be successful - but package replacement on the wire isn’t one of them.
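That verification step can be sketched like so (simplified: real apt first checks a gpg signature over the Release/manifest files; here that step is assumed to have happened and the manifest is represented by a plain dict):

```python
import hashlib

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_download(pkg_name: str, pkg_bytes: bytes, signed_manifest: dict) -> bool:
    """Accept a package fetched over plain http only if its SHA256
    matches the entry in the (already signature-verified) manifest."""
    expected = signed_manifest.get(pkg_name)
    return expected is not None and sha256(pkg_bytes) == expected
```

This catches on-the-wire package replacement, but not downgrade/replay/freeze attacks, where an attacker serves an older but still validly signed manifest.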

                                                              1. 4

                                                                 There’s a rule on job postings, but what should the structure of availability postings be?

                                                                1. 1

                                                                  Whatever would be useful? I’m less worried about individuals being disruptive; I do think it’s inevitable from recruiters. But please do suggest a format if you have a good idea.

                                                                1. 3

                                                                  Attempting to talk myself out of writing an RSS feed server. I’ve previously tried a couple of paid-for services and found issues with them (at the time, haven’t re-evaluated), and also tried out a mac feed app I love but the sync service has issues and the support team just goes quiet on them (Majorly sad about that.) Tried out ttrss but found it doesn’t do things I consider essential, like importing an OPML of feeds. I don’t want to write a feed server, but I can’t find anything to solve my problem. (Decent fetcher/parser, OPML import, supports fever API. Probably doesn’t expose a web reader interface.)

                                                                   Finally got to a place where I’m happy with my hetzner box(es). Migrated en masse from the slower box to the faster box & cancelled the slower one. (Moving VMs between SmartOS hyps is pretty simple. Yay ZFS.) Got the usual LAMP stack setup to run my stuff, split out nicely enough and managed with puppet where appropriate. Also stuck a load-balancer on the public IP, so internally different things can run in isolated zones without using up more IPs. Simples ;-)

                                                                  Also spent yesterday hiking up the Wrekin (Big hill in Shropshire UK) in the snow, then building a snow-woman larger than me & an igloo with the kids in the garden. First decent snow we’ve had in 7 years, gotta make the most of that.

                                                                  1. 7

                                                                     I have been using ttrss for a few years now without a problem, mostly for webcomics and news. There is actually an OPML importer in the preferences menu.

                                                                    1. 1

                                                                      Ah useful to know, thanks

                                                                    2. 5

                                                                      I like RSS (and ATOM) themselves (sure they could be improved, but there are enough semi-documented versions as it is!) but I think the tooling suffers from a lot of not-invented-here syndrome and/or lack of UNIX-style “do one thing well”.

                                                                      For example, my current setup converts RSS posts to maildir messages. There are loads of tools to do this which will periodically check a URL and convert any new posts they find. Not only is this redundant (cron can execute commands periodically and curl can fetch URLs), but the only compelling feature of these tools (their conversion algorithm) can (AFAIK) only be invoked on URLs; yet I also have RSS files I’d like to convert. At one point I was firing up a local web server to make these files available to these tools, but I found this so ridiculous I eventually decided to just fork ‘feed2maildir’ and rip out all of its “features” to get a standalone rss-to-maildir converter.
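A standalone file-to-maildir converter along those lines fits in a few lines of stdlib Python (a sketch assuming plain RSS 2.0 element names, not the actual feed2maildir fork):

```python
import mailbox
import xml.etree.ElementTree as ET
from email.message import EmailMessage

def rss_to_maildir(rss_path: str, maildir_path: str) -> int:
    """Convert a local RSS file to maildir messages, one per <item>.
    No HTTP involved: fetching is left to cron + curl. Returns the
    number of messages written."""
    root = ET.parse(rss_path).getroot()
    box = mailbox.Maildir(maildir_path, create=True)
    count = 0
    for item in root.iter("item"):
        msg = EmailMessage()
        msg["Subject"] = item.findtext("title", "(no title)")
        msg["From"] = "rss@localhost"
        msg.set_content(item.findtext("description", ""))
        box.add(msg)
        count += 1
    return count
```

cron plus curl can then handle the periodic fetching, keeping each tool doing one thing.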

                                                                      1. 1

                                                                        I like RSS (and ATOM) themselves (sure they could be improved, but there are enough semi-documented versions as it is!) but I think the tooling suffers from a lot of not-invented-here syndrome and/or lack of UNIX-style “do one thing well”.

                                                                        Thanks for posting this. I’ve been thinking about it on and off since you mentioned it, and I finally came up with a pipeline of smaller tools, which I prefer to a single binary that does everything backed by one database. I guess this means I’m writing a feed server. :->

                                                                    1. 7

                                                                      I’m reading Catherine the Great for fun and Emergency Care and Transportation of the Sick and Injured to prep for the upcoming EMT classes. I just finished The Storm Before the Storm, which was fantastic. I love everything Mike Duncan does.

                                                                      For classic science fiction, here are some of my favorites:

                                                                      • Dune, Frank Herbert. The first book deserved every ounce of praise it got. The second book is pretty good. I wasn’t a fan of the third book. The fourth is awful, and they just get worse from there.
                                                                      • Lord of Light, Roger Zelazny. One of my favorite SF books ever. Con artist pretending to be Buddha tries to free humanity from con artists pretending to be Hindu gods, in the process accidentally recreating most of the Buddhist myths.
                                                                      • Book of the New Sun, Gene Wolfe. A pentalogy that was one of the foundational works in the “Dying Earth” style. A professional torturer is exiled from the guild, wanders around a bunch, and sort of becomes Jesus Christ. I like it as an example of how you can write a religious novel without it being overbearing or alienating to nonbelievers. Another good book like that is Small Gods by Terry Pratchett, but that’s more fantasy than SF.
                                                                      • Embassytown, China Mieville. Not a classic (came out in the past decade), but my favorite Mieville.
                                                                      1. 3

                                                                        I love Lord of Light; it’s one of the few books I have read more than once.

                                                                        1. 3

                                                                          EMT classes/exams aren’t too bad. Practicals/clinicals are nerve-wracking. Using EMR/foosoftware is like stepping on Legos.

                                                                          Check this out if you have five minutes: emin5. Her YouTube channel is filled with useful videos.

                                                                        1. 2

                                                                          I’m currently reading “The Name of the Rose”, but going slowly on it, and I also got hooked on reading the scp-wiki while commuting.

                                                                          1. 2

                                                                            You might consider giving this a look. I found it really enhanced my appreciation and understanding of the book. https://www.amazon.com/Key-Name-Rose-Translations-Non-English/dp/0472086219

                                                                            1. 2

                                                                              I guess it would be really nice to understand what all the Latin phrases mean.

                                                                          1. 1

                                                                            At work: same routine apps, no new or interesting projects in sight; expanding the internal framework when I have time.

                                                                            At home: looking at job postings; it would be nice to move to Europe next year. Even though I have almost ten years of experience, I’m lacking a few of the technologies the most interesting projects require, unit testing most prominently. This week I’ll try to start on side projects for learning and testing things; I was thinking of a discounted-game finder that scrapes bundle sites.