1. 2

      The whole thing is great, but one idea that seems particularly useful in arbitrary languages, regardless of how it fits with other features, is to specify the list of globals used by a function. In Python-like syntax, imagine this:

      def draw_quad(origin, left, up) [m]:
          m.Position3fv(origin - left - up)
          m.Position3fv(origin + left - up)
          m.Position3fv(origin + left + up)
          m.Position3fv(origin - left + up)
      

      Now you get a guarantee that the function uses zero globals besides m.

      1. 4

        In PHP you have something like that: global variables are not accessible from inside functions unless you specifically allow each one you want

        $m = new M();
        function draw_quad($origin, $left, $up){
            global $m; // or $m = $GLOBALS['m'];
            $m->Position3fv($origin - $left - $up);
            $m->Position3fv($origin + $left - $up);
            $m->Position3fv($origin + $left + $up);
            $m->Position3fv($origin - $left + $up);
        }

        In practice, I haven’t found global variables useful other than the contextual ones ($_POST, $_GET, $_SESSION), which are superglobal and always defined.

        1. 2

          I’d like to see something similar but generalised to “contextual” environmental bindings, rather than traditional global vars, with a compiler that ensures that somewhere in all call chains the binding exists. But you might want a global in some cases, or a “threadlocal”, or an “import” of some sort, or something like the context in React, etc.

          Some mechanism in which the compiler makes sure the environmental dependencies are fulfilled, without necessarily requiring that the value be propagated explicitly through each owner/container between provider and consumer.
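
          As a rough illustration of the runtime half of this idea, here’s a minimal sketch using Python’s stdlib contextvars module (the names renderer and draw_quad are made up for illustration). A consumer reads the binding, a provider anywhere up the call chain sets it, and nothing in between has to pass it along - though note this only fails at run time, whereas the wish above is for a compile-time check:

          import contextvars

          # The environmental dependency: no default, so an unfulfilled
          # binding fails loudly at the point of use.
          renderer = contextvars.ContextVar("renderer")

          def draw_quad(origin, left, up):
              m = renderer.get()  # raises LookupError if no provider bound it
              m.Position3fv(origin - left - up)
              m.Position3fv(origin + left - up)
              m.Position3fv(origin + left + up)
              m.Position3fv(origin - left + up)

          # A provider, arbitrarily far up the call chain:
          # renderer.set(gl_context); any callee may then use draw_quad.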

          1. 4

            I can’t find it, but Scala has an extra context var that is passed with function invocation.

            And early Lisps had dynamic scope, meaning that a variable reference binds to the nearest occurrence up the call stack.

            Both of these mechanisms supply the building blocks for AOP (aspect-oriented programming), so that a programmer can mix in orthogonal properties.

            1. 3

              And early Lisps had dynamic scope, meaning that a variable reference binds to the nearest occurrence up the call stack.

              Today they still have it - see DEFVAR and DEFPARAMETER in Common Lisp.

              1. 2

                I can’t find it, but Scala has an extra context var that is passed with function invocation.

                In Scala you can use implicit parameters for this:

                def foo(origin: Vec3, left: Vec3, up: Vec3)(implicit m: Context): Unit = {
                    m.Position3fv(origin - left - up)
                    m.Position3fv(origin + left - up)
                    m.Position3fv(origin + left + up)
                    m.Position3fv(origin - left + up)
                }
                

                In Haskell you could use a reader/writer/state monad thingy. In Koka or Eff you could use effects.

                1. 2

                  Yeah, Scala’s context is probably closest to what I’m thinking of, from what I know of it.

                2. 4

                  You can kinda get this with effect types. Effect types let you label certain functions as using resource A or B, and then you can have a documentation mechanism for what dependencies are used, without passing stuff around.

                  It can still get a bit heavy (at least it is in PureScript), but less so than dependency injection.
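
                  Purely as a toy illustration (and not real effect types, which are checked statically): in Python you could approximate the labelling half with a decorator that records each function’s declared resources, so tooling can at least read the dependency list without the value being threaded through:

                  def uses(*resources):
                      """Attach a declared-resources label to a function."""
                      def wrap(fn):
                          fn.effects = frozenset(resources)
                          return fn
                      return wrap

                  @uses("gl_context")
                  def draw_quad(origin, left, up):
                      pass  # body elided

                  print(draw_quad.effects)  # frozenset({'gl_context'})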

                3. 1

                  A compiler or analyzer should be able to tell you that just from what variables or expressions go into the function. A previously-declared global would be one of the arguments. Why do we need to declare it again in the function definition?

                  1. 1

                    See my final sentence. “Now you get a guarantee that the function uses zero globals besides m.” The program documents/communicates what it uses. The compiler ensures the documentation is always up to date.

                    In my example, m is not one of the arguments, because m is a global, lexically accessible anyway in the body of the function. There’s no redundancy here.

                    1. 0

                      I’m saying a compiler pass should be able to do the same thing without a language feature. I think it won’t be important to that many people. So, it will probably be optional. If optional, better as a compiler pass or static analysis than a language feature. It might also be an easy analysis.
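
                      For what it’s worth, the analysis really is easy in some languages. A throwaway sketch (a hypothetical helper, not an existing tool) using Python’s stdlib ast module to list the names a function reads but never binds:

                      import ast

                      def free_names(source):
                          """Names a function reads but never binds (globals/builtins)."""
                          func = ast.parse(source).body[0]  # assume one function def
                          bound = {a.arg for a in func.args.args}
                          loads = set()
                          for node in ast.walk(func):
                              if isinstance(node, ast.Name):
                                  if isinstance(node.ctx, ast.Load):
                                      loads.add(node.id)
                                  else:
                                      bound.add(node.id)  # assigned names are local
                          return loads - bound

                      src = "def f(a, b):\n    return m.dot(a, b)\n"
                      print(free_names(src))  # {'m'}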

                      1. 3

                        You’re missing the point. It’s a list of variables that the code can access anyway. What would a compiler gain by analyzing what globals the function accesses?

                        There are many language features that help the compiler do its job. This isn’t one of them. The whole point is documentation.

                        (Yes, you could support disabling the feature, using say syntax like [*]. C++ has this. Just bear in mind that compilers also don’t require code comments, and yet comments are often a good idea. Similarly, even languages with type inference encourage documenting types in function headers.)

                  2. 1

                    What if the Position3fv method uses global variable n? You’d also need to specify that for draw_quad. This quickly blows up like checked exceptions in Java, and people will want shortcuts.

                    1. 2

                      The problem with checked exceptions is that they discourage people from using exceptions. But there’s no problem in discouraging people from using global variables.

                      I don’t mean to imply that I want all code to state what it needs all the time. In general I’m a pretty liberal programmer. It just seems like a good idea to give people the option to create “checkable documentation” about what globals a piece of code requires.

                  1. 1

                    If I understand the post correctly, this seems like too big and obvious a failure. I kind of can’t believe Debian and Ubuntu never thought about that.

                    Did someone try injecting a manipulated package? I’d assume that the signed manifest contains not only URLs and package versions but also some kind of shasum, at least?

                    1. 2

                      Looks like that’s exactly what apt is doing; it verifies the checksum served in the signed manifest: https://wiki.debian.org/SecureApt#How_to_manually_check_for_package.27s_integrity

                      The document mentions it uses MD5, though; maybe there’s a vector for collisions here, but it’s not as trivial as the post indicates, I’d say.

                      Maybe there’s marketing behind it? Packagecloud offers repositories with TLS transport…

                      1. 2

                        Modern apt repos contain SHA256 sums of all the metadata files, signed by the Debian gpg key, and each individual package’s metadata contains that package’s SHA256 sum.

                        That said, they’re not wrong that serving apt repos over anything but https is inexcusable in the modern world.

                        1. 2

                          You must live on a planet where there are no users who live behind bad firewalls and MITM proxies that break HTTPS, because that’s why FreeBSD still doesn’t use HTTPS for … anything? I guess we have it for the website and SVN, but not for packages or portsnap.

                          1. 1

                            There’s nothing wrong with being able to use http if you have to; https should be the default, however.

                            1. 1

                              https is very inconvenient to do on community-run mirrors

                              See also: clamav antivirus

                              1. 1

                                In the modern world with letsencrypt it’s nowhere near as bad as it used to be, though.

                                1. 1

                                  I don’t think I would trust third parties to be able to issue certificates under my domain.

                                  It is even more complicated for clamav, where servers may be responding to many different domain names based on which pools they are in. You would need multiple wildcards.

                          2. 1

                            each individual package’s metadata contains that package’s SHA256 sum

                            Is the shasum of every individual package not included in the (verified) manifest? That would be a major issue then, as it could be forged alongside the package.

                            But if it is, then forging packages would require SHA256 collisions, which should be infeasible. And package integrity is verified.

                            Obviously, serving via TLS won’t hurt security, but it does (given that letsencrypt is fairly young) depend on a centralized CA structure and add costs - and arguably it adds a little more privacy about which packages you install.

                            1. 3

                              A few days ago I was searching about this same topic after seeing the apt update log, and found this site with some ideas about it, including the point about privacy: https://whydoesaptnotusehttps.com
                              I think the point about intermediate cache proxies and bandwidth use for the distribution servers probably adds more than the cost of a TLS certificate (many offer alternative torrent files for the live CD to offload this cost).

                              Also, the packagecloud article implies that serving over TLS removes the risk of MitM, but it just makes it harder - and without certificate pinning, only a little. I’d mostly chalk this article up to marketing; there are calls to action sprinkled through the text.

                              1. 1

                                https://whydoesaptnotusehttps.com

                                Good resource, sums it up pretty well!

                                Edit: It doesn’t answer the question about whether SHA256 sums for each individual package are included in the manifest. But if not, all of this would make no sense, so I assume and hope so.

                                1. 2

                                  Hi. I’m the author of the post – I strongly encourage everyone to use TLS.

                                  SHA256 sums of the packages are included in the metadata, but this does nothing to prevent downgrade attacks, replay attacks, or freeze attacks.

                                  I’ve submitted a pull request to the source of “whydoesaptnotusehttps” to correct the content of the website, as it implies several incorrect things about the APT security model.

                                  Please re-read my article and the linked academic paper. The solution to the bugs presented is to simply use TLS, always. There is no excuse not to.

                                  1. 2

                                    TLS is a good idea, but it’s not sufficient (I work on TUF). TUF is the consequence of this research; you can find other papers about repository security (as well as current integrations of TUF) on the website.

                                    1. 1

                                      Yep, TUF is great – I’ve read quite a bit about it. Is there an APT TUF transport? If not, it seems like the best APT users can do is use TLS and hope someone will write apt-transport-tuf for now :)

                                    2. 1

                                      Thanks for the post and the research!

                                      It’s not that easy to switch to https: a lot of repositories (including Ubuntu’s official ones) do not support https. Furthermore, most cloud providers provide their own mirrors and caches. There’s no way to verify whether the whole “apt-chain” of package uploads, mirrors, and caches is using https. Even if you enforce HTTPS, the described vectors (if I understood correctly) remain an issue in the mirrors/caches scenario.

                                      You may be right that current mitigations for the said vectors are not sufficient, but I feel like a security model in package management that relies on TLS alone is simply not sufficient either, and the mitigation of the attack vectors you’ve found needs to be something else - e.g. signing and verifying the packages upon installation.

                                2. 2

                                  Is the shasum of every individual package not included in the (verified) manifest? That would be a major issue then, as it could be forged alongside the package.

                                  Yes, there’s a chain of trust: the SHA256 sum of each package is contained within the repo manifest file, which is ultimately signed by the Debian archive key. It’s a bit like a git repository - a chain of SHA256 sums of which only the final one needs to be signed to trust the whole.
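
                                  The check at the bottom of that chain is then just a hash comparison; a toy sketch in Python (not apt’s actual code):

                                  import hashlib

                                  def matches_manifest(path, expected_sha256):
                                      """True if the file’s SHA256 matches the signed manifest’s entry."""
                                      h = hashlib.sha256()
                                      with open(path, "rb") as f:
                                          for chunk in iter(lambda: f.read(8192), b""):
                                              h.update(chunk)
                                      return h.hexdigest() == expected_sha256

                                  # Only the manifest’s signature needs gpg; every file it
                                  # lists can then be verified offline like this.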

                                  There are issues with http downloads - e.g. an attacker inspecting the data flow could find out which packages you’ve downloaded and know which attacks would be likely to succeed - but package replacement on the wire isn’t one of them.

                          1. 4

                            There’s a rule on job postings, but what should the structure of availability postings be?

                            1. 1

                              Whatever would be useful? I’m less worried about individuals being disruptive; I do think it’s inevitable from recruiters. But please do suggest a format if you have a good idea.

                            1. 3

                              Attempting to talk myself out of writing an RSS feed server. I’ve previously tried a couple of paid-for services and found issues with them (at the time; I haven’t re-evaluated), and also tried out a mac feed app I love, but its sync service has issues and the support team just goes quiet on them. (Majorly sad about that.) Tried out ttrss but found it doesn’t do things I consider essential, like importing an OPML of feeds. I don’t want to write a feed server, but I can’t find anything that solves my problem. (Decent fetcher/parser, OPML import, supports the Fever API. Probably doesn’t expose a web reader interface.)

                              Finally got to a place where I’m happy with my hetzner box(es). En masse migrated from the slower box to the faster box & cancelled the slower one. (Moving VMs between SmartOS hyps is pretty simple. Yay ZFS.) Got the usual LAMP stack set up to run my stuff, split out nicely enough and managed with puppet where appropriate. Also stuck a load-balancer on the public IP, so internally different things can run in isolated zones without using up more IPs. Simples ;-)

                              Also spent yesterday hiking up the Wrekin (Big hill in Shropshire UK) in the snow, then building a snow-woman larger than me & an igloo with the kids in the garden. First decent snow we’ve had in 7 years, gotta make the most of that.

                              1. 7

                                I have been using ttrss for a few years now without a problem, mostly for webcomics and news. There is actually an OPML importer in the preferences menu.

                                1. 1

                                  Ah useful to know, thanks

                                2. 5

                                  I like RSS (and Atom) themselves (sure, they could be improved, but there are enough semi-documented versions as it is!), but I think the tooling suffers from a lot of not-invented-here syndrome and/or a lack of UNIX-style “do one thing well”.

                                  For example, my current setup converts RSS posts to maildir messages. There are loads of tools to do this which will periodically check a URL and convert any new posts they find. Not only is this redundant (cron can execute commands periodically and curl can fetch URLs), but the only compelling feature of these tools (their conversion algorithm) can (AFAIK) only be invoked on URLs; yet I also have RSS files I’d like to convert. At one point I was firing up a local web server to make these files available to these tools, but I found this so ridiculous I eventually decided to just fork ‘feed2maildir’ and rip out all of its “features” to get a standalone rss-to-maildir converter.
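
                                  The core of such a standalone converter is genuinely small. A rough sketch (not my actual fork of feed2maildir; feedparser is third-party, mailbox is stdlib, and the names are illustrative):

                                  import mailbox
                                  import feedparser  # third-party: pip install feedparser
                                  from email.message import EmailMessage

                                  def feed_to_maildir(feed_path, maildir_path):
                                      feed = feedparser.parse(feed_path)  # local files work; no web server needed
                                      box = mailbox.Maildir(maildir_path, create=True)
                                      for entry in feed.entries:
                                          msg = EmailMessage()
                                          msg["Subject"] = entry.get("title", "(no title)")
                                          msg["From"] = "rss2maildir@localhost"  # placeholder address
                                          msg.set_content(entry.get("summary", entry.get("link", "")))
                                          box.add(msg)

                                  # cron supplies “periodically” and curl supplies “fetch a URL”;
                                  # this only does the one remaining thing.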

                                  1. 1

                                    I like RSS (and Atom) themselves (sure, they could be improved, but there are enough semi-documented versions as it is!), but I think the tooling suffers from a lot of not-invented-here syndrome and/or a lack of UNIX-style “do one thing well”.

                                    Thanks for posting this. I’ve been thinking about it on and off since you mentioned it, and I finally came up with a pipeline of smaller tools that I prefer to a single binary that does everything backed by a single db. I guess this means I’m writing a feed server. :->

                                1. 7

                                  I’m reading Catherine the Great for fun and Emergency Care and Transportation of the Sick and Injured to prep for the upcoming EMT classes. I just finished The Storm Before the Storm, which was fantastic. I love everything Mike Duncan does.

                                  For classic science fiction, here are some of my favorites:

                                  • Dune, Frank Herbert. The first book deserved every ounce of praise it got. The second book is pretty good. I wasn’t a fan of the third book. The fourth is awful, and they just get worse from there.
                                  • Lord of Light, Roger Zelazny. One of my favorite SF books ever. Con artist pretending to be Buddha tries to free humanity from con artists pretending to be Hindu gods, in the process accidentally recreating most of the Buddhist myths.
                                  • Book of the New Sun, Gene Wolfe. A quintilogy that was one of the foundational works in the “Dying Earth” style. A professional torturer is exiled from the guild and wanders around a bunch and sort of becomes Jesus Christ. I like it as an example of how you can write a religious novel without it being overbearing or alienating to nonbelievers. Another good book like that is Small Gods by Terry Pratchett, but that’s more fantasy than SF.
                                  • Embassytown, China Mieville. Not a classic (came out in the past decade), but my favorite Mieville.
                                  1. 3

                                    I love Lord of Light; it’s one of the few books I have read more than once.

                                    1. 3

                                      EMT classes/exams aren’t too bad. Practicals/clinicals are nerve-wracking. Using EMR/foosoftware is stepping on Legos.

                                      Check this out if you have five minutes: emin5. Her YouTube channel is filled with useful videos.

                                    1. 2

                                      I’m currently reading “The Name of the Rose”, but going slow on it, and I also got hooked on reading the scp-wiki while commuting.

                                      1. 2

                                        You might consider giving this a look. I found it really enhanced my appreciation and understanding of the book. https://www.amazon.com/Key-Name-Rose-Translations-Non-English/dp/0472086219

                                        1. 2

                                          I guess it would be really nice to understand what all the Latin phrases mean.

                                      1. 1

                                        At work: same routine apps, no new or interesting projects in sight; expanding the internal framework when I have time.

                                        At home: looking at job postings; it would be nice to move to Europe next year. Even though I have almost ten years of experience, I’m lacking in a few techs the most interesting projects require, unit testing most prominently. I’ll try this week to start on side projects for learning and testing things; I was thinking of a finder of discounted games scraping bundle sites.