Threads for schmudde

  1. 3

    Company: Yorba

    Company site: https://yorba.co

    Position(s): Clojure Developer

    Location: Fully Remote

    Description: Yorba offers simple, secure identity management. The service aggregates a person’s online presence and provides insights, guidance, and a set of organizing principles to manage personal information. Identity is actionable through open standards, available APIs, and partner interoperation.

    We’re committed to building a more ethical internet. These aren’t just words. Yorba is a Public Benefit Corporation, which means that we legally cannot prioritize profits at the expense of our core values, our members, or our promises. Furthermore, we’re building on open standards; our product must win on the merits of our user experience and the trust we build with our members. We believe that people are tired of the status quo and Yorba offers a way to make meaningful change.

    Tech stack: Clojure, Google Cloud, Node, Literate Programming

    Compensation: market rates, flexible hours, equity possible

    Contact: schmudde@yorba.co

    1. 5

      I’ve been using this workflow for a while and I absolutely love it. I tangle (generate) multiple files that have interdependent documentation. And don’t even get me started on configuration files. Generating all of it from one well-documented source is a real revelation.
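
      For the curious, a minimal sketch of the idea, assuming an Emacs Org-mode setup (the file names are illustrative): two config files tangled out of one documented source, so the explanation of why they agree lives in one place.

      #+begin_src conf :tangle server.conf
      port = 8080
      #+end_src

      #+begin_src conf :tangle client.conf
      server = localhost:8080
      #+end_src

      Running M-x org-babel-tangle writes both files from the single source document.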

      1. 3

        He kind of speculates on a “what could have been” but I’m actually not sure what differentiates “Groupware” from what we have today. Is it less Unix-y?

        The first time I encountered this idea was Microsoft Windows 3.11, aka “Windows For Workgroups”. If we take Microsoft at their word, it means that non-collaborative personal computing only really existed from 1977-1994 at the latest. Pretty short time in computing.

        AppleTalk already existed and people were networking home computers before 1994. If we count Xerox’s office solutions, the window shrinks even more.

        1. 2

          Let’s be clear on the premise of this argument: the code has a liberal license but the output of the code should not.

          I’m not sure this is a novel debate. Lingdong wants to restrict the code’s use in a commercial context. I’ve talked to other devs who longed to limit their code’s use in the context of state violence.

          As the article points out, there seems to be no solution here from the Open Source Initiative. But the OSI’s position is almost certainly the only rational one. Property right restrictions will continue to seem absurd absent a functioning digital commons.

          1. 1

            I agree that as long as FLOSS licenses are basically a clever hack on copyright, you can’t realistically get rid of Freedom Zero. But what this situation points to is what I think is a growing distaste for the fundamentally exploitative nature of some open source: a creator releases something to the world, and it is immediately harnessed to produce profit.

            The emergence of NFT enthusiasts[1] has merely closed the loop from months to days. Generative art is catnip to these people, as it enables them to produce any number of unique items to “mint” without having to cut any profits to artists. The fact that they quickly backpedalled in the face of massive criticism is a testament to the power of moral suasion, and to the fact that the thin fig-leaf covering the naked greed of NFT purveyors is the fiction that they’re helping artists. One does not kill the golden goose.

            [1] scientific name: nftardius parasiticus

            1. 1

              The emergence of NFT enthusiasts[1] has merely closed the loop from months to days.

              It’s an interesting idea that NFTs are making this exploitation more immediate. But it seems that the sort of contracts that could fix this could only come in the form of code.

              For example, many NFT contracts provide a mechanism to pay artists upon resale. This has a rich history throughout the 20th century, but it was really hard to make it ubiquitous and permanent. And now here we are.

              But the only way to prevent cut ’n paste jobs is to distribute a binary or couch it in a more complex (and hidden) system. This seems like its own nightmare.

          1. 6

            I truly hate the fact that nil stands for “the empty list”, “false”, “undefined result” and “error” in Common Lisp. It’s the worst.

            1. 5

              It really works out well in my experience. Also, as alexandria points out, errors in Lisp are typically represented with conditions, not NILs.
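
              For example, a minimal sketch using only the standard language (parse-integer is required to signal a condition of type parse-error on junk input):

              (handler-case (parse-integer "abc")
                (parse-error () :not-a-number))
              ;; => :NOT-A-NUMBER
              ;; The failure is a condition you handle explicitly,
              ;; not a NIL you have to remember to check for.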

              FWIW, I truly hate that Scheme separates #f, NIL and ’().

              1. 4

                I truly hate that Scheme separates #f, NIL and ’()

                I don’t like it either. But:

                1. Scheme has no NIL. (I mean, you could have a symbol named ‘nil’, but it doesn’t mean anything in particular.)

                2. The bigger wart (imo) is that () is not self-evaluating.

                1. 1

                  In my Scheme programs I always denote the empty list as (list).

                2. 2

                  If you look at this from a non-Lisp perspective, it makes a lot of sense that an empty list is not the same thing as a boolean false value, just like an empty string is not the same thing as zero.

                  nil being the empty list and the universal falsy value is entirely a Lisp thing.
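
                  A minimal sketch of the difference, evaluating the same form in each dialect:

                  ;; Common Lisp: NIL, the empty list, and false are one value
                  (if '() 'yes 'no)   ; => NO

                  ;; Scheme: only #f is false; the empty list is a true value
                  (if '() 'yes 'no)   ; => yes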

                  1. 2

                    Not even a Lisp thing but a MacLisp thing (or maybe even a Lisp 1.5 thing?) which got inherited by MacLisp descendants Common Lisp, InterLisp, Emacs Lisp, etc.

                  2. 1

                    (cadr nil) is an error by almost any conceivable standard[1] but it results in nil.

                    I admit that in practice it typically doesn’t cause trouble, but it drives me crazy that this is somehow an allowed thing you can say in Common Lisp:

                    (caddr (= 3 10))


                    [1] Ok, I’m being a little salty here.

                    I know it’s a silly thing to get caught up on, but it just feels wrong to me.

                    1. 2

                      Ok, but False is NIL, because (= 3 10) is NIL. The empty list is also NIL. Taking the CAR of the CDR of the CDR of an empty list returns an empty list: CDR returns an empty list (NIL) when given an empty list (because the CDR of an empty list is empty), and CAR gives NIL when given an empty list (because the CAR of an empty list is NIL). It’s just a byproduct of the chain of evaluation. None of these are error conditions in their own right, so the result is not an error condition either.

                      I know you understand this and still feel that it should be an error (which I can understand, except for the fact that it would make the LISP interpreter more complex to implement); I wrote this for the people who are reading this discussion.
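
                      A quick sketch of that chain at the REPL, step by step:

                      (= 3 10)           ; => NIL (false)
                      (cdr nil)          ; => NIL (the CDR of the empty list is the empty list)
                      (cdr (cdr nil))    ; => NIL
                      (car nil)          ; => NIL (the CAR of the empty list is NIL)
                      (caddr (= 3 10))   ; => NIL, and no step ever signals an error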

                      1. 3

                        it would make the LISP interpreter more complex to implement

                        (disassemble (lambda (x) (car x)))
                        ; disassembly for (LAMBDA (X))
                        ; Size: 31 bytes. Origin: #x53571660                          ; (LAMBDA (X))
                        ; 60:       498B5D10         MOV RBX, [R13+16]                ; thread.binding-stack-pointer
                        ; 64:       48895DF8         MOV [RBP-8], RBX
                        ; 68:       8D50F9           LEA EDX, [RAX-7]
                        ; 6B:       F6C20F           TEST DL, 15
                        ; 6E:       7403             JEQ L0
                        ; 70:       CC49             INT3 73                          ; OBJECT-NOT-LIST-ERROR
                        ; 72:       00               BYTE #X00                        ; RAX
                        ; 73: L0:   488B50F9         MOV RDX, [RAX-7]
                        ; 77:       488BE5           MOV RSP, RBP
                        ; 7A:       F8               CLC
                        ; 7B:       5D               POP RBP
                        ; 7C:       C3               RET
                        ; 7D:       CC10             INT3 16                          ; Invalid argument count trap
                        
                        1. Compiler, not interpreter

                        2. It already requires a type check; it would complicate nothing to make (car nil) an error

                    1. 1

                      Yes, I programmed in CL for a few years at a startup. Quite familiar with the language. See my response to rau.

                      1. 2

                        I did, and I disagree that that represents an error. It’s correct behaviour of the system. If False was represented by a separate value, or Lisp was more strict with types, then sure it would be an error. But since any false expression returns an empty list, and since the interpreter is not literally doing pointer lookups, it is not in any way an error.

                        1. 2

                          The issue for me is that there are things you can express as valid programs which run, and even produce values, but which don’t really make sense at a type level. I know it’s impossible, for mathematical reasons, for a programming language to prevent you from denoting nonsense statically (and I’m not even really sold on the idea that such static guarantees are that useful or ergonomic), but I do like it when a language at least stops when something silly has happened.

                          If one sees the expression (cadr x), it’s pretty reasonable to expect that x denotes a list. It also seems reasonable to expect that x is a non-empty list. But CL doesn’t guarantee that either of those things is true. It doesn’t necessarily bother me too much that cadr returns a value instead of throwing an error when it receives a list which doesn’t have a cadr, but nil seems to me to be the wrong value. Why?

                          Because:

                          (cadr nil)
                          (cadr (list 'a nil))
                          

                          Return the same value. In other words, if you have a list with empty sublists (which doesn’t even seem like a particularly unusual situation), then you can’t count on cadr to tell you about the cadr of the list! It would be better if cadr were called maybe-cadr and returned a user-provided sentinel value (default nil, perhaps) in the event of a list without a cadr; a sketch follows below.
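
                          A minimal sketch of that hypothetical helper (maybe-cadr is a name I’m making up, not standard CL):

                          (defun maybe-cadr (list &optional sentinel)
                            "Return the second element of LIST, or SENTINEL if there is none."
                            (if (and (consp list) (consp (cdr list)))
                                (cadr list)
                                sentinel))

                          (maybe-cadr '(a nil))      ; => NIL, but only because the cadr really is NIL
                          (maybe-cadr nil 'no-cadr)  ; => NO-CADR, distinguishable from a NIL element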

                          All this goes back to the idea that weird circumstances should halt programs as quickly as possible. Given that many of these list-destructuring functions fold nil, it’s possible (and I have even encountered situations where this happens) that an unusual condition doesn’t raise a real error until long after the offending list is off the stack and beyond the debugger.

                    2. 1

                      nil punning ftw… until it’s a null pointer error.

                    1. 23

                      My practice routine used to consist of idling in ##c on freenode, looking at questions people were having with regard to C, and solving any code problems they came up with. I usually made sure that if I sent someone code, it was either instructive (in the case of outright rewriting what they wrote) or didn’t completely solve the problem (in the case of questions on how to do something). This meant I could solve problems I would not normally spend my time solving, keep my understanding of C very sharp, and provide a high quality of help to people asking in the channel.

                      1. 12

                        This is both brilliant and obvious. Obvious in the sense that helping others helps yourself; it’s a tried and tested method. Brilliant in the sense that skill sharing is not something ingrained in the culture.

                        Don’t get me wrong - there are a lot of places to get help on the internet and that’s a great thing. But you’ll know it’s part of the culture when “productivity” is measured - in part - by the amount that you help other people.

                        1. 2

                          Ex #linux (IRCNet) and #csharp (Freenode) hanger-outer here, learning in the same way. Free time? See an interesting question? Try to solve it. If it seems like it’ll help, post it. The original requestor or others provide more info and you end up with a full picture. A fantastic way to learn.

                          1. 2

                            Did you do this on your own time or as part of your job? For the discussion on industry culture, that would make a big difference.

                            1. 2

                              I did this entirely on my own time.

                          1. 3

                            However, iOS was never a tool to be used as one wishes, but rather in the limited ways Apple foresaw; iCloud never reached its potential of replacing Google Mail or Google Drive. Never open, never as powerful, no API.

                            It’s kind of impressive how bad Apple is at the ‘i’ part of the iMac, iPhone, etc….

                            But if Apple had their way, the internet would look more like a minimalist AOL and less like the organic, human-made, wonderful mess that it currently embodies.

                            1. 2

                              Go is so far away from Smalltalk. And … wow … our community has even forgotten the Pascal vs. C wars.

                              1. 3

                                our community has even forgotten the Pascal vs. C wars.

                                We really need comp sci students to be taught the ‘modern’ history of the industry.

                                1. 2

                                  Go is so far away from Smalltalk

                                  Yeah a friend even commented that he mentions the superiority of interactive programming (Smalltalk, Interlisp) at the beginning and then ends up at Go. Not the most interactive contemporary choice.

                                1. 2

                                  The oldest thing I know of offhand is a site I made in ’97 and kept updating for a couple of years before abandoning it, which you can see in the text. Anyway, here you go, a time when webrings and frames were still ok: still on tripod!

                                  1. 1

                                    I love the design on the image selection.

                                  1. 2

                                    At this point it’s probably my YouTube channel. I used to have youtube.com/shadowh511, but in terms of account age that’s one of the oldest web presences that I still have. That YouTube channel was made back a whole gender ago when I was in middle school and has always just avoided being a part of the partner program. I sometimes wonder what would have happened if I had focused on it as a main career prospect.

                                    1. 1

                                      The most awesome thing ever from 2008 is still pretty awesome. https://www.youtube.com/watch?v=lFPlcrAdP34

                                    1. 2

                                      My 18-year-old personal web site.

                                      It used to be hosted by an old dial-up ISP that I worked at in the ’90s, which hadn’t shut down their site from ’98. But I linked to it in my newsletter once, and the influx of traffic must have been odd, because they finally killed the spot.

                                      1. 1

                                        Boooooo. Did Internet Archive manage to capture it before it went down?

                                        1. 1

                                          I’m certain it did, but it’s down right now. Now I feel like I need to capture that history somehow.

                                      1. 13

                                        Nuclear take: I think it’s interesting that so many “computer engineering/enthusiast” types (for lack of a better term) tended to gravitate towards DEC systems when their designs are full of bonkers mistakes no EE should repeat: the PDP-10’s recursive indirect addressing, the PDP-11’s segmentation and PC in memory (ok, DSPs do this, but that’s an acceptable optimization for a DSP, not a general-purpose CPU), the absurd CISCiness of the VAX, etc. (Alpha was pretty reasonable.) I say this as someone who likes VMS.

                                        I think the 360/370 is much better designed, and its influence on modern CPU design is more obvious (lots of GPRs, clean instruction formats, pipelining, virtualization, etc.). Plus they had the also-influential ACS/Stretch to draw from. I can’t say the same for many DEC designs. It’s amusing that Unix types are so obsessed with the VAX when Unix would feel far more at home on the 370.

                                        1. 5

                                          I suspect a variety of factors are to blame:

                                          IBM in the ’70s and ’80s had the reputation that Microsoft had in the ’90s and 2000s and that Google, Amazon, and Facebook are competing for now: the evil-empire monopolist that the rest of the industry stands against. There’s a story around the founding of Sun that they got a visit a few months in from the IBM legal department inviting them to sign a patent cross-licensing agreement and showing six patents that Sun might be infringing. Scott McNealy sat them down and demonstrated prior art for some and that Sun wasn’t infringing any of the ones that might be valid. The IBM lawyers weren’t fazed by this and said ‘you might not be infringing these, would you like us to find some that you are?’ Sun signed the cross-licensing agreement. This kind of thing is why IBM’s legal department was referred to as the Nazgul. To add to this, IBM was famously business-facing. They required programmers to wear suits and ties. The hacker ‘uniform’ of jeans and t-shirts was a push-back against companies like IBM, and hacker culture in general was part of a counter-culture rebellion in which IBM was the archetype of the mainstream being rebelled against.

                                          The DEC machines were so closely linked to the development of UNIX. IBM’s biggest contribution with the 360 was the idea that software written for one computer could run on another. This meant that their customers were able to build up a large amount of legacy software by the ’80s so IBM had no incentive to encourage people to write new systems software for their machines: quite the reverse, they wanted you locked in. DEC encouraged this kind of experimentation. Universities may have had an IBM mainframe for the admin department to use but the computer science departments and research groups bought DEC (and other small-vendor) machines to tinker with.

                                          Multics was developed for the GE 645, which had all manner of interesting features (including a segmentation model that allowed a single-level store and no distinction between shared libraries and processes); Unics was written for the tiny PDP in the corner and it grew with that line.

                                          There were a lot of other big-iron systems that suffered from the rise of UNIX. I’m particularly sad about the Burroughs Large Systems architecture. The B5000 was released at almost the same time as the 360 and had an OS written in a high-level language (Algol-60), with hardware-assisted garbage collection, and provided a fully memory-safe (and mostly type-safe) environment with hardware enforcement. Most modern language VMs (the JVM, the CLR, and so on) are attempts to emulate something close to the B5000 on a computer that exposes an abstract machine that is basically a virtualised PDP-11. I wish CPU vendors would get the hint: if the first thing people do when they get your CPU is use it to emulate one with a completely different abstract machine, you’ve done something wrong.

                                          Oh, and before you criticise the VAX for being too CISCy (and, yes, evaluate-polynomial probably doesn’t need to be a single instruction), remember that the descendants of the 360 have instructions for converting strings between EBCDIC and Unicode.

                                          1. 2

                                            I think you exaggerate about IBM. There is a general 1:1 table-based translate which can do EBCDIC to ASCII or Unicode, and there are different instructions for converting between the different Unicode flavours. It can’t do it in one instruction, as far as I know.

                                            But anyway, those and VAX POLY aren’t the problem. You can happily use microcode or just trap and emulate and no one will care.

                                            The problem with the VAX is that the extremely common ADDL3 instruction (to name just one) can vary in length from 4 to 19 bytes and cause half a dozen memory references / cache misses / page faults.

                                            x86, for all its ugliness, never uses more than one memory address per instruction for common instructions e.g. code generated from C. Same for S/360. Both have string instructions, but those are not a big deal, and relatively uncommon.

                                          2. 3

                                            That’s an interesting observation.

                                            I think there would be a lot to learn from comparing the two engineering cultures. I would specifically include the management style and the kind of money each company was dealing with. When IBM was developing ground-breaking products like the Stretch and the Selectric typewriters, half of the company’s income came from incredibly lucrative military contracts.

                                            The kinds of pressures on an engineering team, and the corners they may cut, are dramatically different when the team is awash with money.

                                            1. 3

                                              To elaborate, I feel DEC influenced product segments more than engineering. The PDP-8 and then the PDP-11 redefined minicomputers, but the PDP-8’s influence was short-lived and the PDP-11’s influence… would rather not have been felt (i.e., x86).

                                          1. 5

                                            I may need the 16 port FW800 hub at the end of the piece. Not sure why… but I may need it.

                                            1. 2

                                              For access rights, I would strongly suggest converting everything over to SAML. If you have GSuite, you already have a SAML IDP included in your purchase.

                                              For everything else, I would set up a shared spreadsheet with finance/accounts receivable. It’s also worth starting a shared drive between ops, legal, and finance where you keep all of the executed contracts. When you need them, it’s really important that they can be located quickly.

                                              1. 1

                                                Great advice - thanks!

                                              1. 2

                                                Interesting history. Based on a conversation with @mjn, it appears that Weizenbaum’s SLIP was the first programming language to use reference counting. Finally I get to see a SLIP program.

                                                More context: SLIP was a list processing language, created a few years after LISP. LISP used a tracing garbage collector (a first, published 1960), and SLIP used reference counting (seems to be the first programming language to use this technique, published 1963).
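
                                                For readers who want the contrast concrete, here is a toy sketch of reference counting in Lisp terms (nothing like SLIP’s actual machine-code implementation):

                                                (defstruct cell datum (refcount 0))

                                                (defun retain (c)
                                                  "Note a new reference to the cell C."
                                                  (incf (cell-refcount c))
                                                  c)

                                                (defun release (c)
                                                  "Drop a reference to C; reclaim it when the count hits zero."
                                                  (when (zerop (decf (cell-refcount c)))
                                                    ;; a real system would return C to a free list here
                                                    (setf (cell-datum c) nil)))

                                                A tracing collector instead walks everything reachable from the roots and frees the rest; reference counting does its bookkeeping at every pointer copy.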

                                                1. 1

                                                  Does the divergence in garbage collection methods have anything to do with SLIP being a Fortran extension? I have to imagine the garbage collector was still implemented in machine code.

                                                  1. 2

                                                    Here’s the abstract for Weizenbaum’s paper:

                                                    Symmetric List Processor, J. Weizenbaum, Communications of the ACM, 1963: “A list processing system in which each list cell contains both a forward and a backward link as well as a datum is described. This system is intended for imbedding in higher level languages capable of calling functions and subroutines coded in machine language. The presentation is in the form of FORTRAN programs depending on only a limited set of “primitive” machine language subroutines which are also defined. Finally, a set of field, particularly character, manipulation primitives are given to round out the system.”

                                                    Based on this, it sounds like SLIP is implemented in machine language. This would have been normal and expected in 1963. I don’t have access to the full paper. Also note that in the ELIZAGEN article, this version of SLIP is embedded in MAD, not FORTRAN.

                                                1. 3

                                                  I stopped keeping paper notebooks when I realized I never referenced them again. I recently switched to an e-ink device that recognizes handwriting. Now I have an append-only daily journal that can also be searched via grep.

                                                  As much as I like the idea of a paper notebook, I just have to accept it’s not how I work.

                                                  1. 1

                                                    Super happy with System76 laptops. I’m interested in checking this one out.

                                                    1. 5

                                                      I am working on my Comparison Chart of Ada, C++ and Rust. The Rust segment is pretty empty; I need to add generics and concurrency types for Ada; the Ada Sum Type example is incomplete (it should include a discriminated record definition); I need to fix some formatting issues; and there are some other typos and such.

                                                      1. 1

                                                        It has been a long time since I’ve done anything in Ada. Can you name your top 2 or 3 things you feel that Ada still provides where Rust falls short?

                                                        As an aside, I’m forever in favor of := for assignment.

                                                        1. 3

                                                          If you haven’t tried Ada 2012, I’d recommend you try it; it’s a monstrous step up from Ada 2005.

                                                          Rust feels like a Haskell version of safer C++. Ada feels like a Pascal version of a simpler and safer C++.

                                                          1. Ada has much less symbology and no specialized syntax for “classes”, “traits”, or namespaces.

                                                          The language feels very flat and conceptually uniform. You just get types and subprograms (functions/procedures), and you group your library code into syntactic units called packages. This means no class functions, static functions, free functions, or associated types; related things are just in the same package, and function overloading works with the type system and constraints on types to determine what gets called.

                                                          OOP is there if you want it, but multiple concrete parents are forbidden and “member functions” (methods) just look like normal subprograms with the first set of parameters being like “self”, with virtual functions using a “class-wide” parameter.

                                                          Generics can operate at the package (module) level and can have requirements (e.g. “provide a function, which I’ll call foo, that takes two of generic type X”, “give me an integer type”), so ML-style signatures just naturally arise out of how generics interact with how packages normally work. Generics also require explicit instantiation under a new name, so there’s no .template or ::<> turbofish, and the costs of monomorphization are explicit (see the sketch at the end of this comment).

                                                          2. Rust focuses just on memory safety, whereas Ada focuses on correctness as a whole.

                                                          (I’m not saying Rust isn’t about correctness, or that it’s hard to write correct code in Rust; it’s just that Rust doesn’t have some of these checks.) Built-in preconditions/postconditions; user-defined checks which are automatically injected into code where types are used (type invariants and subtype predicates); ranges on values; access types (similar to pointers) which are themselves typed, do some lifetime checks, have their own run-time checks for validity, and allocate from their own storage pools; and a more direct way of creating semantic versions of the same type (e.g. “don’t let me assign meters to joules”), on top of not allowing implicit float <-> int conversions. There are also compiler pragmas for large-scale behavior, like “ensure this package is entirely stateless.” With all of these controls I find Ada programs hard to get wrong.

                                                          Not everything is peachy; Ada is super opinionated about how it wants you to write code. Your code will end up much more verbose if you’re fighting the language.
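
                                                          A minimal sketch of the generics point above (the names are mine, not from any real library):

                                                          generic
                                                             type Element is private;
                                                             with function To_String (E : Element) return String;
                                                          package Logger is
                                                             procedure Log (E : Element);
                                                          end Logger;

                                                          with Ada.Text_IO;
                                                          package body Logger is
                                                             procedure Log (E : Element) is
                                                             begin
                                                                Ada.Text_IO.Put_Line (To_String (E));
                                                             end Log;
                                                          end Logger;

                                                          --  Explicit instantiation gives the monomorphized unit its own name:
                                                          function Int_Image (N : Integer) return String is (Integer'Image (N));
                                                          package Int_Logger is new Logger (Element => Integer, To_String => Int_Image);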

                                                          1. 1

                                                            Thanks for the detailed reply. That last comment really caught my eye - “ensure this package is entirely stateless.” I am a little curious about how Ada provides all these options without being overwhelming or forcing too much ceremony. I’ll have to take a look at some code.

                                                            1. 2

                                                              Ada is a really deep language, but it’s layered very flatly, in a way that lets you avoid many features unless and until you want to use them.

                                                              Aspects make this stuff built-in and compiler-checked, though many are still done (especially in older library code) with one-line pragmas. They can be applied to packages and are also used for attributes of functions like “inline” and “preconditions/post-conditions”, for marking types as indexable (think operator[] in C++), and for providing implicit dereferencing (think operator* or operator-> in C++). This reduces specialized syntax.

                                                              Packages contain code, and Pure is an aspect (it originated as a pragma); you can attach aspects to packages in Ada 2012. Since Ada uses real dependency tracking rather than #include with a preprocessor, verifying that a package doesn’t use another impure package and is itself stateless just looks like:

                                                              package P
                                                                  with Pure
                                                              is
                                                              
                                                              end P;
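
                                                              And for comparison, a small sketch of an aspect on a single subprogram (a hypothetical function, not from any real library):

                                                              function Div (A, B : Integer) return Integer
                                                                 with Pre => B /= 0;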
                                                              
                                                      1. 1

                                                        The author covers Glitch and repl.it - so I suppose the definition of IDE is stretched enough for me to add two more candidates: Binder and Nextjournal. The notebook format is too often overlooked. And something like Nextjournal is pretty fast and flexible. These have many of the benefits of repl.it environments, but with a literate reading experience.