1. 33
  1.  

  2. 6

    I’m not sure when I would use this but I love it when people use languages in ways the authors hadn’t intended.

    1. 16

      Regardless of the practicality of this approach, this exercise found many bugs in the new Go generics implementation (most of which are now fixed) and raised several questions about their semantics.

      1. 2

        … this exercise found many bugs in the new Go generics implementation (most of which are now fixed) and raised several questions about their semantics.

        Can you provide a link to the related discussion?

      2. 4

        “Here is an extremely convoluted way to make C kinda look sorta like C++ but without the additional type safety.”

        “What’s the advantage?”

        “It’s not C++.”

        “SOLD!”

      3. 4

        Direct link to code (with commentary): https://play.golang.org/p/83fLiHDTSdY (and on the Go2go Playground: https://go2goplay.golang.org/p/83fLiHDTSdY)

        I think I see what he’s doing, building up an interpreter DSL, but man, I have no idea what most of his commentary means. For example:

        I exploit the duality between universal and existential quantification and encode sum types using the existential dual of the Boehm-Berarducci isomorphism.

        1. 11

          Well, it’s pretty simple, really. Types are all about composition, so the question is how to compose what we have (language primitives) into what we want. We can guess and try things out, but if that doesn’t work we can use math to derive the result we seek, because these are all simple, settled questions in math, and thanks to the Curry–Howard correspondence we should be able to implement them in our language.

          Alonzo Church showed us how to encode anything into (untyped) lambda calculus, and indeed, we can encode variants using Church encoding like so:

          a+b ≡ λa b f . f a b
          fst ≡ λt . t (λa b . a)
          snd ≡ λt . t (λa b . b)
          

          a+b takes three arguments, but we only supply two when we create it. Then a+b is a pending computation, and fst and snd extract the first or second component by supplying the continuation that tells the sum type what to do!
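
          Spelled out with two concrete values u and v (names picked just for this reduction):

          (a+b) u v → λf . f u v
          fst (λf . f u v) → (λf . f u v) (λa b . a) → (λa b . a) u v → u

          and snd extracts v the same way, by supplying the other continuation.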

          Unfortunately, this is not quite what we want: untyped lambda calculus is, well, untyped, and the whole point of the exercise is to stay typed, so this encoding won’t cut it. But that’s OK. It captures the procedural aspect of the idea; it tells us what to do with the data, just not with the types. Now we have to find a typeful encoding.

          Well, it’s pretty easy if we use math. A type a can always be written as ∀z . (a→z) → z. We are interested in a+b, so we plug in a+b, and since a function out of a sum is the same as a pair of functions, one per case, a bit of trivial algebra gives:

          ∀z . (a+b →z) →z = ∀z . ((a→z) * (b→z)) →z
          

          This looks suspiciously similar to the Church encoding; we can see the two continuations. However, this is a type equation. It doesn’t tell us what to do with the values, it’s just a relation between types. But we know what to do with the values from the CPS interpretation of the Church encoding, so now we can implement this in any language that has parametric polymorphism. This is the road to the Boehm-Berarducci encoding, which expands on this idea (except that it goes a step further and encodes it as ∀z . (a→z) → (b→z) → z).

          We can implement this in Haskell.

          {-# LANGUAGE ExistentialQuantification #-}
          {-# LANGUAGE RankNTypes #-}
          
          import Text.Printf
          
          -- A pair of handlers, one per case.
          data Variants a b z = Variants (a->z) (b->z)
          -- A sum is a value that, given handlers for every case, picks one.
          type Sum a b = (forall z. Variants a b z -> z)
          
          plus :: Int -> Sum Int Int
          plus x (Variants f _) = f x   -- first case: feed x to the first handler
          minus :: Int -> Sum Int Int
          minus x (Variants _ f) = f x  -- second case: feed x to the second handler
          
          neg :: Sum Int Int -> Sum Int Int
          neg v = v (Variants (\x -> minus x) (\x -> plus x))  -- swap the cases
          
          foo = plus 42
          bar = minus 16
          baz = neg foo
          
          example :: Sum Int Int -> String
          example v = v (Variants
              (\x -> printf "got a plus %d" $ x)
              (\x -> printf "got a minus %d" $ x))
          
          main = do
              putStrLn (example foo)
              putStrLn (example bar)
              putStrLn (example baz)
          

          Prints:

          : z800:aram; a
          got a plus 42
          got a minus 16
          got a minus 42
          

          However, we can’t implement this in Go. Until very recently, Go didn’t have parametric polymorphism at all, and even now that it does, notice that the forall in the Haskell version sits inside the type: we need rank-2 polymorphism, which Go doesn’t have. But the fact that we had to enable ExistentialQuantification in Haskell should give us a hint.

          Go interfaces are existential types. In most (all?) languages based on System F/HM, existential types are encoded as rank-n universals, but Go is unusual: it has always had existential types, even before it had parametric polymorphism.
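
          For instance (a throwaway illustration with made-up names, nothing to do with the sum encoding yet), an interface value only tells you that some concrete type with the right methods exists behind it:

          package main

          import "fmt"

          // An interface value promises "there exists some concrete type with
          // these methods" without saying which one; the hidden concrete type
          // plays the role of the existentially quantified variable.
          type Shower interface {
              Show() string
          }

          type plus struct{ x int }
          type minus struct{ x int }

          func (p plus) Show() string  { return fmt.Sprintf("got a plus %d", p.x) }
          func (m minus) Show() string { return fmt.Sprintf("got a minus %d", m.x) }

          // Callers of pick only learn that *some* Shower comes back; the
          // concrete type stays hidden behind the interface.
          func pick(b bool) Shower {
              if b {
                  return plus{42}
              }
              return minus{16}
          }

          func main() {
              fmt.Println(pick(true).Show())  // got a plus 42
              fmt.Println(pick(false).Show()) // got a minus 16
          }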

          But universal quantification in logic is dual to existential quantification! We can always switch back and forth between the two (at least in classical logic, but because of intentional limitations of the Go type system, the excluded middle axiom should always hold, so we should be ok here).

          So we can just translate any statement using universal quantifiers into its dual that uses only existential quantifiers, basically through double negation.

          ∀x P(x) ≡ ¬∃x ¬P(x)
          

          So we just do that, and we get what’s implemented in my program!
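
          To give a flavor of what that buys us in Go (a rough sketch with names made up on the spot, not my actual code; the real thing, with commentary, is in the playground link above): Go methods can’t take their own type parameters, so the answer type z can’t be universally quantified at the match site, but it can be hidden existentially inside the consumer, which is just an interface value:

          package main

          import "fmt"

          // The consumer of a sum: one handler per case. Whatever answer it
          // computes stays hidden behind its dynamic type (the z we can't
          // quantify universally).
          type Cases[A, B any] interface {
              Left(A)
              Right(B)
          }

          // A sum is anything that knows how to feed itself to such a consumer.
          type Sum[A, B any] interface {
              Match(Cases[A, B])
          }

          type left[A, B any] struct{ val A }
          type right[A, B any] struct{ val B }

          func (l left[A, B]) Match(c Cases[A, B])  { c.Left(l.val) }
          func (r right[A, B]) Match(c Cases[A, B]) { c.Right(r.val) }

          // One concrete consumer; its hidden answer type is string.
          type show struct{ out string }

          func (s *show) Left(x int)  { s.out = fmt.Sprintf("got a plus %d", x) }
          func (s *show) Right(x int) { s.out = fmt.Sprintf("got a minus %d", x) }

          func main() {
              for _, v := range []Sum[int, int]{left[int, int]{42}, right[int, int]{16}} {
                  s := new(show)
                  v.Match(s)
                  fmt.Println(s.out) // got a plus 42, then got a minus 16
              }
          }

          The answer type never appears in Sum or Cases; it lives only inside the concrete consumer, which is the existential side of the coin.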

          Notice that I make the above transformation in logic, not in Go. If I’d made it in Go, I would have to assume that the Curry–Howard isomorphism holds in the Go type system, which it may or may not, depending on the semantics of its generics and interfaces. I do not make that assumption. I do not assume we can use that axiom inside Go, but of course we can use it in the meta-theory.

          I explained all of this starting from the Church encoding, going through Boehm-Berarducci, etc., but that’s not how I discovered it at all. I was trying to formalize Go generics, and equations such as the ones above popped out as intermediate results. I immediately saw the potential of this, and, to make sure it works and that I understood it, I started from the existential representation and transformed it into the Boehm-Berarducci encoding (which I wasn’t even aware existed when I started this exercise).

          I only considered ADTs (not GADTs) in this post, but I hope this at least gives you some intuition.

          Hope this helps.

          1. 2

            I appreciate you taking time with this answer, and I did read it all, but unfortunately I just don’t have the math and computer science background to comprehend this. Back to school for me!

            1. 2

              This explanation would be more effective for me and other laypeople if you include a brief tutorial on the math notation you use.

              It’s easier to pick up the Haskell than it is to pick up the math notation.

              1. 2

                Alonzo Church showed us how to encode anything into (untyped) lambda calculus, and indeed, we can encode variants using Church encoding like so:

                a+b ≡ λa b f . f a b
                fst ≡ λt . t (λa b . a)
                snd ≡ λt . t (λa b . b)
                

                a+b takes three arguments, but we only supply two when we create it. Then a+b is a pending computation, and fst and snd extract the first or second component by supplying the continuation that tells the sum type what to do!

                Something is off here. That’s the Church encoding for products, not variants/sums.

                1. 1

                  maybe because of this equivalence?

                  ∀z . (a+b →z) →z = ∀z . ((a→z) * (b→z)) →z

                  1. 2

                    It’s true that destructing a sum amounts to providing a product of handlers, one for each case. But that’s not how OP explained it. OP said the provided Church encoding was for variants/sums, which it isn’t, and then didn’t seem to use it at all in the Haskell code, since the Variants constructor is already a product of exponentials and doesn’t need to be Church-encoded. I think it’s just a mistake. It doesn’t detract from the Go demonstration, which is really cool. But the Church encoding explanation is wrong, and I can see why people in these comments are confused.

                  2. 1

                    Thanks! That is what happens when you haven’t slept for 24 hours. It should have been:

                    c₁ ≡ λt f₁ f₂ . f₁ t
                    c₂ ≡ λt f₁ f₂ . f₂ t
                     m ≡ λt f₁ f₂ . t f₁ f₂
                    

                    e.g., for usage: swap ≡ λt . m t c₂ c₁
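
                    Working that through on a concrete value:

                    swap (c₁ x) → m (c₁ x) c₂ c₁ → (c₁ x) c₂ c₁ → c₂ x

                    so a value built with the first constructor comes back built with the second, just like neg in the Haskell code above.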

                    (I can’t edit the mistake in the original post.)

                2. 5

                  It’s like casting a spell. ‘I invoke the elemental magics of Yogg-Zonnroth, and cast the energies of Zum-ra into the ether!’

                  1. 1

                    Boehm-Berarducci encoding

                    Oleg Kiselyov has a good paper about this. Basically, you can translate all A(lgebraic)DT operations (construction and destruction, a.k.a. pattern matching) into lambdas and applications of lambdas, meaning you can embed ADTs in languages that do not have them.

                    1. 1

                      The code fails to run in both playgrounds… I’m guessing they are not using the right Go version?

                      1. 1

                        Yes, it only works in the latest version of the compiler.

                    2. 1

                      Only tangentially related, but am I to infer from this that generics have in fact been merged to tip, and will presumably be in the next major release? Looking forward to it.

                      1. 1

                        They are in heavy development and will most likely be available in the next version of Go, Go 1.18, as a preview (not finalized) feature, with finalization happening in Go 1.19.

                        You can try them today if you use tip, but they are in heavy flux.