1. 11
  1. 4

    This looks like a great introduction to classical relevant logic using Boolean algebra.

    1. 1

      I’m confused about the last paragraph. Why is it not possible to derive any proposition X from a contradiction such as P and ~P?

      1. 3

        This particular fragment of logic is “relevant” (it is a relevant logic): the only formulas that can be introduced are those which are somehow related to the existing premises.

        To see the issue clearly, suppose I said something like, “If 2+2=5, then pigs can fly.” This is a material implication whose premise is false, so classically it is true. Or, as Wikipedia suggests, I might say “If I am a pig, then 2+2=4.” This is also true. But pigs are not relevant to arithmetic, so both of these utterances are a sort of nonsense which should be forbidden. We can forbid them by requiring any new proposition X to be related to existing premises like P and ~P. A contradiction can only be used to refute or prove facts which are related to the premises, so there is no principle of explosion.
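
        For concreteness, here is the usual classical route from P and ~P to an arbitrary proposition X, written informally. A relevant logic blocks the last step (disjunctive syllogism), since the X it brings in need not be related to P:

        ```
        1. P          (premise)
        2. ~P         (premise)
        3. P or X     (from 1, by or-introduction)
        4. X          (from 2 and 3, by disjunctive syllogism)
        ```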

      2. 1

        Now do linear logic

        1. 6

          Classical linear logic is not much different. We’ll skip the “exponential modalities”; this means that we must use every one of our premises exactly once. The axioms are different, too, because linear logic has two different kinds each of conjunction and disjunction (four connectives total). The article’s trick for changing implication into disjunction would still work, but it would always give one kind of disjunction, called the “multiplicative” disjunction. (The other kind is called “additive”.)
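
          In the usual symbols (assumed here; the article may write them differently), that trick has the same shape as the classical one, with linear negation and “par” standing in for negation and ordinary disjunction:

          ```
          A → B  ≡  ¬A ∨ B         (classical logic)
          A ⊸ B  ≡  A^⊥ ⅋ B        (classical linear logic)
          ```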

          We won’t have Boolean algebra, but C*-algebra. It’s no longer about what is true. I don’t know if you’re familiar with the vending-machine interpretation of linear logic, but it might make it easier to understand exactly what is being computed. A premise is not a formula for a hypothetical truth, but a recipe for a hypothetical physical resource which cannot be copied or dropped: a molecule, a can of soda, a shipping container, a region of memory, ingredients in a cooking recipe, etc. The connectives could be interpreted as follows:

          • Additive conjunction (“with”): you have two recipes for two resources, and just enough ingredients to use exactly one recipe; you choose which one to use
          • Multiplicative conjunction (“and”): you have two resources and must use both of them
          • Additive disjunction (“plus”): you have either of two resources, and must have two recipes (one for each case) before you can try either case; you don’t choose which one to use
          • Multiplicative disjunction (“par”): you have either of two resources, and you must prove that one case is impossible before you can obtain the resource directly from the other case

          That last one is weird, but it’s connected to implication: having a resource is like the negation of needing a resource. So an alternative way to look at multiplicative disjunction is that you have a recipe which needs one resource and, when used, yields another; in other words, a recipe which transforms one resource into another. This is “linear implication”.
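
          If it helps to see that in code, here is a minimal sketch using GHC’s LinearTypes extension (GHC 9.0 or later). The resource names are made up, and only the connectives with a reasonably direct counterpart are shown:

          ```haskell
          {-# LANGUAGE LinearTypes #-}
          module Vending where

          -- Made-up resources for the vending-machine reading.
          data Coin  = Coin
          data Soda  = Soda
          data Chips = Chips
          data Snack = Drink Soda | Food Chips

          -- Linear implication (a recipe): "%1 ->" is GHC's linear arrow, so the
          -- coin is consumed exactly once; it can be neither copied nor dropped.
          vendSoda :: Coin %1 -> Soda
          vendSoda Coin = Soda

          vendChips :: Coin %1 -> Chips
          vendChips Coin = Chips

          -- Multiplicative conjunction ("and"): you hold both coins and must spend both.
          vendBoth :: Coin %1 -> Coin %1 -> (Soda, Chips)
          vendBoth c1 c2 = (vendSoda c1, vendChips c2)

          -- Additive disjunction ("plus"): you hold one of two resources and must be
          -- prepared to handle either constructor before you can proceed.
          accept :: Either Soda Chips %1 -> Snack
          accept (Left  s) = Drink s
          accept (Right c) = Food  c

          -- Additive conjunction ("with") shows up implicitly: both vendSoda and
          -- vendChips exist, but a single coin can be spent on only one of them.
          -- Multiplicative disjunction ("par") has no equally direct counterpart
          -- here; its closest reading is the linear arrow itself, as described above.
          ```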

          1. 2

            The disjunctive cases remind me a bit of algebraic datatypes. Additive disjunction corresponds to destructuring a datatype by handling all of its possible constructors; multiplicative disjunction corresponds to pulling the parameters out of a constructor by proving that it is the constructor for the value you actually have.
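
            A tiny (non-linear) Haskell illustration of that second point, with Void as the payload of the ruled-out constructor and a made-up extract helper:

            ```haskell
            import Data.Void (Void, absurd)

            -- The Left case is impossible (Void has no values), so we can always
            -- pull the payload out of Right.
            extract :: Either Void a -> a
            extract (Left  impossible) = absurd impossible
            extract (Right a)          = a
            ```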