1. 5
    1. 6

Available without the JSTOR paywall here, through Cambridge Press

    2. 2

      Some further sources can be found at the Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/entries/goedel-incompleteness/#GdeArgAgaMec

      1. 1

        Thanks. The SEP analysis of why the argument fails is much better than the one I was going to put forth.

    3. 1

      This is a nonsensical argument

      1. 1


        1. 1

Well, Gödel’s theorem has basically nothing to say about the difference between brains and computers. It’s just a statement about formal systems. Neither brains nor computers nor mechanical objects in general are formal systems.

          1. 1

            But computational systems are formal systems. The essay is making the case against Mechanism: that brains are purely computational systems. I think the author’s argument is weird, but sound; a formal system should have certain properties and limitations, and the brain seems to neither have those properties nor be bounded by those limitations. In particular, we are conscious of inconsistencies.

            I think it’s a strange essay to read because most of us think that Mechanism isn’t true. I also think that trying to disprove Mechanism via Gödel is unusual. But, I think it’s clever, and I think the essay deserved a better rebuttal than “this is a nonsensical argument”.

            1. 1

Regarding “Mechanism isn’t true”: I think it’s most likely that mechanism is true. I have some formal training as a neuroscientist (PhD), and I think most neuroscientists think mechanism is true.

    4. 1

Ah, the same old argument. The first sentence already gives it away: where is the argument showing that Gödel’s theorem doesn’t apply to brains?

      1. 1

        The purpose of this paper is to make this exact argument. Lucas takes a few avenues, but the core one is this:

However complicated a machine we construct, it will, if it is a machine, correspond to a formal system, which in turn will be liable to the Gödel procedure for finding a formula unprovable-in-that-system. This formula the machine will be unable to produce as being true, although a mind can see that it is true. And so the machine will still not be an adequate model of the mind.
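For readers unfamiliar with the "Gödel procedure" invoked here, it refers to the first incompleteness theorem. A rough informal statement (my paraphrase, not Lucas's wording) is:

```latex
% Gödel's first incompleteness theorem (informal statement):
% for any consistent, effectively axiomatized formal system F
% strong enough to express elementary arithmetic, there is a
% sentence G_F (the "Gödel sentence" of F) such that
\[
  F \nvdash G_F
  \quad\text{and}\quad
  F \nvdash \lnot G_F,
\]
% yet, on the assumption that F is consistent, G_F is true in
% the standard model of arithmetic -- which is what Lucas means
% by "a mind can see that it is true".
```

Lucas's argument leans on that last step: the claim that we, unlike the machine, can recognize the truth of the machine's Gödel sentence.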


Gödel’s theorem applies to deductive systems, and human beings are not confined to making only deductive inferences. Gödel’s theorem applies only to consistent systems, and one may have doubts about how far it is permissible to assume that human beings are consistent. Gödel’s theorem applies only to formal systems, and there is no a priori bound to human ingenuity which rules out the possibility of our contriving some replica of humanity which was not representable by a formal system.

        I don’t personally accept these arguments, but it’s not really right to imply that Lucas doesn’t make them in this paper.

        1. 1

Thank you for the clarifications. I did make a blanket statement, and I don’t agree with these arguments, but at least they’re made in the paper.