1. 5

    It’s quite interesting that they can finally confirm it.

    However, aren’t all skills learned anyway?

    Let’s say you performed this quite troubling scientific experiment: you leave a toddler alone for 10 years, completely isolated, and nobody talks to him during all those years. It’s pretty obvious that he wouldn’t be able to speak at all.

    1. 9

      Good question.

      Chomsky’s famous “deep structures” idea (I hope I’m not mangling it too badly) stated that language, in particular, involved innate skills at some level of abstraction. So, no, there was not agreement among linguists that everything about language-use had to be learned.

      Though obviously any specific language has to be learned, the notion was that the structure of the human brain is such that certain basics are built in, which is why they are common to all languages. As a non-linguist, I think this would be things like the existence of nouns and verbs (but not of other parts of speech), and other general patterns in the structure of languages.

      Of course, the existence of those commonalities does not itself mean that there’s any innate knowledge of them. Linguistics has done better than many other sciences at looking at diverse cultural backgrounds rather than guessing about universal truth from an unintentionally narrow sample, but it’s still possible that the commonalities emerge from social factors. It’s also possible that they emerge directly from the problem domain - that they represent the best way to describe the world, if you happen to be a physical object.

      1. 9

        The poverty-of-the-stimulus debate is an interesting aspect of the argument. Chomskyans argue that young kids learn grammar too rapidly and from too few examples to support the view that language learning is blank-slate learning using some kind of general information-processing/induction capability. Instead, people in this camp think it must be a kind of parameter learning, where kids are not really learning the grammars per se, but have the grammar building blocks “built in” and are only learning parameters on them. This, the argument goes, explains why they can rapidly generate complex structures after “training” on only a small number of examples. (A toy sketch of the parameter idea is below.)
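        To make the parameter-setting idea concrete, here is a minimal toy sketch (my own construction, not from the article or any real linguistic model): the phrase structure is built in, and the “learner” only flips one binary switch, head-initial vs. head-final word order, from a single example.

        ```python
        # Toy "principles and parameters" sketch (illustrative only).
        # The grammar skeleton is fixed in advance; learning reduces to
        # setting one binary parameter from very few examples.

        def verb_phrase(verb, obj, head_initial):
            """Generate a verb phrase under the current parameter setting."""
            return f"{verb} {obj}" if head_initial else f"{obj} {verb}"

        def set_head_direction(examples):
            """Choose the parameter value consistent with observed phrases.

            One clear example suffices: the learner never induces the
            structure itself, it only picks between two prebuilt options.
            """
            for verb, obj, observed in examples:
                if observed == f"{verb} {obj}":
                    return True   # head-initial, English-like: "eat apples"
                if observed == f"{obj} {verb}":
                    return False  # head-final, Japanese-like: "apples eat"
            raise ValueError("no example matched either setting")

        # A single English-like datum fixes the parameter...
        head_initial = set_head_direction([("eat", "apples", "eat apples")])
        # ...after which novel phrases come for free.
        print(verb_phrase("read", "books", head_initial))  # -> read books
        ```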

        Counter-arguments are of various kinds, which I haven’t kept up with enough to summarize accurately. But one broad class is to agree that there is some kind of inductive bias (essentially all successful learning algorithms have one), while disagreeing with the Chomskyan parameter-fitting model and/or the Chomskyan hypothesis that the bias is language-specific. (A toy illustration of what “inductive bias” means here follows.)
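        As a minimal illustration of “inductive bias” (again my own toy example, not anything from the literature): even a perfectly consistent training set can leave many hypotheses standing, and what the learner predicts on unseen inputs is then decided by its built-in preferences, not by the data.

        ```python
        # Two hypotheses, both perfectly consistent with the training data:
        rules = {"double": lambda x: 2 * x, "add_two": lambda x: x + 2}
        train = [(2, 4)]  # the only example the learner ever sees

        consistent = {name: f for name, f in rules.items()
                      if all(f(x) == y for x, y in train)}

        # Both rules survive, but they disagree on new inputs; which one a
        # learner picks at x=10 is pure inductive bias, not evidence.
        print({name: f(10) for name, f in consistent.items()})
        # -> {'double': 20, 'add_two': 12}
        ```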

      2. 6

        Let’s say you performed this quite troubling scientific experiment: you leave a toddler alone for 10 years, completely isolated, and nobody talks to him during all those years.

        Unfortunately, this has happened before. Not as a science experiment, but through parental abuse and neglect.

        1. 5

          Frederick II didn’t exactly do it as a science experiment, in the sense that he wasn’t controlling variables, etc., but he did try to do the same thing on purpose.

        2. 3

          However, aren’t all skills learned anyway?

          Not true with animals, anyway.

          This happens all the time on a farm. An animal is born and never sees its parent or siblings or other animals of the same species (the parent dies, or whatever). The animal will still have many behaviors exactly as if it had been raised with animals of its own species.

        3. 3

          I don’t buy the inference they draw from their experimental results. Language is of course always learned, and it’s equally well observed that children build up from simple to advanced grammatical constructs over time, and tend to follow broadly similar patterns in terms of which aspects they learn in what order. That has been known for decades, if not centuries. If there is an innate language module in the brain, it could show the exact same patterns during language acquisition as observed here.