There is a well-known classification of grammars into four types of increasing restrictiveness, from unrestricted to regular (the Chomsky hierarchy). These grammar classes correspond to four classes of automata in computer science:
Regular grammars correspond to finite automata;
Context-free grammars correspond to push-down automata;
Context-sensitive grammars correspond to linearly bounded automata;
Unrestricted (general phrase-structure) grammars correspond to Turing machines.
IMO, this correspondence is astonishing, and its consequences have not yet been fully explored; the only comparable correspondence I know of is the Curry-Howard-Lambek correspondence (between computer science, logic, and category theory).
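To make the first two levels of this correspondence concrete, here is a minimal Python sketch (my own illustration, not anything from Chomsky): a recognizer for the context-free language a^n b^n, which no finite automaton can accept but which a push-down automaton handles with a single stack.

```python
def accepts_anbn(s: str) -> bool:
    """Recognize { a^n b^n : n >= 0 }, the textbook context-free language.

    A finite automaton cannot accept it, because it would need unbounded
    memory to count the a's; a push-down automaton can, by pushing one
    stack symbol per 'a' and popping one per 'b'.
    """
    stack = []
    i = 0
    # Push phase: consume the leading a's.
    while i < len(s) and s[i] == "a":
        stack.append("A")
        i += 1
    # Pop phase: every b must cancel exactly one pushed symbol.
    while i < len(s) and s[i] == "b":
        if not stack:
            return False
        stack.pop()
        i += 1
    # Accept only if the whole input was consumed and the counts match.
    return i == len(s) and not stack


assert accepts_anbn("aaabbb")
assert not accepts_anbn("aaabb")   # counts don't match
assert not accepts_anbn("abab")    # wrong shape
```

By contrast, the regular language a*b* (any number of a's followed by any number of b's, with no matching of counts) needs no stack at all; it is exactly the requirement to match the counts that pushes the language up one level of the hierarchy.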
However, Chomsky then makes a somewhat vague statement: he claims that the capacity for "recursion" in language (I don't quite understand what is meant here; general recursion, perhaps?) is encoded in the human genes and that other animals lack this capacity. With that in mind, I have three questions:
If by "recursion" Chomsky means that some class of grammars is capable of it, which one? Perhaps unlimited one or... what?
For recursion to manifest itself in a system, all it takes is a simple feedback loop! And that requires no special genes and no unique design of the human brain: feedback loops exist even in the most primitive nematodes.
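To make this point about feedback loops concrete, here is a tiny, purely illustrative sketch: embedded, "recursive-looking" output produced by nothing more than a plain loop that feeds its own output back in as the next input, with no recursive machinery at all.

```python
def embed_right(depth: int) -> str:
    """Build an embedded sentence with a plain feedback loop: each pass
    feeds the previous output back in as the new complement clause."""
    sentence = "it is raining"
    for _ in range(depth):
        sentence = f"she thinks that {sentence}"
    return sentence


print(embed_right(3))
# she thinks that she thinks that she thinks that it is raining
```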
If there really are language constructs that humans can understand but that, say, dolphins and monkeys cannot, what are those constructs? Which specifically "recursive" expressions are we talking about? (Or does Chomsky simply want to arbitrarily single out humans as a separate group?)
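For what it's worth, the construction most often cited in these discussions (and I am only guessing that this is what Chomsky has in mind) is self-embedding: a category that reappears in the middle of its own expansion, as in center-embedded relative clauses. A minimal, purely illustrative sketch with a made-up toy lexicon:

```python
NOUNS = ["dog", "cat", "mouse"]
VERBS = ["chased", "saw"]


def noun_phrase(depth: int) -> str:
    """Expand the self-referential rule NP -> "the" N ("that" NP V):
    the category NP reappears inside its own expansion (center-embedding)."""
    noun = NOUNS[depth % len(NOUNS)]
    if depth == 0:
        return f"the {noun}"
    verb = VERBS[depth % len(VERBS)]
    # The embedded NP sits in the middle of the outer phrase.
    return f"the {noun} that {noun_phrase(depth - 1)} {verb}"


print(noun_phrase(1) + " ran")  # the cat that the dog saw ran
print(noun_phrase(2) + " ran")  # the mouse that the cat that the dog saw chased ran
```

Unlike the loop-built, right-embedded example above, producing or parsing these center-embedded phrases requires remembering all the pending material, i.e. stack-like memory, which is exactly the push-down level of the hierarchy. Whether that is what Chomsky means by a uniquely human "recursion", I don't know.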
Thanks in advance.