So I’ve pretty much given up on Syntactic Structures. It wasn’t really as great as I thought it would be. There was also some linguistics jargon that I didn’t feel like learning. Before I stopped, though, there were some interesting, if unsurprising, things in it (especially for its time).
Chomsky talked about modeling languages using what he called (I think; I’m not actually looking it up) “finite state Markov processes.” Apparently this was how linguists thought at the time. By today’s standards, I’m not entirely sure he would want the phrase “Markov process,” since that usually implies randomly switching from state to state. Clearly, when people speak, it isn’t random streams of words that come out (although it may seem that way sometimes).
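To make the idea concrete, here’s a toy sketch of the kind of finite-state, word-by-word model he was describing. The vocabulary and transition table are entirely made up for illustration; the point is just that each state only remembers the last word, and the next word is picked at random (the “Markov” part):

```python
import random

# Toy finite-state word model: each state is the last word emitted,
# and the next word is chosen at random from that state's options.
# The table below is invented purely for illustration.
transitions = {
    "<s>": ["the", "a"],
    "the": ["man", "ball"],
    "a": ["man", "ball"],
    "man": ["hit", "saw"],
    "ball": ["fell", "</s>"],
    "hit": ["the", "a"],
    "saw": ["the", "a"],
    "fell": ["</s>"],
}

def generate(max_words=10):
    """Walk the chain from the start state until </s> or the word cap."""
    state, words = "<s>", []
    while len(words) < max_words:
        state = random.choice(transitions[state])
        if state == "</s>":
            break
        words.append(state)
    return " ".join(words)

print(generate())
```

It produces locally plausible word sequences, which is exactly why the approach looked tempting, and also why its failures (long-distance dependencies) are the interesting part.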
Nevertheless, I assume that in today’s jargon he meant something like “nondeterministic finite state automata” to model language; at least, that’s what his description matches. Now, from basic theory of computation we know that if a language could be modeled by one of these, it would have to be a regular language, i.e., describable by a regular expression. No natural language is regular, and thus the myth of being able to model language this way is debunked.
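For anyone who hasn’t seen the machinery: a DFA is just a finite set of states and a transition table, with no memory beyond the current state. Here’s a minimal recognizer (the transition-table encoding is my own choice) for the regular language (ab)*, plus a string of the shape aⁿbⁿ, which is the classic example of a language no finite automaton can handle because it requires counting, the same shape as center-embedded clauses in natural language:

```python
# Minimal DFA recognizer: finite states, no memory beyond the current state.
def run_dfa(string, delta, start, accepting):
    state = start
    for ch in string:
        state = delta.get((state, ch))
        if state is None:  # no transition defined: reject
            return False
    return state in accepting

# DFA for the regular language (ab)*: two states, start/accept state 0.
delta = {(0, "a"): 1, (1, "b"): 0}

print(run_dfa("abab", delta, 0, {0}))  # True: (ab)* is regular
print(run_dfa("aabb", delta, 0, {0}))  # False: a^n b^n needs unbounded counting
```

No matter how many states you add, some n eventually exceeds what the machine can distinguish, which is the formal core of the argument.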
Chomsky obviously did not have that tool at his disposal, since he went on for pages considering different cases and why they wouldn’t work, concluding (in a pretty non-rigorous way) that you can’t model a language using an NFA (or a DFA, for that matter). Not surprising, but noteworthy, I’d say. It really was a paradigm shift to claim that natural language can’t be captured by that kind of grammar at all. Plenty of people are still trying to figure out how to do it: things like online translators, AI, and many others work from the assumption that it’s possible to get really close.
Since this little gem was in there, I feel like quitting is depriving me of some other interesting little tidbit I hadn’t thought about, but oh well.