Generative linguistics is the dominant school of linguistics in the Western world today. It first emerged in 1957, when linguist-cum-dissident Noam Chomsky published Syntactic Structures; before then, the dominant school in North America was structural linguistics.

The distinction lies primarily in how sentences are thought to be formed. Structural linguists saw a sentence as a string of concepts chained together to form a complete thought. Generative linguists hold that any given language's sentence structures can be captured by a finite set of rules, discovered empirically by testing which sentences sound natural to a native speaker and which do not. In theory, once these rules are worked out, it should be possible for anyone (even an AI) to produce grammatically correct sentences in any language, and to break any grammatical sentence down into its core meaning.

But again then... jumble even if I my words, understand you with difficulties of minimal... zuh?

Yeah, that's what I thought. It's an excellent theory on paper, and it works as long as you confine yourself to professionally written prose, but it doesn't hold water in real life, where people utter all sorts of semi-incoherent babble and are still understood. This, in essence, is the case cognitive linguistics makes for itself.