Structure

Concepts, Consequences, Interactions

About

Natural phenomena, including human language, are not just series of events but are organized quasi-periodically; sentences have structure, and that structure matters.

Howard Lasnik and Juan Uriagereka “were there” when generative grammar was being developed into the Minimalist Program. In this presentation of the universal aspects of human language as a cognitive phenomenon, they rationally reconstruct syntactic structure. In the process, they touch upon structure dependency and its consequences for learnability, nuanced arguments (including global ones) for structure presupposed in standard linguistic analyses, and a formalism to capture long-range correlations. For practitioners, the authors assess whether “all we need is Merge,” while for outsiders, they summarize what needs to be covered when attempting to have structure “emerge.”

Reconstructing the essential history of what is at stake when arguing for sentence scaffolding, the authors cover a range of larger issues: from the traditional computational notion of structure (the strong generative capacity of a system) and how far down into words it reaches, to whether its variants, as evident across the world’s languages, can arise from non-generative systems. Their perspective stems from Noam Chomsky’s work, but they engage it critically, separating rhetoric from results. They consider their work empirical, with the formalism serving only as a tool to guide their research (of course, they want sharp tools that can be falsified and have predictive power). Reaching out to skeptics, they invite potential collaborations that could arise from mutual examination of one another’s work, as they attempt to establish a dialogue beyond generative grammar.

Authors

Howard Lasnik is Distinguished University Professor of Linguistics at the University of Maryland. Juan Uriagereka is Professor of Linguistics and Director of the School of Languages, Literatures, & Cultures at the University of Maryland.

Table of Contents

Preface ix
1 Investigating Structure 1
2 Learnability Matters 37
3 Locality and Beyond 61
4 Reducing Reduced Phrase Markers 99
5 Structural Variation, Language Acquisition, and Machine Learning 135
6 Conclusions and Future Research 171
Notes 183
References 207
Index 225