An exploration of human language from the perspective of the natural sciences, this outstanding book brings together leading specialists to discuss the scientific connection of language to disciplines such as mathematics, physics, chemistry and biology.
The view that language is in some way 'arbitrary', that there is no formal relationship between a linguistic message and the thought it is meant to convey, is long established and pervasive. The goal of John Haiman's study is to challenge the monopoly of arbitrariness, which he believes has significantly affected many models of linguistic description and analysis, notably those proposed by Saussure and more recently by Chomsky and his associates. Linguistic structures, Dr Haiman claims, may be compared to (non-linguistic) diagrams of our thoughts, and deviate from iconicity in many of the same ways and for much the same reasons as do diagrams in general. Arbitrariness develops as a result of the relatively familiar principles of economy, generalization and association. In relation to this thesis, Dr Haiman considers a wide variety of constructions, including conditionals and interrogatives, gapping, causative structures, auxiliaries and reflexives, and provides a wealth of exemplification from different languages that also points to typological differences in respect of iconicity.
Many NLP tasks have at their core a subtask of extracting the dependencies—who did what to whom—from natural language sentences. This task can be understood as the inverse of the problem solved in different ways by diverse human languages, namely, how to indicate the relationship between different parts of a sentence. Understanding how languages solve the problem can be extremely useful in both feature design and error analysis in the application of machine learning to NLP. Likewise, understanding cross-linguistic variation can be important for the design of MT systems and other multilingual applications. The purpose of this book is to present in a succinct and accessible fashion information about the morphological and syntactic structure of human languages that can be useful in creating more linguistically sophisticated, more language-independent, and thus more successful NLP systems. Table of Contents: Acknowledgments / Introduction/motivation / Morphology: Introduction / Morphophonology / Morphosyntax / Syntax: Introduction / Parts of speech / Heads, arguments, and adjuncts / Argument types and grammatical functions / Mismatches between syntactic position and semantic roles / Resources / Bibliography / Author's Biography / General Index / Index of Languages
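To make the "who did what to whom" target concrete, such dependencies are often represented as head–dependent–relation triples. The sketch below is a hypothetical illustration only: the sentence, the relation labels, and the arguments_of helper are invented for exposition and are not drawn from the book.

```python
# Toy illustration of a dependency-style "who did what to whom" analysis.
# The sentence and role inventory are invented for exposition; real systems
# derive such triples from parsed text rather than from a hand-written table.
from typing import Dict, List, Tuple

# (head, dependent, relation) triples for "The dog chased the cat."
Triple = Tuple[str, str, str]

analysis: List[Triple] = [
    ("chased", "dog", "subject"),  # who did it
    ("chased", "cat", "object"),   # whom it was done to
]

def arguments_of(predicate: str, triples: List[Triple]) -> Dict[str, str]:
    """Collect the arguments of a predicate, keyed by grammatical relation."""
    return {rel: dep for head, dep, rel in triples if head == predicate}

print(arguments_of("chased", analysis))  # {'subject': 'dog', 'object': 'cat'}
```

The point is only the shape of the representation: different languages signal these relations by word order, case marking, or agreement, while the extracted triples abstract away from that variation.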
Natural language processing (NLP) is a scientific discipline at the interface of computer science, artificial intelligence and cognitive psychology. Providing an overview of international work in this interdisciplinary field, this book gives the reader a panoramic view of both early and current research in NLP. Carefully chosen multilingual examples present the state of the art of a mature field that is in a constant state of evolution. In four chapters, this book presents the fundamental concepts of phonetics and phonology and the two most important applications in the field of speech processing: recognition and synthesis. Also presented are the fundamental concepts of corpus linguistics and the basic concepts of morphology and its NLP applications, such as stemming and part-of-speech tagging. The fundamental notions and the most important syntactic theories are presented, as well as the different approaches to syntactic parsing, with reference to cognitive models, algorithms and computer applications.
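Of the applications named above, stemming is the easiest to illustrate in a few lines: strip affixes so that inflected forms map to a shared stem. The following is a deliberately naive suffix-stripping sketch of the idea, not the Porter or Snowball algorithm a real system would use.

```python
# A deliberately naive suffix-stripping stemmer, sketched only to illustrate
# the idea behind stemming; production systems use the Porter or Snowball
# algorithms, which handle far more cases. Note that stems need not be
# dictionary words ("tagged" -> "tagg").
SUFFIXES = ("ing", "ed", "ly", "es", "s")

def stem(word: str) -> str:
    for suffix in SUFFIXES:
        # Only strip when a plausibly word-like stem (3+ letters) remains.
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

for w in ("parsing", "tagged", "syntax"):
    print(w, "->", stem(w))  # parsing -> pars, tagged -> tagg, syntax -> syntax
```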
A novel logic-based framework for representing the syntax-semantics interface of natural language, applicable to a range of phenomena. In this book, Yusuke Kubota and Robert Levine propose a type-logical version of categorial grammar as a viable alternative model of natural language syntax and semantics. They show that this novel logic-based framework is applicable to a range of phenomena—especially in the domains of coordination and ellipsis—that have proven problematic for traditional approaches. The type-logical syntax the authors propose takes derivations of natural language sentences to be proofs in a particular kind of logic governing the way words and phrases are combined. This logic builds on and unifies two deductive systems from the tradition of categorial grammar; the resulting system, Hybrid Type-Logical Categorial Grammar (Hybrid TLCG), enables comprehensive approaches to coordination (gapping, dependent cluster coordination, and right-node raising) and ellipsis (VP ellipsis, pseudogapping, and extraction/ellipsis interaction). It captures a number of intricate patterns of interaction between scopal operators and seemingly incomplete constituents that are frequently found in these two empirical domains. Kubota and Levine show that the hybrid calculus underlying their framework incorporates key analytic ideas from competing approaches in the generative syntax literature to offer a unified and systematic treatment of data that have posed considerable difficulties for previous accounts. Their account demonstrates that logic is a powerful tool for analyzing the deeper principles underlying the syntax and semantics of natural language.
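For readers unfamiliar with categorial grammar, the sense in which "derivations are proofs" can be illustrated with a standard Lambek-style derivation; this is a textbook example, not one taken from Kubota and Levine's system.

```latex
% Textbook categorial-grammar derivation of "John loves Mary" -- a standard
% Lambek-calculus example, not taken from Kubota and Levine's book.
% Lexicon: John : NP,  Mary : NP,  loves : (NP\S)/NP
\[
\frac{\textit{loves} : (NP\backslash S)/NP \qquad \textit{Mary} : NP}
     {\textit{loves Mary} : NP\backslash S}\;/E
\]
\[
\frac{\textit{John} : NP \qquad \textit{loves Mary} : NP\backslash S}
     {\textit{John loves Mary} : S}\;\backslash E
\]
```

Proving that the whole string has category S just is parsing the sentence, and the proof term simultaneously determines its semantic composition.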
Ever since Chomsky laid the framework for a mathematically formal theory of syntax, two classes of formal models have held wide appeal. The finite state model offered simplicity. At the opposite extreme, numerous very powerful models, most notably transformational grammar, offered generality. As soon as this mathematical framework was laid, devastating arguments were given by Chomsky and others indicating that the finite state model was woefully inadequate for the syntax of natural language. In response, the completely general transformational grammar model was advanced as a suitable vehicle for capturing the description of natural language syntax. While transformational grammar seems likely to be adequate to the task, many researchers have advanced the argument that it is "too adequate." A now classic result of Peters and Ritchie shows that the model of transformational grammar given in Chomsky's Aspects [1] is powerful indeed: so powerful as to allow it to describe any recursively enumerable set. In other words, it can describe the syntax of any language that is describable by any algorithmic process whatsoever. This situation led many researchers to reassess the claim that natural languages are included in the class of transformational grammar languages. The conclusion that many reached is that the claim is void of content, since, in their view, it says little more than that natural language syntax is doable algorithmically and, in the framework of modern linguistics, psychology or neuroscience, that is axiomatic.
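The "devastating arguments" alluded to here are the classic non-regularity arguments; a standard textbook reconstruction (not a quotation from this book) runs as follows.

```latex
% Standard reconstruction of the finite-state inadequacy argument.
% Center-embedded constructions like nested "if S then S" induce matched
% dependencies of the shape
\[
  L \;=\; \{\, a^{n} b^{n} \mid n \geq 1 \,\},
\]
% and L is not regular: a finite automaton with k states must repeat a
% state while reading the a's of $a^{k} b^{k}$, so by the pumping lemma it
% also accepts $a^{k+m} b^{k} \notin L$ for some $m \geq 1$.
```

At the other extreme, the Peters–Ritchie result shows Aspects-style transformational grammars generate exactly the recursively enumerable sets, so the claim "natural languages are transformational-grammar languages" excludes nothing computable.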
In this daring book, the author proposes that artistic and literary forms can be understood as modulations of wave forms in the physical world. By the phrase "natural syntax," he means that physical nature enters human communication literally by way of a transmitting wave frequency. This premise addresses a central question about symbolism in this century: How are our ideas symbolically related to physical reality? The author outlines a theory of communication in which nature is not reached by reference to an object; rather, nature is part of the message known only tacitly as the wavy carrier of a sign or signal. One doesn't refer to nature, even though one might be aiming to; one refers with nature as carrier vehicle. The author demonstrates that a natural language of transmission has an inherent physical syntax of patterned wave forms, which can also be described as certain "laws of form," a phrase used by D'Arcy Thompson, L. L. Whyte, Noam Chomsky, and Stephen Jay Gould. He describes a syntax inherent in natural languages that derives from the rhythmic form of a propelling wave. Instead of the "laws" of a wave's form, however, the author speaks of its elements of rhythmic composition, because "rythmos" means "wave" in Greek and because "composition" describes the creative process across the arts. In pursuing a philosophy of rhythmic composition, the author draws on cognitive science and semiotics. But he chiefly employs symmetry theory to describe the forms of art, and especially the patterns of poetry, as structures built upon the natural syntax of wave forms. Natural syntax, it turns out, follows a fascinating group of symmetry transformations that derive from wave forms.
By formalizing recent syntactic theories for natural languages in the tradition of Chomsky's Barriers, Stabler shows how their complexity can be handled without guesswork or oversimplification. He introduces logical representations of these theories together with special deductive techniques for exploring their consequences that will provide linguists with a valuable tool for deriving and testing theoretical predictions and for experimenting with alternative formulations of grammatical principles. Stabler's novel approach allows results to be deduced with straightforward calculations and provides a systematic framework for tackling the problem of how speakers can infer the properties of an utterance from principles of the grammar. The special treatment of equality, induction principles, and inclusion of a general method for collecting structures from proofs mean that sophisticated linguistic arguments can be carried out in detail, giving a rich perspective on issues in linguistic theory and parsing.
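As a loose illustration of what deriving consequences from a logical representation of a grammar can look like, here is a toy forward-chaining sketch; the rule format, lexicon, and categories function are invented for exposition and are not Stabler's formalization, which uses first-order logic with special treatments of equality and induction.

```python
# Toy mechanical deduction over grammar rules -- a loose illustration of
# deriving consequences from a declarative grammar representation, not
# Stabler's actual Prolog-based formalization.

# Context-free-style rules as (lhs, (rhs...)) pairs, plus lexical facts.
RULES = [("S", ("NP", "VP")), ("VP", ("V", "NP"))]
LEXICON = {"John": "NP", "Mary": "NP", "praises": "V"}

def categories(words):
    """CKY-style closure: compute every derivable (span, category) fact."""
    n = len(words)
    chart = {(i, i + 1): {LEXICON[w]} for i, w in enumerate(words)}
    for width in range(2, n + 1):
        for i in range(n - width + 1):
            cell = chart.setdefault((i, i + width), set())
            for k in range(i + 1, i + width):
                for lhs, (a, b) in RULES:
                    if a in chart.get((i, k), set()) and b in chart.get((k, i + width), set()):
                        cell.add(lhs)
    return chart

print(categories("John praises Mary".split())[(0, 3)])  # {'S'}
```

The deduction is exhaustive and mechanical: every derivable category for every span follows from the rules alone, which is the spirit, if not the substance, of calculating a theory's predictions rather than guessing at them.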