This book is for anyone who is interested in the relationship between grammar and vocabulary. The introduction looks at recent developments in corpus linguistics and second language acquisition research, and outlines the important role that chunks play in textual cohesion and fluency, as well as in grammar acquisition. The practical part of the book provides practitioners with a large number of classroom suggestions and activities for making grammar teaching more lexical, and for making vocabulary practice more grammatical. Activities move from receptive to productive and can be used on their own or to supplement and enhance coursebook content.
The Cambridge Handbook of English Corpus Linguistics (CHECL) surveys the breadth of corpus-based linguistic research on English, including chapters on collocations, phraseology, grammatical variation, historical change, and the description of registers and dialects. The most innovative aspects of the CHECL are its emphasis on critical discussion, its explicit evaluation of the state of the art in each sub-discipline, and the inclusion of empirical case studies. While each chapter includes a broad survey of previous research, the primary focus is on a detailed description of the most important corpus-based studies in this area, with discussion of what those studies found, and why they are important. Each chapter also includes a critical discussion of the corpus-based methods employed for research in this area, as well as an explicit summary of new findings and discoveries.
This book describes an approach to lexis and grammar based on the concept of phraseology and of language patterning arising from work on large corpora. The notion of 'pattern' as a systematic way of dealing with the interface between lexis and grammar was used in the Collins Cobuild English Dictionary (1995) and in the two books in the Collins Cobuild Grammar Patterns series (1996; 1998). This volume describes the research that led to these publications, and explores the theoretical and practical implications of that research. The first chapter situates the research within the context of work on phraseology. The next two chapters give several examples of patterns and how they are identified. Chapters 4 and 5 discuss and exemplify the association of pattern and meaning. Chapters 6, 7 and 8 relate the concept of pattern to traditional approaches to grammar and to discourse. Chapter 9 summarizes the book and adds to the theoretical discussion, as well as indicating the applications of this approach to language teaching. The volume is intended to contribute to the current debate concerning how corpora challenge existing linguistic theories, and as such will be of interest to researchers in the fields of grammar, lexis, discourse and corpus linguistics. It is written in an accessible style, however, and will be equally suitable for students taking courses in those areas.
Lexical-Functional Grammar was first developed by Joan Bresnan and Ronald M. Kaplan in the late 1970s. It was designed as a medium for expressing and explaining important generalisations about the syntax of human languages and thus as a vehicle for independent linguistic research. An equally important goal was to provide a restricted, mathematically tractable notation that could be interpreted by psychologically plausible and computationally efficient processing mechanisms. The formal architecture of LFG provides a simple set of devices for describing the common properties of all human languages and the particular properties of individual languages. This volume presents work conducted over the past several years at the Xerox Palo Alto Research Center, Stanford University, and elsewhere. The different sections link mathematical and computational issues with the analysis of particular linguistic phenomena in areas such as wh-constructions, anaphoric binding, word order, and coordination.
This introduction to and overview of the "glue" approach is the first book to bring together the research of the major contributors to the field. A new, deductive approach to the syntax-semantics interface integrates two mature and successful lines of research: logical deduction for semantic composition and the Lexical Functional Grammar (LFG) approach to the analysis of linguistic structure. It is often referred to as the "glue" approach because of the role of logic in "gluing" meanings together. The approach has attracted significant attention from, among others, logicians working in the relatively new and active field of linear logic; linguists interested in a novel deductive approach to the interface between syntax and semantics within a nontransformational, constraint-based syntactic framework; and computational linguists and computer scientists interested in an approach to semantic composition that is grounded in a conceptually simple but powerful computational framework. Contributors: Richard Crouch, Mary Dalrymple, John Fry, Vineet Gupta, Mark Johnson, Andrew Kehler, John Lamping, Dick Oehrle, Fernando Pereira, Vijay Saraswat, Josef van Genabith.
This book presents an overview of and introduction to Lexical Functional Grammar (LFG), a theory of the content and representation of different aspects of linguistic structure and of the relations that hold between them. It also presents a theory of semantics and of the syntax-semantics interface.
With this textbook, Yehuda N. Falk provides an introduction to the theory of Lexical-Functional Grammar, aimed at both students and professionals who are familiar with other generative theories and now wish to approach LFG. Falk examines LFG's relation to more conventional theories, such as Government/Binding and the Minimalist Program, and, in many respects, establishes its superiority.
A lexically based, corpus-driven theoretical approach to meaning in language that distinguishes between patterns of normal use and creative exploitations of norms. In Lexical Analysis, Patrick Hanks offers a wide-ranging empirical investigation of word use and meaning in language. The book fills the need for a lexically based, corpus-driven theoretical approach that will help people understand how words go together in collocational patterns and constructions to make meanings. Such an approach is now possible, Hanks writes, because of the availability of new forms of evidence (corpora, the Internet) and the development of new methods of statistical analysis and inferencing. Hanks offers a new theory of language, the Theory of Norms and Exploitations (TNE), which makes a systematic distinction between normal and abnormal usage—between rules for using words normally and rules for exploiting such norms in metaphor and other creative use of language. Using hundreds of carefully chosen citations from corpora and other texts, he shows how matching each use of a word against established contextual patterns plays a large part in determining the meaning of an utterance. His goal is to develop a coherent and practical lexically driven theory of language that takes into account the immense variability of everyday usage and that shows that this variability is rule governed rather than random. Such a theory will complement other theoretical approaches to language, including cognitive linguistics, construction grammar, generative lexicon theory, priming theory, and pattern grammar.
Do you hate teaching some aspects of grammar? Do you ever feel frustrated that your students just don’t get it? Well, in Grammar Nonsense, Andrew Walkley and Hugh Dellar argue that you shouldn’t really blame yourself. The fault lies largely with the way grammar rules and methods have been passed down through training and published material and become established as the way of doing things: a straitjacket that we need to escape from. Through an entertaining series of rants and meditations on all things grammatical, from the use of the word grammar to the horror of teaching verb patterns, they aim to pull apart rules which we give without thinking and to question approaches to practice that are seen as a must. Along the way, you’ll not only learn how published materials get written and about ideas such as the transformation fallacy and grammar olives, but you’ll also get plenty of practical suggestions as to what to do about all this nonsense.