Grammatical Framework is a programming language designed for writing grammars, capable of describing several languages in parallel. This thorough introduction demonstrates how to write grammars in Grammatical Framework and use them in applications such as tourist phrasebooks, spoken dialogue systems, and natural language interfaces. The examples and exercises address several languages, and readers are shown how to view their own languages from a computational perspective.
Implementing a programming language means bridging the gap from the programmer's high-level thinking to the machine's zeros and ones. If this is done in an efficient and reliable way, programmers can concentrate on the actual problems they have to solve, rather than on the details of machines. But understanding the whole chain from languages to machines is still an essential part of the training of any serious programmer. It will result in a more competent programmer, who will moreover be able to develop new languages. A new language is often the best way to solve a problem, and less difficult than it may sound. This book follows a theory-based practical approach, where theoretical models serve as blueprints for actual coding. The reader is guided to build compilers and interpreters in a well-understood and scalable way. The solutions are moreover portable to different implementation languages. Much of the actual code is automatically generated from a grammar of the language, using the BNF Converter tool. The rest can be written in Haskell or Java, for which the book gives detailed guidance, but with some adaptation also in C, C++, C#, or OCaml, which are supported by the BNF Converter. The main focus of the book is on standard imperative and functional languages: a subset of C++ and a subset of Haskell are the source languages, and the Java Virtual Machine is the main target. Simple Intel x86 native code compilation is shown to complete the chain from language to machine. The last chapter leaves the standard paths and explores the space of language design, ranging from minimal Turing-complete languages to human-computer interaction in natural language.
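The grammar-driven approach the book describes can be illustrated with a minimal sketch (not taken from the book, and in Python rather than Haskell or Java): a toy expression grammar, given in comments in BNF style, is parsed by a hand-written recursive-descent parser and evaluated directly. Tools like the BNF Converter generate this kind of parser code automatically from the grammar; all names below (`Parser`, `evaluate`, etc.) are ours.

```python
# Toy grammar, BNF style:
#   Exp  ::= Exp "+" Term | Term
#   Term ::= Term "*" Atom | Atom
#   Atom ::= Integer | "(" Exp ")"
# A hand-written recursive-descent parser/interpreter for it; BNFC-style
# tools generate equivalent code automatically from such a grammar.
import re

def tokenize(src):
    """Split the source into integer literals and operator/paren tokens."""
    return re.findall(r"\d+|[+*()]", src)

class Parser:
    def __init__(self, tokens):
        self.toks = tokens
        self.pos = 0

    def peek(self):
        return self.toks[self.pos] if self.pos < len(self.toks) else None

    def eat(self, tok):
        assert self.peek() == tok, f"expected {tok!r}, got {self.peek()!r}"
        self.pos += 1

    def exp(self):            # Exp ::= Term ("+" Term)*  (left recursion as a loop)
        value = self.term()
        while self.peek() == "+":
            self.eat("+")
            value += self.term()
        return value

    def term(self):           # Term ::= Atom ("*" Atom)*
        value = self.atom()
        while self.peek() == "*":
            self.eat("*")
            value *= self.atom()
        return value

    def atom(self):           # Atom ::= Integer | "(" Exp ")"
        if self.peek() == "(":
            self.eat("(")
            value = self.exp()
            self.eat(")")
            return value
        tok = self.peek()
        self.pos += 1
        return int(tok)

def evaluate(src):
    """Parse and interpret an arithmetic expression in one pass."""
    return Parser(tokenize(src)).exp()

print(evaluate("2+3*(4+1)"))   # prints 17
```

An interpreter like this evaluates during parsing; a compiler in the same style would instead emit code (e.g. JVM instructions) at each grammar rule.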
Bhaswati Bhattacharya Chakrabarti, a Professor of Philosophy at the University of North Bengal, is well versed in both Indian and Western philosophy. Her books and articles, particularly on Buddhism and Jayarāśi, are well received and appreciated by the scholarly world. She has lectured widely at universities in India and has published papers in many professional journals in India and abroad, apart from a few books. In 2004 she visited Paris under the Indo-French Cultural Exchange programme sponsored by the I.C.P.R. in connection with her post-doctoral work on 'Philosophy of Language: East-West Dialogue'. Recently she co-edited Śabdapramāṇa in Indian Philosophy.
This handbook provides an authoritative, critical survey of current research and knowledge in the grammar of the English language. The volume's expert contributors explore core topics in English grammar from a range of theoretical approaches, including the relationship between 'core' grammar and other areas of language.
This textbook introduces and explains the fundamental issues, major research questions, and current approaches in the study of grammaticalization - the development of new grammatical forms from lexical items, and of further grammatical functions from existing grammatical forms. Grammaticalization has been a vibrant research field in recent years, and has proven effective in explaining a wide range of phenomena; it has even been claimed that the only true language universals are diachronic, and are related to cross-linguistic processes of grammaticalization. The chapters provide a detailed account of the major issues in the field: foundational questions such as directionality, criteria and parameters of grammaticalization, and phases and cycles; the much-debated issue of the motivations behind grammaticalization, including the role of language contact and typological influences; the advantages and disadvantages of different theoretical approaches; and the relationship between grammaticalization and processes such as lexicalization, exaptation, and the development of discourse markers. Each chapter offers guidance on further reading, and concludes with study questions to encourage further discussion; there is also a glossary of key terminology in the field. Thanks to its comprehensive approach, the volume will serve as both a textbook for undergraduate and graduate students and a valuable reference work for researchers in the field.
Capitalizing on the by now widely accepted idea of the construction-specific and language-specific nature of grammatical relations, the editors of the volume developed a modern framework for systematically capturing all sorts of variation in grammatical relations. The central concepts of this framework are the notions of argument role and its referential properties, argument selector, and various conditions on argument selection. The contributors applied this framework in their descriptions of grammatical relations in individual languages and discussed its limitations and advantages. The result is a coherent description of grammatical relations in thirteen genealogically and geographically diverse languages, based on original and extensive fieldwork on under-described languages. The volume presents a far more detailed picture of the diversity of argument selectors, of the effects of predicates and of the referential properties of arguments, and of various clausal conditions on grammatical relations than previously published grammatical descriptions provide.
This book proposes a novel approach for time-series prediction using machine learning techniques with automatic feature generation. Application of machine learning techniques to predict time-series continues to attract considerable attention due to the difficulty of the prediction problems compounded by the non-linear and non-stationary nature of the real world time-series. The performance of machine learning techniques, among other things, depends on suitable engineering of features. This book proposes a systematic way for generating suitable features using context-free grammar. A number of feature selection criteria are investigated and a hybrid feature generation and selection algorithm using grammatical evolution is proposed. The book contains graphical illustrations to explain the feature generation process. The proposed approaches are demonstrated by predicting the closing price of major stock market indices, peak electricity load and net hourly foreign exchange client trade volume. The proposed method can be applied to a wide range of machine learning architectures and applications to represent complex feature dependencies explicitly when machine learning cannot achieve this by itself. Industrial applications can use the proposed technique to improve their predictions.
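The core mechanism described above can be sketched in a few lines of Python. This is an illustrative toy, not the book's actual algorithm: the grammar, the helper features (`lag`, `sma`), and the genome-to-expression mapping are all invented here, but the mapping style - successive integer "codons" selecting productions of a context-free grammar - is how grammatical evolution derives candidate feature expressions.

```python
# A context-free grammar over toy time-series features. Each nonterminal
# maps to a list of productions; productions are token lists so embedded
# nonterminals like "<n>" get expanded recursively.
GRAMMAR = {
    "<expr>": [["<expr>", " + ", "<expr>"], ["<feat>"]],
    "<feat>": [["lag(series, ", "<n>", ")"], ["sma(series, ", "<n>", ")"]],
    "<n>":    [["1"], ["2"], ["5"]],
}

def derive(genome, symbol="<expr>", state=None, depth=0):
    """Map an integer genome to a feature expression (grammatical evolution).

    Each codon genome[i] picks a production via codon % len(rules); the
    codon index wraps around, and deep recursion falls back to the last
    (terminal-leaning) rule so derivation always terminates.
    """
    if state is None:
        state = {"i": 0}
    if symbol not in GRAMMAR:
        return symbol                      # terminal token: emit as-is
    rules = GRAMMAR[symbol]
    if depth > 3:
        rule = rules[-1]
    else:
        codon = genome[state["i"] % len(genome)]
        state["i"] += 1
        rule = rules[codon % len(rules)]
    return "".join(derive(genome, s, state, depth + 1) for s in rule)

# Invented helper features: lagged value and simple moving average.
def lag(series, n):
    return series[-1 - n]

def sma(series, n):
    return sum(series[-n:]) / n

def evaluate_feature(expr, series):
    """Evaluate a derived feature expression over a series."""
    return eval(expr, {"lag": lag, "sma": sma, "series": series})

prices = [100.0, 101.5, 103.0, 102.0, 104.0, 105.5, 104.5, 106.0]
genome = [7, 2, 11, 3, 5]
expr = derive(genome)
print(expr, "=", evaluate_feature(expr, prices))  # prints: lag(series, 5) = 103.0
```

In the book's approach a selection criterion then scores such candidate features and an evolutionary loop mutates the genomes; here a single genome is decoded and evaluated to show the derivation step.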
Lexical Functional Grammar (LFG) is a nontransformational theory of linguistic structure, first developed in the 1970s by Joan Bresnan and Ronald M. Kaplan, which assumes that language is best described and modeled by parallel structures representing different facets of linguistic organization and information, related by means of functional correspondences. This volume has seven parts. Part I, Overview and Introduction, provides an introduction to core syntactic concepts and representations. Part II, Grammatical Phenomena, reviews LFG work on a range of grammatical phenomena or constructions. Part III, Grammatical Modules and Interfaces, provides an overview of LFG work on semantics, argument structure, prosody, information structure, and morphology. Part IV, Linguistic Disciplines, reviews LFG work in the disciplines of historical linguistics, learnability, psycholinguistics, and second language learning. Part V, Formal and Computational Issues and Applications, provides an overview of computational and formal properties of the theory, implementations, and computational work on parsing, translation, grammar induction, and treebanks. Part VI, Language Families and Regions, reviews LFG work on languages spoken in particular geographical areas or in particular language families. The final part, Comparing LFG with Other Linguistic Theories, discusses LFG work in relation to other theoretical approaches.
Head-Driven Phrase Structure Grammar (HPSG) is a constraint-based or declarative approach to linguistic knowledge, which analyses all descriptive levels (phonology, morphology, syntax, semantics, pragmatics) with feature-value pairs, structure sharing, and relational constraints. In syntax it assumes that expressions have a single, relatively simple constituent structure. This volume provides a state-of-the-art introduction to the framework. Various chapters discuss basic assumptions and formal foundations, describe the evolution of the framework, and go into the details of the main syntactic phenomena. Further chapters are devoted to non-syntactic levels of description. The book also considers related fields and research areas (gesture, sign languages, computational linguistics) and includes chapters comparing HPSG with other frameworks (Lexical Functional Grammar, Categorial Grammar, Construction Grammar, Dependency Grammar, and Minimalism).