Information structure and information theory

Author: Robin Lemke

Publisher: Language Science Press

Published: 2024-09-19

Total Pages: 248

ISBN-10: 3961104816

This volume results from the workshop "Discourse obligates – How and why discourse limits the way we express what we express" at the 44th Annual Meeting of the German Linguistic Society in Tübingen, Germany. The workshop brought together, and this book now brings together, information-structural and information-theoretic perspectives on optional variation between linguistic encodings. Linguistic phenomena such as linearization, the choice between syntactic constructions, and the distribution of ellipsis have previously been investigated from an information-structural or an information-theoretic perspective, but the relationship between these approaches remains underexplored. The goal of this book is to look in more detail at how information structure and information theory contribute to explaining linguistic variation, to what extent they explain different encoding choices, and whether they interact in doing so. Using experimental and corpus-based methods, the contributions investigate these questions across different languages, historical stages, and levels of linguistic analysis.
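
For readers coming to this from outside the literature: the information-theoretic accounts at issue are typically framed in terms of surprisal, the Shannon information content of a word in its context,

    s(w_i) = -\log_2 P(w_i \mid w_1, \ldots, w_{i-1}),

with the common working hypothesis that predictable (low-surprisal) material is a better candidate for reduction or omission than unpredictable material. This is offered as general orientation rather than as a summary of any particular contribution.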


Information Structure and Sentence Form

Author: Knud Lambrecht

Publisher: Cambridge University Press

Published: 1996-11-13

Total Pages: 406

ISBN-10: 1316582418

Why do speakers of all languages use different grammatical structures under different communicative circumstances to express the same idea? Professor Lambrecht explores the relationship between the structure of the sentence and the linguistic and extra-linguistic context in which it is used. His analysis is based on the observation that the structure of a sentence reflects a speaker's assumption about the hearer's state of knowledge and consciousness at the time of the utterance. This relationship between speaker assumptions and formal sentence structure is governed by rules and conventions of grammar, in a component called 'information structure'. Four independent but interrelated categories are analysed: presupposition and assertion, identifiability and activation, topic, and focus.


Structural Information Theory

Author: Emanuel Leeuwenberg

Publisher: Cambridge University Press

Published: 2013

Total Pages: 337

ISBN-10: 1107029600

A coherent and comprehensive theory of visual pattern classification with quantitative models, verifiable predictions and extensive empirical evidence.


Entropy and Information Theory

Author: Robert M. Gray

Publisher: Springer Science & Business Media

Published: 2013-03-14

Total Pages: 346

ISBN-10: 1475739826

This book is devoted to the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The eventual goal is a general development of Shannon's mathematical theory of communication, but much of the space is devoted to the tools and methods required to prove the Shannon coding theorems. These tools form an area common to ergodic theory and information theory and comprise several quantitative notions of the information in random variables, random processes, and dynamical systems. Examples are entropy, mutual information, conditional entropy, conditional information, and discrimination or relative entropy, along with the limiting normalized versions of these quantities such as entropy rate and information rate. Much of the book is concerned with their properties, especially the long-term asymptotic behavior of sample information and expected information. This is the only up-to-date treatment of traditional information theory emphasizing ergodic theory.
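
For reference, the quantities named above have the following standard forms in the discrete, single-variable case (the book itself develops them in the more general setting of random processes and dynamical systems):

    H(X) = -\sum_x p(x) \log p(x)                                  (entropy)
    H(X \mid Y) = -\sum_{x,y} p(x,y) \log p(x \mid y)              (conditional entropy)
    I(X;Y) = \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)}      (mutual information)
    D(p \| q) = \sum_x p(x) \log \frac{p(x)}{q(x)}                 (relative entropy, or discrimination)
    \bar{H} = \lim_{n \to \infty} \tfrac{1}{n} H(X_1, \ldots, X_n) (entropy rate)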


Information Theory of Molecular Systems

Author: Roman F Nalewajski

Publisher: Elsevier

Published: 2006-03-31

Total Pages: 463

ISBN-10: 0080459749

As well as providing a unified outlook on physics, Information Theory (IT) has numerous applications in chemistry and biology owing to its ability to provide a measure of the entropy/information contained within probability distributions and criteria of their information "distance" (similarity) and independence. Information Theory of Molecular Systems applies standard IT to classical problems in the theory of electronic structure and chemical reactivity. The book starts by introducing the basic concepts of modern electronic structure/reactivity theory based upon the Density Functional Theory (DFT), followed by an outline of the main ideas and techniques of IT, including several illustrative applications to molecular systems. Coverage includes information origins of the chemical bond, unbiased definition of molecular fragments, adequate entropic measures of their internal (intra-fragment) and external (inter-fragment) bond-orders and valence-numbers, descriptors of their chemical reactivity, and information criteria of their similarity and independence. Information Theory of Molecular Systems is recommended to graduate students and researchers interested in fresh ideas in the theory of electronic structure and chemical reactivity.

• Provides powerful tools for tackling both classical and new problems in the theory of the molecular electronic structure and chemical reactivity
• Introduces basic concepts of the modern electronic structure/reactivity theory based upon the Density Functional Theory (DFT)
• Outlines main ideas and techniques of Information Theory


Information Theory and Language

Author: Łukasz Dębowski

Publisher: MDPI

Published: 2020-12-15

Total Pages: 244

ISBN-10: 3039360264

“Information Theory and Language” is a collection of 12 articles that appeared recently in Entropy as part of a Special Issue of the same title. These contributions represent state-of-the-art interdisciplinary research at the interface of information theory and language studies. They concern in particular:

• Applications of information theoretic concepts such as Shannon and Rényi entropies, mutual information, and rate–distortion curves to the research of natural languages;
• Mathematical work in information theory inspired by natural language phenomena, such as deriving moments of subword complexity or proving continuity of mutual information;
• Empirical and theoretical investigation of quantitative laws of natural language such as Zipf’s law, Herdan’s law, and Menzerath–Altmann’s law;
• Empirical and theoretical investigations of statistical language models, including recently developed neural language models, their entropies, and other parameters;
• Standardizing language resources for statistical investigation of natural language;
• Other topics concerning semantics, syntax, and critical phenomena.

Whereas the traditional divide between probabilistic and formal approaches to human language, cultivated in the disjoint scholarships of natural sciences and humanities, has been blurred in recent years, this book can contribute to pointing out potential areas of future research cross-fertilization.
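
For orientation, two quantities that recur throughout the collection, stated in their textbook forms (the individual articles refine and test these baseline versions): Zipf's law says that the frequency of the r-th most frequent word of a corpus falls off roughly as a power law,

    f(r) \propto r^{-\alpha}, \quad \alpha \approx 1,

and the Rényi entropy of order \alpha generalizes the Shannon entropy,

    H_\alpha(X) = \frac{1}{1 - \alpha} \log \sum_x p(x)^{\alpha},

which recovers the Shannon entropy in the limit \alpha \to 1.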


Elements of Information Theory

Author: Thomas M. Cover

Publisher: John Wiley & Sons

Published: 2012-11-28

Total Pages: 788

ISBN-10: 1118585771

The latest edition of this classic is updated with new problem sets and material. The Second Edition of this fundamental textbook maintains the book's tradition of clear, thought-provoking instruction. Readers are provided once again with an instructive mix of mathematics, physics, statistics, and information theory. All the essential topics in information theory are covered in detail, including entropy, data compression, channel capacity, rate distortion, network information theory, and hypothesis testing. The authors provide readers with a solid understanding of the underlying theory and applications. Problem sets and a telegraphic summary at the end of each chapter further assist readers. The historical notes that follow each chapter recap the main points. The Second Edition features:

* Chapters reorganized to improve teaching
* 200 new problems
* New material on source coding, portfolio theory, and feedback capacity
* Updated references

Now current and enhanced, the Second Edition of Elements of Information Theory remains the ideal textbook for upper-level undergraduate and graduate courses in electrical engineering, statistics, and telecommunications.
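
As a small worked example of the kind of result developed at this level (a standard computation, not a quotation from the book): the capacity of a binary symmetric channel with crossover probability p is

    C = 1 - H_b(p), \qquad H_b(p) = -p \log_2 p - (1 - p) \log_2 (1 - p),

so for p = 0.1, H_b(0.1) \approx 0.469 and C \approx 0.531 bits per channel use; reliable communication is achievable at any rate below C and at no rate above it.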


Information Theory, Inference and Learning Algorithms

Author: David J. C. MacKay

Publisher: Cambridge University Press

Published: 2003-09-25

Total Pages: 694

ISBN-13: 9780521642989

Information theory and inference, taught together in this exciting textbook, lie at the heart of many important areas of modern technology - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics and cryptography. The book introduces theory in tandem with applications. Information theory is taught alongside practical communication systems such as arithmetic coding for data compression and sparse-graph codes for error-correction. Inference techniques, including message-passing algorithms, Monte Carlo methods and variational approximations, are developed alongside applications to clustering, convolutional codes, independent component analysis, and neural networks. Uniquely, the book covers state-of-the-art error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes - the twenty-first-century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, the book is ideal for self-learning, and for undergraduate or graduate courses. It also provides an unparalleled entry point for professionals in areas as diverse as computational biology, financial engineering and machine learning.
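
To make the compression side of this concrete, here is a minimal plain-Python sketch (not code from the book) of the Shannon entropy of a memoryless source, which is the bits-per-symbol limit that an ideal compressor such as an arithmetic coder approaches:

    import math

    def entropy_bits(probs):
        # Shannon entropy in bits of a discrete distribution.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A hypothetical four-symbol source, used only for illustration.
    source = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    print(entropy_bits(source.values()))  # 1.75 bits per symbol

An arithmetic coder driven by these probabilities would compress long output from this source to roughly 1.75 bits per symbol on average, which is the sense in which entropy sets the compression limit.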


Information Theory And Evolution (Third Edition)

Author: John Scales Avery

Publisher: World Scientific

Published: 2021-11-24

Total Pages: 329

ISBN-10: 9811250383

This highly interdisciplinary book discusses the phenomenon of life, including its origin and evolution, against the background of thermodynamics, statistical mechanics, and information theory. Among the central themes is the seeming contradiction between the second law of thermodynamics and the high degree of order and complexity produced by living systems. As the author shows, this paradox has its resolution in the information content of the Gibbs free energy that enters the biosphere from outside sources. Another focus of the book is the role of information in human cultural evolution, which is discussed alongside the origin of human linguistic abilities. One of the final chapters addresses the merging of information technology and biotechnology into a new discipline, bioinformation technology. This third edition has been updated to reflect the latest scientific and technological advances. Professor Avery draws on the perspectives of scholars such as Professor Noam Chomsky and Nobel Laureates John O'Keefe, May-Britt Moser, and Edvard Moser to cast light on the evolution of human languages. The mechanism of cell differentiation and the rapid acceleration of information technology in the 21st century are also discussed. With various research disciplines becoming increasingly interrelated today, Information Theory and Evolution adds nuance to the conversation between bioinformatics, information technology, and pertinent socio-political issues. This book is a welcome voice in addressing the future challenges that humanity will face as a result of scientific and technological progress.
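
For readers who want the thermodynamic background in symbols (these are the standard textbook relations behind the argument, not equations quoted from this book): the Gibbs free energy is G = H - TS, with enthalpy H, temperature T, and entropy S, and Boltzmann's relation S = k_B \ln W ties entropy to the number of accessible microstates W. A local decrease in entropy is compatible with the second law as long as it is paid for by free energy flowing in from outside the system, which is the resolution sketched above.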