This powerful, provocative survey is organized around the key issues of Afro-American history: Africa and slavery, family, religion, sex and racism, politics, economics, education, criminal justice, discrimination and protest movements, and black nationalism.
Long-memory processes are known to play an important part in many areas of science and technology, including physics, geophysics, hydrology, telecommunications, economics, finance, climatology, and network engineering. In the last 20 years enormous progress has been made in understanding the probabilistic foundations and statistical principles of such processes. This book provides a timely and comprehensive review, including a thorough discussion of mathematical and probabilistic foundations and statistical methods, emphasizing their practical motivation and mathematical justification. Proofs of the main theorems are provided and data examples illustrate practical aspects. This book will be a valuable resource for researchers and graduate students in statistics, mathematics, econometrics and other quantitative areas, as well as for practitioners and applied researchers who need to analyze data in which long memory, power laws, self-similar scaling or fractal properties are relevant.
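For orientation, a standard second-order characterization of long memory used throughout this literature (stated here for reference, not quoted from the book) is slowly decaying, non-summable autocovariances, or equivalently a spectral density that diverges at the origin:

```latex
% Second-order characterization of long-range dependence:
% non-summable autocovariances, equivalently a spectral pole at frequency zero.
\[
  \gamma(k) \sim c_{\gamma}\, |k|^{2d-1} \ (k \to \infty),
  \qquad
  f(\lambda) \sim c_{f}\, |\lambda|^{-2d} \ (\lambda \to 0),
  \qquad 0 < d < \tfrac{1}{2},
\]
\[
  \text{so that } \sum_{k=-\infty}^{\infty} |\gamma(k)| = \infty,
  \qquad H = d + \tfrac{1}{2}.
\]
```

Here d is the memory parameter and H the Hurst exponent; d = 0 corresponds to short memory, while power-law decay of the correlations with 0 < d < 1/2 is the long-memory regime discussed in the book.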
Statistical Methods for Long Term Memory Processes covers the diverse statistical methods and applications for data with long-range dependence. Presenting material that previously appeared only in journals, the author provides a concise and effective overview of probabilistic foundations, statistical methods, and applications. The material emphasizes basic principles and practical applications and provides an integrated perspective of both theory and practice. This book explores data sets from a wide range of disciplines, such as hydrology, climatology, telecommunications engineering, and high-precision physical measurement. The data sets are conveniently compiled in the index, and this allows readers to view statistical approaches in a practical context. Statistical Methods for Long Term Memory Processes also supplies S-PLUS programs for the major methods discussed. This feature allows the practitioner to apply long memory processes in daily data analysis. For newcomers to the area, the first three chapters provide the basic knowledge necessary for understanding the remainder of the material. To promote selective reading, the author presents the chapters independently. Combining essential methodologies with real-life applications, this outstanding volume is an indispensable reference for statisticians and scientists who analyze data with long-range dependence.
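The book's S-PLUS programs are not reproduced in this listing; purely as a minimal sketch of the kind of analysis it describes, the following Python snippet estimates the Hurst exponent of a series by rescaled-range (R/S) regression (the function names, block-doubling scheme, and minimum block size are illustrative choices, not the book's):

```python
import numpy as np

def rescaled_range(block):
    """R/S statistic for one block: range of cumulative deviations over the sample std."""
    y = np.cumsum(block - block.mean())
    r = y.max() - y.min()
    s = block.std(ddof=1)
    return r / s if s > 0 else np.nan

def hurst_rs(x, min_block=16):
    """Estimate the Hurst exponent H by regressing log(R/S) on log(block size)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, stats = [], []
    size = min_block
    while size <= n // 2:
        blocks = [x[i:i + size] for i in range(0, n - size + 1, size)]
        sizes.append(size)
        stats.append(np.nanmean([rescaled_range(b) for b in blocks]))
        size *= 2                      # double the block size each round
    slope, _ = np.polyfit(np.log(sizes), np.log(stats), 1)
    return slope                       # ~0.5 for short memory, >0.5 for long memory

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    print("H for white noise:", round(hurst_rs(rng.standard_normal(4096)), 2))
```

For a short-memory series the estimated H is close to 0.5; values appreciably above 0.5 indicate the long-range dependence the book is concerned with.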
The history is well known: On June 12, 1963, Mississippi's courageous NAACP chief, Medgar Evers, was gunned down by white supremacist Byron de la Beckwith. Tried twice by all-white juries, Beckwith escaped conviction for three decades. But then Mississippi began to confront its tormented past. And in the 1990s, when Beckwith was sent to jail by a crusading young prosecutor, the family of Medgar Evers finally got justice. Hailed as a New York Times Notable Book of the Year and a finalist for the Lillian Smith Award, Of Long Memory reveals how this remarkable reversal took place. Nossiter uses the tools of memory, history, and reportage—and the clear vantage point of an outsider, a Northerner—to portray an entire state quite literally summoning up its ghosts. A new epilogue discusses other civil rights cases now being reconsidered, and skillfully shows how the South is finding a way to create justice where none had existed before.
Assembles three different strands of long memory analysis: statistical literature on the properties of, and tests for, LRD processes; mathematical literature on the stochastic processes involved; and models from economic theory providing plausible micro foundations for the occurrence of long memory in economics.
Supervised sequence labelling is a vital area of machine learning, encompassing tasks such as speech, handwriting and gesture recognition, protein secondary structure prediction and part-of-speech tagging. Recurrent neural networks are powerful sequence learning tools—robust to input noise and distortion, able to exploit long-range contextual information—that would seem ideally suited to such problems. However, their role in large-scale sequence labelling systems has so far been auxiliary. The goal of this book is a complete framework for classifying and transcribing sequential data with recurrent neural networks only. Three main innovations are introduced in order to realise this goal. Firstly, the connectionist temporal classification output layer allows the framework to be trained with unsegmented target sequences, such as phoneme-level speech transcriptions; this is in contrast to previous connectionist approaches, which were dependent on error-prone prior segmentation. Secondly, multidimensional recurrent neural networks extend the framework in a natural way to data with more than one spatio-temporal dimension, such as images and videos. Thirdly, the use of hierarchical subsampling makes it feasible to apply the framework to very large or high resolution sequences, such as raw audio or video. Experimental validation is provided by state-of-the-art results in speech and handwriting recognition.
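As a hedged illustration of the first innovation (not code from the book), the Python sketch below applies a connectionist temporal classification loss to unsegmented target sequences via PyTorch's torch.nn.CTCLoss; the random tensor merely stands in for the output of a real recurrent network, and all dimensions are invented for the example:

```python
import torch
import torch.nn as nn

# Toy dimensions: T time steps, N sequences in the batch, C classes including
# the blank label (index 0 by default), L labels per target sequence.
T, N, C, L = 50, 4, 6, 10

# Stand-in for the output of a recurrent network; CTCLoss expects
# log-probabilities of shape (T, N, C).
rnn_scores = torch.randn(T, N, C, requires_grad=True)
log_probs = rnn_scores.log_softmax(dim=2)

# Unsegmented targets: label sequences only, with no alignment to time steps
# and no blanks (labels drawn from 1..C-1).
targets = torch.randint(low=1, high=C, size=(N, L))
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.full((N,), L, dtype=torch.long)

ctc_loss = nn.CTCLoss(blank=0)
loss = ctc_loss(log_probs, targets, input_lengths, target_lengths)
loss.backward()  # gradients would flow back into the real recurrent network
print(float(loss))
```

In a real system the log-probabilities would come from the network's output layer at every time step, and minimizing this loss trains the network without any prior segmentation of the transcription.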
How is information stored in and retrieved from long-term memory? It is argued that any systematic attempt to answer this question should be based on a specific set of representational assumptions that have led to the development of a new memory theory -- the connectivity model. One of the crucial predictions of this model is that, in sharp contrast to traditional theories, the speed of processing information increases as the amount and complexity of integrated knowledge increases. In this volume, the predictions of the model are examined by analyzing the results of a variety of different experiments and by studying the outcome of the simulation program CONN1, which illustrates the representation of complex semantic structures. In the final chapter, the representational assumptions of the connectivity model are evaluated on the basis of neuroanatomical and physiological evidence -- suggesting that neuroscience provides valuable knowledge which should guide the development of memory theories.
Long memory time series are characterized by a strong dependence between distant events. This book introduces readers to the theory and foundations of univariate time series analysis with a focus on long memory and fractional integration, which are embedded into the general framework. It provides a simple exposition of the basic time series material together with insight into the underlying technical aspects and methods of proof, and presents the general theory of time series, including some issues not treated in other books on time series, such as ergodicity, persistence versus memory, asymptotic properties of the periodogram, and Whittle estimation. Further chapters address the general functional central limit theory, parametric and semiparametric estimation of the long memory parameter, and locally optimal tests. Intuitive and easy to read, Time Series Analysis with Long Memory in View offers chapters covering Stationary Processes; Moving Averages and Linear Processes; Frequency Domain Analysis; Differencing and Integration; Fractionally Integrated Processes; Sample Means; Parametric Estimators; Semiparametric Estimators; and Testing, along with further topics. The book offers beginning-of-chapter examples as well as end-of-chapter technical arguments and proofs; contains many new results on long memory processes that have not appeared in previous textbooks; takes a basic mathematics (calculus) approach to the topic of time series analysis with long memory; and includes 25 illustrative figures as well as lists of notations and acronyms. Time Series Analysis with Long Memory in View is an ideal text for first-year PhD students, researchers, and practitioners in statistics, econometrics, and any application area that uses time series over a long period. It would also benefit researchers, undergraduates, and practitioners in those areas who require a rigorous introduction to time series analysis.
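To make fractional integration and semiparametric estimation of the memory parameter concrete, here is a minimal Python sketch (an illustration under simple assumptions, not code from the book): it simulates an ARFIMA(0, d, 0) series by truncating its moving-average representation and then recovers d with a log-periodogram (GPH-type) regression; the bandwidth m = sqrt(n) is an arbitrary choice made only for the example.

```python
import numpy as np

def arfima_0d0(n, d, burn=1000, seed=0):
    """Simulate ARFIMA(0, d, 0): long memory for 0 < d < 0.5.

    Uses a truncated MA(inf) representation x_t = sum_j psi_j * eps_{t-j},
    with psi_0 = 1 and psi_j = psi_{j-1} * (j - 1 + d) / j.
    """
    rng = np.random.default_rng(seed)
    m = n + burn
    psi = np.empty(m)
    psi[0] = 1.0
    for j in range(1, m):
        psi[j] = psi[j - 1] * (j - 1 + d) / j
    eps = rng.standard_normal(m)
    x = np.convolve(eps, psi)[:m]       # truncated infinite moving average
    return x[burn:]                     # drop the burn-in segment

def gph_estimate(x, m=None):
    """Log-periodogram (GPH-type) estimate of the memory parameter d."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    m = m or int(np.sqrt(n))                         # ad-hoc bandwidth choice
    lam = 2 * np.pi * np.arange(1, m + 1) / n        # Fourier frequencies near zero
    periodogram = np.abs(np.fft.fft(x - x.mean())[1:m + 1]) ** 2 / (2 * np.pi * n)
    regressor = np.log(4 * np.sin(lam / 2) ** 2)
    slope, _ = np.polyfit(regressor, np.log(periodogram), 1)
    return -slope

if __name__ == "__main__":
    series = arfima_0d0(4000, d=0.3)
    print("estimated d:", round(gph_estimate(series), 2))
```

For the simulated series with d = 0.3, the estimate should come out near 0.3, up to sampling error that shrinks as the series length and bandwidth grow; parametric and Whittle-type estimators of the kind treated in the book refine this basic idea.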