"Contains numerous simple examples and illustrative diagrams....For anyone seeking information about eigenvalue inclusion theorems, this book will be a great reference." --Mathematical Reviews This book studies the original results, and their extensions, of the Russian mathematician S.A. Geršgorin who wrote a seminal paper in 1931 on how to easily obtain estimates of all n eigenvalues (characteristic values) of any given n-by-n complex matrix.
This IMA Volume in Mathematics and its Applications, COMBINATORIAL AND GRAPH-THEORETICAL PROBLEMS IN LINEAR ALGEBRA, is based on the proceedings of a workshop that was an integral part of the 1991-92 IMA program on "Applied Linear Algebra." We are grateful to Richard Brualdi, George Cybenko, Alan George, Gene Golub, Mitchell Luskin, and Paul Van Dooren for planning and implementing the year-long program. We especially thank Richard Brualdi, Shmuel Friedland, and Victor Klee for organizing this workshop and editing the proceedings. The financial support of the National Science Foundation made the workshop possible. Avner Friedman, Willard Miller, Jr.

PREFACE

The 1991-1992 program of the Institute for Mathematics and its Applications (IMA) was Applied Linear Algebra. As part of this program, a workshop on Combinatorial and Graph-theoretical Problems in Linear Algebra was held on November 11-15, 1991. The purpose of the workshop was to bring together in an informal setting the diverse group of people who work on problems in linear algebra and matrix theory in which combinatorial or graph-theoretic analysis is a major component. Many of the participants of the workshop enjoyed the hospitality of the IMA for the entire fall quarter, in which the emphasis was discrete matrix analysis.
In the recently published book of E. Bodewig, Matrix Calculus, some results of the earlier parts of this paper are mentioned. It is stated there that they are of theoretical interest but have no practical value. In this paper it will be shown that they can easily be used for practical computations.
Matrix algebra is one of the most important areas of mathematics for data analysis and for statistical theory. This much-needed work presents the relevant aspects of the theory of matrix algebra for applications in statistics. It considers the various types of matrices encountered in statistics, such as projection matrices and positive definite matrices, and describes the special properties of those matrices. Finally, it covers numerical linear algebra, beginning with a discussion of the basics of numerical computations, and following up with accurate and efficient algorithms for factoring matrices, solving linear systems of equations, and extracting eigenvalues and eigenvectors.
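A minimal sketch of the three numerical tasks the description names (factoring a matrix, solving a linear system, extracting eigenvalues), using standard numpy/scipy routines; the matrix and right-hand side are our own illustration, not an example from the book.

```python
import numpy as np
from scipy import linalg

# Small symmetric positive definite system (values are our own illustration).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

# Factor once with Cholesky (valid because A is SPD), then reuse the factor.
c, lower = linalg.cho_factor(A)
x = linalg.cho_solve((c, lower), b)

# Eigenvalues and eigenvectors of the symmetric matrix.
evals, evecs = np.linalg.eigh(A)

print("solution x:", x, " residual:", np.linalg.norm(A @ x - b))
print("eigenvalues:", evals)
```

Factoring once and reusing the factor for repeated solves is the efficiency point such texts emphasize: the factorization costs O(n^3), each subsequent solve only O(n^2).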
A comprehensive, must-have handbook of matrix methods with a unique emphasis on statistical applications. This timely book, A Matrix Handbook for Statisticians, provides a comprehensive, encyclopedic treatment of matrices as they relate to both statistical concepts and methodologies. Written by an experienced authority on matrices and statistical theory, this handbook is organized by topic rather than mathematical developments and includes numerous references to both the theory behind the methods and the applications of the methods. A uniform approach is applied to each chapter, which contains four parts: a definition followed by a list of results; a short list of references to related topics in the book; one or more references to proofs; and references to applications. The use of extensive cross-referencing to topics within the book and external referencing to proofs allows for definitions to be located easily as well as interrelationships among subject areas to be recognized. A Matrix Handbook for Statisticians addresses the need for matrix theory topics to be presented together in one book and features a collection of topics not found elsewhere under one cover. These topics include:
- Complex matrices
- A wide range of special matrices and their properties
- Special products and operators, such as the Kronecker product (see the short example below)
- Partitioned and patterned matrices
- Matrix analysis and approximation
- Matrix optimization
- Majorization
- Random vectors and matrices
- Inequalities, such as probabilistic inequalities
Additional topics, such as rank, eigenvalues, determinants, norms, generalized inverses, linear and quadratic equations, differentiation, and Jacobians, are also included. The book assumes a fundamental knowledge of vectors and matrices, maintains a reasonable level of abstraction when appropriate, and provides a comprehensive compendium of linear algebra results with use or potential use in statistics. A Matrix Handbook for Statisticians is an essential, one-of-a-kind book for graduate-level courses in advanced statistical studies including linear and nonlinear models, multivariate analysis, and statistical computing. It also serves as an excellent self-study guide for statistical researchers.
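To make one of the named topics concrete, here is a brief illustration of the Kronecker product and its standard vec identity; the example matrices are our own, not taken from the handbook.

```python
import numpy as np

# Kronecker product: each entry a_ij of A scales a full copy of B.
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])
K = np.kron(A, B)
print(K)

# A standard identity (vec stacks columns): vec(B X A^T) = kron(A, B) vec(X).
X = np.array([[1.0, 0.0],
              [2.0, 1.0]])
vec = lambda M: M.reshape(-1, order="F")  # column-stacking vec operator
print(np.allclose(vec(B @ X @ A.T), K @ vec(X)))  # True
```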
In this much-expanded second edition, author Yair Shapira presents new applications and a substantial extension of the original object-oriented framework to make this popular and comprehensive book even easier to understand and use. It not only introduces the C and C++ programming languages, but also shows how to use them in the numerical solution of partial differential equations (PDEs). The book leads readers through the entire solution process, from the original PDE, through the discretization stage, to the numerical solution of the resulting algebraic system. The high level of abstraction available in C++ is particularly useful in the implementation of complex mathematical objects, such as unstructured meshes, sparse matrices, and multigrid hierarchies, often used in numerical modeling. The well-debugged and tested code segments implement the numerical methods efficiently and transparently in a unified object-oriented approach.
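The solution process the description outlines (PDE, discretization, resulting algebraic system, numerical solution) can be sketched compactly. The book itself develops this pipeline in C++; purely as an illustration of the stages, and with a model problem and names of our own choosing, here is the same pipeline for the 1D Poisson equation in Python.

```python
import numpy as np

# Stage 1: continuous problem -u''(x) = f(x) on (0, 1) with u(0) = u(1) = 0.
f = lambda x: np.pi**2 * np.sin(np.pi * x)  # chosen so the exact solution is sin(pi x)

# Stage 2: discretize with second-order central differences on n interior points,
# which yields the tridiagonal algebraic system A u = h^2 f.
n = 99
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
A = (np.diag(2.0 * np.ones(n))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1))

# Stage 3: numerically solve the algebraic system (a dense solve here; sparse
# storage and multigrid, as treated in the book, are the scalable route).
u = np.linalg.solve(A, h**2 * f(x))
print("max error vs. exact solution:", np.abs(u - np.sin(np.pi * x)).max())
```

The printed error shrinks roughly by a factor of four each time the grid spacing h is halved, reflecting the second-order accuracy of the discretization.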
This volume concisely presents fundamental ideas, results, and techniques in linear algebra and, mainly, matrix theory. Each chapter focuses on the results, techniques, and methods that are beautiful, interesting, and representative, followed by carefully selected problems. For many theorems, several different proofs are given. The only prerequisites are a decent background in elementary linear algebra and calculus.
This extraordinary three-volume work, written in an engaging and rigorous style by a world authority in the field, provides an accessible, comprehensive introduction to the full spectrum of mathematical and statistical techniques underpinning contemporary methods in data-driven learning and inference. This first volume, Foundations, introduces core topics in inference and learning, such as matrix theory, linear algebra, random variables, convex optimization, and stochastic optimization, and prepares students for studying their practical application in later volumes. A consistent structure and pedagogy are employed throughout this volume to reinforce student understanding, with over 600 end-of-chapter problems (including solutions for instructors), 100 figures, 180 solved examples, datasets, and downloadable Matlab code. Supported by sister volumes Inference and Learning, and unique in its scale and depth, this textbook sequence is ideal for early-career researchers and graduate students across many courses in signal processing, machine learning, statistical analysis, data science and inference.