The articles in this volume present the state of the art in a variety of areas of discrete probability, including random walks on finite and infinite graphs, random trees, renewal sequences, Stein's method for normal approximation, and Kohonen-type self-organizing maps. This volume also focuses on discrete probability and its connections with the theory of algorithms. Classical topics in discrete mathematics are represented, as are expositions that condense and make readable some recent work on Markov chains, potential theory, and the second moment method. This volume is suitable for mathematicians and students.
Graph Theory has proved to be an extremely useful tool for solving combinatorial problems in such diverse areas as Geometry, Algebra, Number Theory, Topology, Operations Research and Optimization. It is natural to attempt to generalise the concept of a graph, in order to attack additional combinatorial problems. The idea of looking at a family of sets from this standpoint took shape around 1960. In regarding each set as a ``generalised edge'' and in calling the family itself a ``hypergraph'', the initial idea was to try to extend certain classical results of Graph Theory such as the theorems of Turán and König. It was noticed that this generalisation often led to simplification; moreover, one single statement, sometimes remarkably simple, could unify several theorems on graphs. This book presents what seems to be the most significant work on hypergraphs.
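For readers new to the term, the standard definition that underlies this point of view fits in a single display; the following formal aside summarises that standard definition and is not a quotation from the book. A hypergraph is a pair
\[
  H = (V, \mathcal{E}), \qquad \mathcal{E} = \{E_1, \dots, E_m\}, \qquad \emptyset \neq E_i \subseteq V,
\]
so each ``generalised edge'' $E_i$ is simply a nonempty subset of the vertex set $V$; when every $E_i$ has exactly two elements, $H$ reduces to an ordinary graph.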
This book collects the extended abstracts of the contributions accepted to EuroComb21. A similar volume has been published for every edition of EuroComb (held every two years since 2001), collecting the most recent advances in combinatorics, graph theory, and related areas. These volumes reach a wide audience in these fields, and their papers are widely read and cited.
This book explores science and technology, makes connections between these epistemic, cultural, and political trends, and develops profound insights into the nature of our postmodernity.
Graph theory is a fascinating and inviting branch of mathematics. Many problems are easy to state and have natural visual representations, inviting exploration by new students and professional mathematicians alike. The goal of this textbook is to present the fundamentals of graph theory to a wide range of readers. The book contains many significant recent results in graph theory, presented using up-to-date notation. The author includes the shortest, most elegant, and most intuitive proofs for modern and classic results, frequently presenting them in new ways. Major topics are introduced with practical applications that motivate their development and are illustrated with examples showing how to apply major theorems in practice, including the process of finding a brute-force solution (case checking) when an elegant solution is not apparent. With over 1200 exercises, internet resources (e.g., the OEIS for counting problems), helpful appendices, and a detailed guide to different course outlines, this book provides a versatile and convenient tool for the needs of instructors at a wide variety of institutions.
There are many bits and pieces of folklore in mathematics that are passed down from advisor to student, or from collaborator to collaborator, but which are too fuzzy and nonrigorous to be discussed in the formal literature. Traditionally, it was a matter
An introduction to a broad range of topics in deep learning, covering mathematical and conceptual background, deep learning techniques used in industry, and research perspectives.

“Written by three experts in the field, Deep Learning is the only comprehensive book on the subject.” —Elon Musk, cochair of OpenAI; cofounder and CEO of Tesla and SpaceX

Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep.

This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models.

Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.
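As a purely illustrative aside (not drawn from the book itself), the "many layers deep" idea can be sketched in a few lines of NumPy: a two-layer network trained on the XOR problem, in which the hidden layer learns intermediate features and the output layer combines them into the final prediction. The network size, learning rate, and iteration count below are arbitrary choices made for this sketch.

import numpy as np

# A toy "hierarchy of concepts": a two-layer network whose hidden layer learns
# intermediate features and whose output layer combines them, trained on XOR.
rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # inputs
y = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)  # layer 1: inputs -> hidden features
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)  # layer 2: hidden features -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 2.0
for _ in range(10000):
    h = sigmoid(X @ W1 + b1)   # forward pass: hidden "concepts"
    p = sigmoid(h @ W2 + b2)   # forward pass: prediction built from those concepts
    # Backward pass: gradient of the mean squared error through both layers.
    dp = (p - y) * p * (1 - p)
    dh = (dp @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ dp / len(X)
    b2 -= lr * dp.mean(axis=0)
    W1 -= lr * X.T @ dh / len(X)
    b1 -= lr * dh.mean(axis=0)

print(p.round(2))  # typically close to [[0.], [1.], [1.], [0.]] after training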