Author of the acclaimed work Iceberg Risk: An Adventure in Portfolio Theory, Kent Osband argues that uncertainty is central rather than marginal to finance. Markets don't trade mainly on changes in risk. They trade on changes in beliefs about risk, and in the process, markets unite, stretch, and occasionally defy beliefs. Recognizing this truth would make a world of difference in investing. Belittling uncertainty has created a rift between financial theory and practice and within finance theory itself, misguiding regulation and stoking huge financial imbalances. Sparking a revolution in the mindset of the investment professional, Osband recasts the market as a learning machine rather than a knowledge machine. The market continually errs, corrects itself, and makes new errors. Respecting that process, without idolizing it, will promote wiser investment, trading, and regulation. With uncertainty embedded at its core, Osband's rational approach points to a finance theory worthy of twenty-first-century investing.
This book is an introduction to financial mathematics. It is intended for graduate students in mathematics and for researchers working in academia and industry. The focus on stochastic models in discrete time has two immediate benefits. First, the probabilistic machinery is simpler, and one can discuss right away some of the key problems in the theory of pricing and hedging of financial derivatives. Second, the paradigm of a complete financial market, where all derivatives admit a perfect hedge, becomes the exception rather than the rule. Thus, the need to confront the intrinsic risks arising from market incompleteness appears at a very early stage. The first part of the book contains a study of a simple one-period model, which also serves as a building block for later developments. Topics include the characterization of arbitrage-free markets, preferences on asset profiles, an introduction to equilibrium analysis, and monetary measures of financial risk. In the second part, the idea of dynamic hedging of contingent claims is developed in a multiperiod framework. Topics include martingale measures, pricing formulas for derivatives, American options, superhedging, and hedging strategies with minimal shortfall risk. This fourth, newly revised edition contains more than one hundred exercises. It also includes material on risk measures and the related issue of model uncertainty, in particular a chapter on dynamic risk measures and sections on robust utility maximization and on efficient hedging with convex risk measures. Contents: Part I (Mathematical finance in one period): arbitrage theory; preferences; optimality and equilibrium; monetary measures of risk. Part II (Dynamic hedging): dynamic arbitrage theory; American contingent claims; superhedging; efficient hedging; hedging under constraints; minimizing the hedging error; dynamic risk measures.
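The one-period setting described above can be illustrated with a minimal sketch (the numbers are hypothetical, not taken from the book): in a two-state binomial model with up factor u, down factor d, and riskless rate r, the market is arbitrage-free exactly when d < 1 + r < u, and the unique risk-neutral probability q = (1 + r - d) / (u - d) then prices any contingent claim.

```python
# One-period binomial market: a stock S0 moves to S0*u or S0*d while a
# riskless bond grows by the factor (1 + r). The market is arbitrage-free
# iff d < 1 + r < u; the risk-neutral measure then gives the unique price.
def risk_neutral_price(S0, u, d, r, payoff):
    """Price a contingent claim payoff(S1) in a one-period binomial model."""
    assert d < 1 + r < u, "parameters admit arbitrage"
    q = (1 + r - d) / (u - d)          # risk-neutral up-probability
    return (q * payoff(S0 * u) + (1 - q) * payoff(S0 * d)) / (1 + r)

# Hypothetical example: call option with strike 100, S0=100, u=1.2, d=0.8, r=0
price = risk_neutral_price(100, 1.2, 0.8, 0.0, lambda s: max(s - 100.0, 0.0))
```

With only two future states, every claim also admits a perfect replicating hedge; this is precisely the "complete market" case that, as the blurb notes, becomes the exception rather than the rule in richer discrete-time models.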
An introduction to a broad range of topics in deep learning, covering mathematical and conceptual background, deep learning techniques used in industry, and research perspectives. “Written by three experts in the field, Deep Learning is the only comprehensive book on the subject.” —Elon Musk, cochair of OpenAI; cofounder and CEO of Tesla and SpaceX Deep learning is a form of machine learning that enables computers to learn from experience and understand the world in terms of a hierarchy of concepts. Because the computer gathers knowledge from experience, there is no need for a human computer operator to formally specify all the knowledge that the computer needs. The hierarchy of concepts allows the computer to learn complicated concepts by building them out of simpler ones; a graph of these hierarchies would be many layers deep. This book introduces a broad range of topics in deep learning. The text offers mathematical and conceptual background, covering relevant concepts in linear algebra, probability theory and information theory, numerical computation, and machine learning. It describes deep learning techniques used by practitioners in industry, including deep feedforward networks, regularization, optimization algorithms, convolutional networks, sequence modeling, and practical methodology; and it surveys such applications as natural language processing, speech recognition, computer vision, online recommendation systems, bioinformatics, and videogames. Finally, the book offers research perspectives, covering such theoretical topics as linear factor models, autoencoders, representation learning, structured probabilistic models, Monte Carlo methods, the partition function, approximate inference, and deep generative models. Deep Learning can be used by undergraduate or graduate students planning careers in either industry or research, and by software engineers who want to begin using deep learning in their products or platforms. A website offers supplementary material for both readers and instructors.
This book examines non-Gaussian distributions. It addresses the causes and consequences of non-normality and time dependency in both asset returns and option prices. The book is written for non-mathematicians who want to model financial market prices, so the emphasis throughout is on practice. There are abundant empirical illustrations of the models and techniques described, many of which could be equally applied to other financial time series.
Developing techniques for assessing various risks and calculating probabilities of ruin and survival is an exciting topic for mathematically inclined academics. For practicing actuaries and financial engineers, the resulting insights have provided enormous opportunities but also created serious challenges to overcome, thus facilitating closer cooperation between industry and academic institutions. In this book, several renowned researchers with extensive interdisciplinary research experience share their thoughts that, in one way or another, contribute to the betterment of practice and theory of decision making under uncertainty. Behavioral, cultural, mathematical, and statistical aspects of risk assessment and modelling are explored and often illustrated using real and simulated data. Topics range from financial and insurance risks to security-type risks, and from one-dimensional to multi- and even infinite-dimensional risks. The articles in the book were written with a broad audience in mind and should provide enjoyable reading for those with university-level degrees and/or those who have studied for accreditation by various actuarial and financial societies.
Rcpp is the glue that binds the power and versatility of R with the speed and efficiency of C++. With Rcpp, the transfer of data between R and C++ is nearly seamless, and high-performance statistical computing is finally accessible to most R users. Rcpp should be part of every statistician's toolbox. -- Michael Braun, MIT Sloan School of Management "Seamless R and C++ Integration with Rcpp" is simply a wonderful book. For anyone who uses C/C++ and R, it is an indispensable resource. The writing is outstanding. A huge bonus is the section on applications. This section covers the matrix packages Armadillo and Eigen and the GNU Scientific Library as well as RInside, which enables you to use R inside C++. These applications are what most of us need to know to really do scientific programming with R and C++. I love this book. -- Robert McCulloch, University of Chicago Booth School of Business Rcpp is now considered an essential package for anybody doing serious computational research using R. Dirk's book is an excellent companion and takes the reader from a gentle introduction to more advanced applications via numerous examples and efficiency-enhancing gems. The book is packed with all you might have ever wanted to know about Rcpp, its cousins (RcppArmadillo, RcppEigen, etc.), modules, package development, and sugar. Overall, this book is a must-have on your shelf. -- Sanjog Misra, UCLA Anderson School of Management The Rcpp package represents a major leap forward for scientific computations with R. With very few lines of C++ code, one has R's data structures readily at hand for further computations in C++. Hence, high-level numerical programming can be made in C++ almost as easily as in R, but often with a substantial speed gain. Dirk is a crucial person in these developments, and his book takes the reader from the first fragile steps on to using the full Rcpp machinery. A very recommended book! -- Søren Højsgaard, Department of Mathematical Sciences, Aalborg University, Denmark "Seamless R and C++ Integration with Rcpp" provides the first comprehensive introduction to Rcpp. Rcpp has become the most widely used language extension for R, and is deployed by over one hundred different CRAN and BioConductor packages. Rcpp permits users to pass scalars, vectors, matrices, lists, or entire R objects back and forth between R and C++ with ease. This brings the depth of the R analysis framework together with the power, speed, and efficiency of C++. Dirk Eddelbuettel has been a contributor to CRAN for over a decade and maintains around twenty packages. He is the Debian/Ubuntu maintainer for R and other quantitative software, edits the CRAN Task Views for Finance and High-Performance Computing, is a co-founder of the annual R/Finance conference, and an editor of the Journal of Statistical Software. He holds a Ph.D. in Mathematical Economics from EHESS (Paris), and works in Chicago as a Senior Quantitative Analyst.
Since its original publication, Value at Risk has become the industry standard in risk management. Now in its Third Edition, this international bestseller addresses the fundamental changes in the field that have occurred across the globe in recent years. Philippe Jorion provides the most current information needed to understand and implement VAR, as well as manage newer dimensions of financial risk. Featured updates include: an increased emphasis on operational risk; using VAR for integrated risk management and to measure economic capital; applications of VAR to risk budgeting in investment management; discussion of new risk-management techniques, including extreme value theory, principal components, and copulas; and extensive coverage of the recently finalized Basel II capital adequacy rules for commercial banks, integrated throughout the book. A major new feature of the Third Edition is the addition of short questions and exercises at the end of each chapter, making it even easier to check progress. Detailed answers are posted on the companion web site www.pjorion.com/var/. The web site contains other materials, including additional questions that course instructors can assign to their students. Jorion leaves no stone unturned, addressing the building blocks of VAR from computing and backtesting models to forecasting risk and correlations. He outlines the use of VAR to measure and control risk for trading, for investment management, and for enterprise-wide risk management. He also points out key pitfalls to watch out for in risk-management systems. The value-at-risk approach continues to improve worldwide standards for managing numerous types of risk. Now more than ever, professionals can depend on Value at Risk for comprehensive, authoritative counsel on VAR, its application, and its results, and to keep ahead of the curve.
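The value-at-risk concept the blurb describes can be sketched in a few lines. This is a minimal historical-simulation estimator (one common convention among several; real implementations differ in how they interpolate the empirical quantile), applied to hypothetical returns, not data from the book:

```python
# Historical-simulation VaR: the loss level that past returns exceeded with
# empirical frequency at most alpha. Returned as a positive loss number.
def historical_var(returns, alpha=0.05):
    """(1 - alpha) VaR from historical returns via the empirical quantile."""
    losses = sorted(-r for r in returns)        # losses in ascending order
    k = int((1 - alpha) * len(losses))          # empirical quantile index
    return losses[min(k, len(losses) - 1)]

# Hypothetical daily returns; 80% VaR picks the loss exceeded ~20% of the time
data = [-0.05, -0.02, 0.01, 0.00, 0.03, -0.01, 0.02, -0.03, 0.04, -0.04]
var_80 = historical_var(data, alpha=0.2)
```

Historical simulation is only one of the building blocks Jorion covers; parametric and Monte Carlo VAR replace the empirical quantile with a model-based one.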
Research in the statistical analysis of extreme values has flourished over the past decade: new probability models, inference and data analysis techniques have been introduced, and new application areas have been explored. Statistics of Extremes comprehensively covers a wide range of models and application areas, including risk and insurance: a major area of interest and relevance to extreme value theory. Case studies are introduced providing a good balance of theory and application of each model discussed, incorporating many illustrated examples and plots of data. The last part of the book covers some interesting advanced topics, including time series, regression, multivariate and Bayesian modelling of extremes, the use of which has huge potential.
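A standard first step in the extreme value analysis this book covers is the block-maxima method: split a series into fixed-length blocks (for example, years of daily data) and keep each block's maximum, which is the input to a generalized extreme value (GEV) fit. A minimal sketch with hypothetical data:

```python
# Block-maxima preprocessing for extreme value analysis: the maxima of
# non-overlapping blocks are, under classical EVT assumptions, approximately
# GEV-distributed. Data below are hypothetical illustration values.
def block_maxima(series, block_size):
    """Maxima of consecutive non-overlapping blocks (a short tail block is dropped)."""
    n_blocks = len(series) // block_size
    return [max(series[i * block_size:(i + 1) * block_size])
            for i in range(n_blocks)]

daily = [1.2, 3.5, 0.7, 2.2, 5.1, 0.3, 4.4, 1.9, 2.8, 0.6, 3.1, 2.0]
maxima = block_maxima(daily, 4)   # three blocks of four observations each
```

Fitting a GEV distribution to the resulting maxima (e.g. by maximum likelihood) then yields the tail estimates used in the risk and insurance applications the book discusses.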