Limitations in today's software packages for financial modeling system development can threaten the viability of any system, not to mention the firm using that system. Modeling Financial Markets is the first book to take financial professionals beyond those limitations and introduce safer, more sophisticated modeling methods. It contains dozens of coded financial modeling techniques that minimize or avoid current software deficiencies, and it addresses the crucial crossover stage in which prototypes are converted to fully coded models.
The Oxford Handbook of Computational Economics and Finance surveys both the foundations of and recent advances in computational economics and finance. It is historically and interdisciplinarily rich and closely tied to the rise of digital society. It begins with the conventional view of computational economics, including recent algorithmic developments in computing rational expectations, volatility, and general equilibrium. It then moves from traditional computing in economics and finance to recent developments in natural computing, including applications of nature-inspired intelligence, genetic programming, swarm intelligence, and fuzzy logic. Recent developments in network-based and agent-based computing in economics are also examined, and chapters on subjects such as trading robots and automated markets show how these approaches are applied. The last part deals with the epistemology of simulation in its trinity form, integrating simulation, computation, and dynamics. Distinctive features are the focus on natural computationalism and the examination of the implications of intelligent machines for the future of computational economics and finance: not merely individual robots but whole integrated systems are migrating into the world of Homo sapiens, a form of symbiogenesis.
The idea of writing this book arose in 2000 when the first author was assigned to teach the required course STATS 240 (Statistical Methods in Finance) in the new M.S. program in financial mathematics at Stanford, which is an interdisciplinary program that aims to provide a master’s-level education in applied mathematics, statistics, computing, finance, and economics. Students in the program had different backgrounds in statistics. Some had only taken a basic course in statistical inference, while others had taken a broad spectrum of M.S.- and Ph.D.-level statistics courses. On the other hand, all of them had already taken required core courses in investment theory and derivative pricing, and STATS 240 was supposed to link the theory and pricing formulas to real-world data and pricing or investment strategies. Besides students in the program, the course also attracted many students from other departments in the university, further increasing the heterogeneity of students, as many of them had a strong background in mathematical and statistical modeling from the mathematical, physical, and engineering sciences but no previous experience in finance. To address the diversity in background but common strong interest in the subject and in a potential career as a “quant” in the financial industry, the course material was carefully chosen not only to present basic statistical methods of importance to quantitative finance but also to summarize domain knowledge in finance and show how it can be combined with statistical modeling in financial analysis and decision making. The course material evolved over the years, especially after the second author helped as the head TA during the years 2004 and 2005.
An inside look at modern approaches to modeling equity portfolios. Financial Modeling of the Equity Market is the most comprehensive, up-to-date guide to modeling equity portfolios. The book is intended for a wide range of quantitative analysts, practitioners, and students of finance. Without sacrificing mathematical rigor, it presents arguments in a concise and clear style with a wealth of real-world examples and practical simulations. This book presents all the major approaches to single-period return analysis, including modeling, estimation, and optimization issues. It covers both static and dynamic factor analysis, regime shifts, long-run modeling, and cointegration. Estimation issues, including dimensionality reduction, Bayesian estimates, the Black-Litterman model, and random coefficient models, are also covered in depth. Important advances in transaction cost measurement and modeling, robust optimization, and recent developments in optimization with higher moments are also discussed. Sergio M. Focardi (Paris, France) is a founding partner of the Paris-based consulting firm, The Intertek Group. He is a member of the editorial board of the Journal of Portfolio Management. He is also the author of numerous articles and books on financial modeling. Petter N. Kolm, PhD (New Haven, CT and New York, NY), is a graduate student in finance at the Yale School of Management and a financial consultant in New York City. Previously, he worked in the Quantitative Strategies Group of Goldman Sachs Asset Management, where he developed quantitative investment models and strategies.
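Among the estimation topics mentioned above, the Black-Litterman model lends itself to a compact illustration. The sketch below is not drawn from the book; it implements the standard posterior expected-return formula in numpy, and the assets, views, and the tau scaling are illustrative placeholders.

```python
# Minimal sketch of the Black-Litterman posterior expected returns,
# illustrating the kind of estimation step discussed in the book.
# All inputs below (covariance, views, tau) are illustrative placeholders.
import numpy as np

def black_litterman_posterior(pi, Sigma, P, q, Omega, tau=0.05):
    """Combine equilibrium returns pi with investor views q (P @ mu ~ q)."""
    inv_tau_Sigma = np.linalg.inv(tau * Sigma)
    inv_Omega = np.linalg.inv(Omega)
    # Posterior covariance and mean of expected returns
    post_cov = np.linalg.inv(inv_tau_Sigma + P.T @ inv_Omega @ P)
    post_mean = post_cov @ (inv_tau_Sigma @ pi + P.T @ inv_Omega @ q)
    return post_mean, post_cov

# Toy example: three assets, one relative view ("asset 0 outperforms asset 1 by 2%").
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.03, 0.01],
                  [0.00, 0.01, 0.02]])
pi = np.array([0.05, 0.04, 0.03])   # equilibrium (market-implied) returns
P = np.array([[1.0, -1.0, 0.0]])    # view-picking matrix
q = np.array([0.02])                # view value
Omega = np.array([[0.001]])         # view uncertainty
mu_bl, cov_bl = black_litterman_posterior(pi, Sigma, P, q, Omega)
print(mu_bl)
```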
The current financial crisis has revealed serious flaws in models, measures, and, potentially, theories that failed to provide forward-looking expectations for upcoming losses originating from market risks. The Proceedings of the Perm Winter School 2011 offer insights on many key issues and advances in financial markets modeling and risk measurement, aiming to bridge this gap. The key topics addressed include: hierarchical and ultrametric models of financial crashes, dynamic hedging, arbitrage-free modeling of the term structure of interest rates, agent-based modeling of order flow, asset pricing in a fractional market, hedge fund performance, and many more.
The past twenty years have seen an extraordinary growth in the use of quantitative methods in financial markets. Finance professionals now routinely use sophisticated statistical techniques in portfolio management, proprietary trading, risk management, financial consulting, and securities regulation. This graduate-level textbook is intended for PhD students, advanced MBA students, and industry professionals interested in the econometrics of financial modeling. The book covers the entire spectrum of empirical finance, including: the predictability of asset returns, tests of the Random Walk Hypothesis, the microstructure of securities markets, event analysis, the Capital Asset Pricing Model and the Arbitrage Pricing Theory, the term structure of interest rates, dynamic models of economic equilibrium, and nonlinear financial models such as ARCH, neural networks, statistical fractals, and chaos theory. Each chapter develops statistical techniques within the context of a particular financial application. This exciting new text contains a unique and accessible combination of theory and practice, bringing state-of-the-art statistical techniques to the forefront of financial applications. Each chapter also includes a discussion of recent empirical evidence, for example, the rejection of the Random Walk Hypothesis, as well as problems designed to help readers incorporate what they have read into their own applications.
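As a small illustration of the kind of diagnostic used in tests of the Random Walk Hypothesis, the sketch below computes a basic variance-ratio statistic in numpy. It is not taken from the text and omits the bias corrections and heteroskedasticity-robust standard errors that a formal test would use.

```python
# Simplified variance-ratio statistic, one of the Random Walk Hypothesis
# diagnostics discussed in the empirical-finance literature the book covers.
# This sketch only illustrates the basic quantity VR(q).
import numpy as np

def variance_ratio(log_prices, q):
    """VR(q) = Var(q-period returns) / (q * Var(1-period returns)).
    Values near 1 are consistent with a random walk."""
    r1 = np.diff(log_prices)                 # one-period log returns
    rq = log_prices[q:] - log_prices[:-q]    # overlapping q-period log returns
    return np.var(rq, ddof=1) / (q * np.var(r1, ddof=1))

# Toy check on a simulated random walk: VR(5) should be close to 1.
rng = np.random.default_rng(0)
p = np.cumsum(rng.normal(0.0, 0.01, size=5000))
print(variance_ratio(p, 5))
```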
This book reconciles the existence of technical trading with the Efficient Market Hypothesis. By analyzing a well-known agent-based model, the Santa Fe Institute Artificial Stock Market (SFI-ASM), it finds that when selective forces are weak, financial evolution cannot guarantee that only the fittest trading rules will survive. Its main contribution lies in the application of standard results from population genetics, which have been widely neglected in the agent-based modeling community.
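The population-genetics point can be illustrated with a toy simulation that is not the SFI-ASM itself: a Wright-Fisher-style resampling of trading-rule shares in which a parameter beta controls selection strength. With weak selection, drift dominates and inferior rules can persist; the fitness values below are illustrative.

```python
# Toy illustration (not the SFI-ASM) of weak versus strong selection:
# under weak selection, drift dominates and inferior trading rules can
# persist or even fix in the population.
import numpy as np

def simulate_rule_shares(fitness, beta, generations=200, pop_size=500, seed=0):
    """Wright-Fisher-style resampling with selection strength beta."""
    rng = np.random.default_rng(seed)
    n_rules = len(fitness)
    counts = np.full(n_rules, pop_size // n_rules)
    for _ in range(generations):
        weights = counts * np.exp(beta * np.asarray(fitness))
        probs = weights / weights.sum()
        counts = rng.multinomial(pop_size, probs)
    return counts / pop_size

fitness = [1.0, 0.9, 0.8]   # rule 0 is objectively best
print("weak selection  :", simulate_rule_shares(fitness, beta=0.1))
print("strong selection:", simulate_rule_shares(fitness, beta=10.0))
```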
Financial Modelling in Commodity Markets provides a basic and self-contained introduction to the ideas underpinning financial modelling of products in commodity markets. The book offers a concise and operational view of the main models used to represent, assess and simulate real assets and financial positions related to the commodity markets. It discusses statistical and mathematical tools important for estimating, implementing and calibrating quantitative models used for pricing and trading commodity-linked products and for managing basic and complex portfolio risks. Key features: it provides a step-by-step guide to the construction of pricing models and to the application of such models to the analysis of real data; it is written for scholars from a wide range of scientific fields, including economics and finance, mathematics, engineering and statistics, as well as for practitioners; and it illustrates several important pricing models using real data sets commonly used in financial markets.
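As a hedged illustration of the kind of pricing model such a text covers, the sketch below simulates a mean-reverting (Schwartz one-factor style) commodity spot-price process in numpy. The parameter values are illustrative and not calibrated to any data set, and the function name is our own.

```python
# Minimal sketch of a mean-reverting commodity spot-price model:
# d(ln S) = kappa * (theta - ln S) dt + sigma dW  (Schwartz one-factor style).
# Parameter values are illustrative, not calibrated to any data set.
import numpy as np

def simulate_mean_reverting_spot(s0, kappa, theta, sigma, T=1.0, steps=252, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / steps
    x = np.empty(steps + 1)
    x[0] = np.log(s0)
    for t in range(steps):
        dw = rng.normal(0.0, np.sqrt(dt))
        x[t + 1] = x[t] + kappa * (theta - x[t]) * dt + sigma * dw
    return np.exp(x)   # spot-price path

path = simulate_mean_reverting_spot(s0=60.0, kappa=2.0, theta=np.log(55.0), sigma=0.3)
print(path[-1])
```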
Tools and methods from complex systems science can have a considerable impact on the way the quantitative assessment of economic and financial issues is approached, as discussed in this thesis. First, it is shown that the self-organization of financial markets is a crucial factor in understanding their dynamics: using an agent-based approach, it is argued that the stylized facts of financial markets appear only in the self-organized state. Secondly, the thesis points out the potential of so-called big data science for financial market modeling, investigating how web-driven data can yield a picture of market activity: web query volumes are found to anticipate trade volumes. As a third achievement, the metrics developed here for country competitiveness and product complexity are groundbreaking in comparison with mainstream theories of economic growth and technological development. A key element in assessing the intangible variables determining the success of countries in the present globalized economy is the diversification of countries' productive baskets. Comparing the complexity of a country's productive system with economic indicators such as GDP per capita discloses its hidden growth potential.
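A lead-lag relationship of the kind reported between web query volumes and trade volumes can be probed with a simple cross-correlation at different lags. The sketch below uses purely synthetic series (the trade-volume series is built to follow the query series by two days), so it only illustrates the method, not the thesis's empirical result.

```python
# Toy lead-lag check of the kind used to ask whether web query volumes
# anticipate trade volumes. Both series are synthetic: "volume" is
# constructed to follow "queries" with a 2-step lag plus noise.
import numpy as np

def lagged_correlation(x, y, lag):
    """Correlation between x[t] and y[t + lag] (positive lag: x leads y)."""
    if lag > 0:
        return np.corrcoef(x[:-lag], y[lag:])[0, 1]
    return np.corrcoef(x, y)[0, 1]

rng = np.random.default_rng(1)
queries = rng.normal(size=500)
volume = np.roll(queries, 2) + 0.5 * rng.normal(size=500)   # lags queries by 2
volume[:2] = rng.normal(size=2)                             # remove wrap-around

for lag in range(0, 5):
    print(f"lag {lag}: corr = {lagged_correlation(queries, volume, lag):.2f}")
```

The correlation peaks at a lag of two steps, which is how a genuine anticipatory signal would show up in real query and volume data.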