In Volatility and Correlation, 2nd Edition: The Perfect Hedger and the Fox, Rebonato looks at derivatives pricing from the angle of volatility and correlation. With both practical and theoretical applications, this thorough update of the highly successful Volatility & Correlation, with over 80% new or fully reworked material, is a must-have for practitioners and students alike. The new and updated material includes a critical examination of the ‘perfect-replication’ approach to derivatives pricing, with special attention given to exotic options; a thorough analysis of the role of quadratic variation in derivatives pricing and hedging; a discussion of the informational efficiency of markets in commonly used calibration and hedging practices; and treatment of new models, including Variance Gamma, displaced diffusion, and stochastic volatility for interest-rate smiles and equity/FX options.

The book is split into four parts. Part I deals with a Black world without smiles, sets out the author’s ‘philosophical’ approach, and covers deterministic volatility. Part II looks at smiles in equity and FX worlds. It begins with a review of relevant empirical information about smiles, and provides coverage of local-stochastic-volatility, general-stochastic-volatility, jump-diffusion and Variance-Gamma processes. Part II concludes with an important chapter that discusses whether, and to what extent, one can dispense with an explicit specification of a model and directly prescribe the dynamics of the smile surface. Part III focuses on interest rates when the volatility is deterministic. Part IV extends this setting to account for smiles in a financially motivated and computationally tractable manner. In this final part the author deals with CEV processes, with diffusive stochastic volatility, and with Markov-chain processes.

Praise for the First Edition:
“In this book, Dr Rebonato brings his penetrating eye to bear on option pricing and hedging.... The book is a must-read for those who already know the basics of options and are looking for an edge in applying the more sophisticated approaches that have recently been developed.” —Professor Ian Cooper, London Business School
“Volatility and correlation are at the very core of all option pricing and hedging. In this book, Riccardo Rebonato presents the subject in his characteristically elegant and simple fashion... A rare combination of intellectual insight and practical common sense.” —Anthony Neuberger, London Business School
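As a taste of why quadratic variation matters for hedging (a standard result, sketched here for illustration and not quoted from the book): if an option is delta-hedged at an implied volatility \(\sigma_{\mathrm{imp}}\) while the underlying actually moves with instantaneous volatility \(\sigma_t\), the accumulated hedging P&L is, ignoring discounting,

\[
\mathrm{P\&L}_{0,T} \;=\; \int_0^T \tfrac{1}{2}\,\Gamma_t\,S_t^2\left(\sigma_t^2 - \sigma_{\mathrm{imp}}^2\right)dt,
\]

where \(\Gamma_t\) is the Black-Scholes gamma computed at \(\sigma_{\mathrm{imp}}\). The hedger's fate is governed by the gap between realized and implied quadratic variation, not by the drift of the underlying.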
Section headings in this handbook include: 'Forecasting Methodology'; 'Forecasting Models'; 'Forecasting with Different Data Structures'; and 'Applications of Forecasting Methods'.
A poorly performing database application not only costs users time, but also has an impact on other applications running on the same computer or the same network. SQL Tuning provides an essential next step for SQL developers and database administrators who want to extend their SQL tuning expertise and get the most from their database applications. There are two basic issues to focus on when tuning SQL: how to find and interpret the execution plan of an SQL statement, and how to change SQL to get a specific alternate execution plan. SQL Tuning provides answers to these questions and addresses a third issue that's even more important: how to find the optimal execution plan for the query to use. Author Dan Tow outlines a timesaving method he's developed for finding the optimum execution plan--rapidly and systematically--regardless of the complexity of the SQL or the database platform being used. You'll learn how to understand and control SQL execution plans and how to diagram SQL queries to deduce the best execution plan for a query. Key chapters in the book include exercises to reinforce the concepts you've learned. SQL Tuning concludes by addressing special concerns and unique solutions to "unsolvable problems." Whether you are a programmer who develops SQL-based applications or a database administrator who troubleshoots poorly tuned applications, SQL Tuning will arm you with a reliable and deterministic method for tuning your SQL queries to gain optimal performance.
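As a minimal, self-contained illustration of the first issue, finding and reading an execution plan (the book works with the major commercial databases; SQLite is used here only because it is small and easy to run):

```c
/* Sketch: asking the optimizer for its plan via SQLite's
 * EXPLAIN QUERY PLAN. Build with: cc plan.c -lsqlite3 */
#include <stdio.h>
#include <sqlite3.h>

int main(void) {
    sqlite3 *db;
    sqlite3_stmt *stmt;

    sqlite3_open(":memory:", &db);
    sqlite3_exec(db,
        "CREATE TABLE orders (id INTEGER PRIMARY KEY,"
        " customer_id INTEGER, total REAL);"
        "CREATE INDEX idx_cust ON orders (customer_id);",
        NULL, NULL, NULL);

    /* Prefixing a query with EXPLAIN QUERY PLAN returns the
     * plan instead of the result rows. */
    sqlite3_prepare_v2(db,
        "EXPLAIN QUERY PLAN "
        "SELECT * FROM orders WHERE customer_id = 42;",
        -1, &stmt, NULL);

    /* Each row describes one plan step; column 3 holds the
     * human-readable detail string. */
    while (sqlite3_step(stmt) == SQLITE_ROW)
        printf("%s\n", (const char *)sqlite3_column_text(stmt, 3));

    sqlite3_finalize(stmt);
    sqlite3_close(db);
    return 0;
}
```

Here the plan names the index chosen for the lookup, which is exactly the kind of fact a tuning diagram is built from.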
Included here are expressions in the functional domain of such classics as linear regression, principal components analysis, linear modelling, and canonical correlation analysis, as well as specifically functional techniques such as curve registration and principal differential analysis. Data arising in real applications are used throughout for both motivation and illustration, showing how functional approaches allow us to see new things, especially by exploiting the smoothness of the processes generating the data. The data sets exemplify the wide scope of functional data analysis; they are drawn from growth analysis, meteorology, biomechanics, equine science, economics, and medicine. The book presents novel statistical technology while keeping the mathematical level widely accessible. It is designed to appeal to students, applied data analysts, and experienced researchers, and as such is of value both within statistics and across a broad spectrum of other fields. Much of the material appears here for the first time.
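To give a flavor of how a familiar multivariate tool is recast functionally (standard functional-PCA material, not a formula specific to this book): the eigen-decomposition of a covariance matrix becomes an eigenequation for the covariance function \(v(s,t)\) of the observed curves,

\[
\int v(s,t)\,\xi_j(t)\,dt \;=\; \rho_j\,\xi_j(s),
\]

whose eigenfunctions \(\xi_j\) play the role of principal component loading vectors and whose eigenvalues \(\rho_j\) measure the variance each functional component explains.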
Understanding the dynamic evolution of the yield curve is critical to many financial tasks, including pricing financial assets and their derivatives, managing financial risk, allocating portfolios, structuring fiscal debt, conducting monetary policy, and valuing capital goods. Unfortunately, most yield curve models tend to be theoretically rigorous but empirically disappointing, or empirically successful but theoretically lacking. In this book, Francis Diebold and Glenn Rudebusch propose two extensions of the classic yield curve model of Nelson and Siegel that are both theoretically rigorous and empirically successful. The first extension is the dynamic Nelson-Siegel model (DNS), while the second takes this dynamic version and makes it arbitrage-free (AFNS). Diebold and Rudebusch show how these two models are just slightly different implementations of a single unified approach to dynamic yield curve modeling and forecasting. They emphasize both descriptive and efficient-markets aspects, they pay special attention to the links between the yield curve and macroeconomic fundamentals, and they show why DNS and AFNS are likely to remain of lasting appeal even as alternative arbitrage-free models are developed. Based on the Econometric and Tinbergen Institutes Lectures, Yield Curve Modeling and Forecasting contains essential tools with enhanced utility for academics, central banks, governments, and industry.
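Concretely, the dynamic Nelson-Siegel model writes the time-\(t\) yield at maturity \(\tau\) as a three-factor curve,

\[
y_t(\tau) \;=\; L_t \;+\; S_t\left(\frac{1-e^{-\lambda\tau}}{\lambda\tau}\right) \;+\; C_t\left(\frac{1-e^{-\lambda\tau}}{\lambda\tau}-e^{-\lambda\tau}\right),
\]

where the dynamically evolving factors \(L_t\), \(S_t\), and \(C_t\) are interpreted as level, slope, and curvature, and \(\lambda\) controls where the curvature loading peaks. The AFNS variant keeps this factor structure and adds a maturity-dependent yield adjustment that makes the model arbitrage-free.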
The Linux Kernel Module Programming Guide is for people who want to write kernel modules. It takes a hands-on approach, starting with writing a small "hello, world" program, and quickly moves on from there. Far from a boring text on programming, The Linux Kernel Module Programming Guide has a lively style that entertains while it educates. An excellent guide for anyone wishing to get started on kernel module programming. *** Money raised from the sale of this book supports the development of free software and documentation.
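In the spirit of that opening example (a sketch; the exact code in the book may differ), a minimal module looks like this. It is compiled against the kernel headers with a small obj-m Makefile, loaded with insmod, and removed with rmmod:

```c
#include <linux/init.h>
#include <linux/module.h>
#include <linux/kernel.h>

static int __init hello_init(void)
{
    pr_info("hello, world\n");  /* appears in the kernel log (dmesg) */
    return 0;                   /* a nonzero return would abort loading */
}

static void __exit hello_exit(void)
{
    pr_info("goodbye, world\n");
}

module_init(hello_init);
module_exit(hello_exit);
MODULE_LICENSE("GPL");
```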
This is the eBook version of the printed book. If the print book includes a CD-ROM, this content is not included within the eBook version. Advanced Linux Programming is divided into two parts. The first covers generic UNIX system services, but with a particular eye towards Linux-specific information. This portion of the book will be of use even to advanced programmers who have worked with other UNIX systems, since it will cover Linux-specific details and differences. For programmers without UNIX experience, it will be even more valuable. The second section covers material that is entirely Linux-specific. These are truly advanced topics, and are the techniques that the gurus use to build great applications. While this book will focus mostly on the Application Programming Interface (API) provided by the Linux kernel and the C library, a preliminary introduction to the development tools available will allow all who purchase the book to make immediate use of Linux.
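A small example of the kind of generic UNIX service the first part covers, process creation with fork() and program execution with exec (an illustrative sketch, not code from the book):

```c
#include <stdio.h>
#include <stdlib.h>
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void)
{
    pid_t child = fork();
    if (child == 0) {
        /* Child: replace this process image with /bin/ls. */
        execl("/bin/ls", "ls", "-l", (char *)NULL);
        perror("execl");        /* reached only if exec fails */
        exit(EXIT_FAILURE);
    }
    /* Parent: wait for the child and report its exit status. */
    int status;
    waitpid(child, &status, 0);
    if (WIFEXITED(status))
        printf("child exited with status %d\n", WEXITSTATUS(status));
    return 0;
}
```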
The global fixed-income market is an enormous financial market whose value by far exceeds that of the public stock markets. The interbank market consists of interest rate derivatives, whose primary purpose is to manage interest rate risk. The credit market primarily consists of the bond market, which links investors to companies, institutions, and governments with borrowing needs. This dissertation takes an optimization perspective on modeling both of these areas of the fixed-income market.

Legislators on the national markets require financial actors to value their financial assets in accordance with market prices. Thus, prices of many assets, which are not publicly traded, must be determined mathematically. The financial quantities needed for pricing are not directly observable but must be measured by solving inverse optimization problems. These measurements are based on the available market prices, which are observed with varying degrees of measurement noise. For the interbank market, the relevant financial quantities consist of term structures of interest rates, which are curves displaying the market rates for different maturities. For the bond market, credit risk is an additional factor that can be modeled through default intensity curves and term structures of recovery rates in case of default. By formulating suitable optimization models, the different underlying financial quantities can be measured in accordance with observable market prices, while conditions for economic realism are imposed.

Measuring and managing risk is closely connected to the measurement of the underlying financial quantities. Through a data-driven method, we show that six systematic risk factors can be used to explain almost all variance in the interest rate curves. By modeling the dynamics of these six risk factors, possible outcomes can be simulated in the form of term structure scenarios. For short-term simulation horizons, this results in a representation of the portfolio value distribution that is consistent with the realized outcomes from historically observed term structures. This enables more accurate measurements of interest rate risk, where our proposed method exhibits both lower risk and lower pricing errors compared to traditional models.

We propose a method for decomposing changes in portfolio value for an arbitrary portfolio into the risk factors that affect the value of each instrument. By demonstrating the method for the six systematic risk factors identified for the interbank market, we show that almost all changes in portfolio value and portfolio variance can be attributed to these risk factors. Additional risk factors and approximation errors are gathered into two terms, which can be studied to ensure the quality of the performance attribution, and possibly to improve it.

To eliminate undesired risk within their trading books, banks use hedging. Traditional methods do not take transaction costs into account. We therefore propose a method for managing the risks in the interbank market through a stochastic optimization model that considers transaction costs. This method is based on a scenario approximation of the optimization problem in which the six systematic risk factors are simulated and the portfolio variance is weighted against the transaction costs. The resulting method is preferred over the traditional methods by all risk-averse investors.
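A schematic of the inverse measurement problem described above (an illustrative formulation only; the dissertation's actual models are richer): choose a curve \(f\), say the forward-rate curve, so that model prices \(P_i(f)\) match observed prices \(p_i\) up to measurement noise, while a roughness penalty imposes economic realism:

\[
\min_{f}\;\sum_i w_i\,\bigl(P_i(f)-p_i\bigr)^2 \;+\; \lambda\int \bigl(f''(t)\bigr)^2\,dt,
\]

where the weights \(w_i\) reflect the noise in each quote and \(\lambda\) trades pricing fit against smoothness of the curve.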
For the credit market, we use data from the bond market in combination with the interbank market to make accurate measurements of the financial quantities. We address the notoriously difficult problem of separating default risk from recovery risk. In addition to the six systematic risk factors previously identified for risk-free interest rates, we identify four risk factors that explain almost all variance in default intensities, while a single risk factor seems sufficient to model the recovery risk. Overall, this is a higher number of risk factors than is usually found in the literature. Through a simple model, we can measure the variance in bond prices in terms of these systematic risk factors, and through performance attribution, we relate these values to the empirically realized variances from the quoted bond prices.
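The identification problem of separating default risk from recovery risk can be seen in the standard reduced-form bond-pricing equation (textbook material, not the dissertation's specific model). For a defaultable zero-coupon bond paying recovery \(R\) at default, with short rate \(r_t\) and default intensity \(\lambda_t\),

\[
P(0,T) \;=\; \mathbb{E}\!\left[e^{-\int_0^T (r_t+\lambda_t)\,dt}\right] \;+\; \mathbb{E}\!\left[\int_0^T R\,\lambda_t\,e^{-\int_0^t (r_s+\lambda_s)\,ds}\,dt\right],
\]

so \(\lambda_t\) and \(R\) enter observed prices only jointly, which is why additional structure or data is needed to measure them separately.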
Memory forensics provides cutting-edge technology to help investigate digital attacks. Memory forensics is the art of analyzing computer memory (RAM) to solve digital crimes. As a follow-up to the best seller Malware Analyst's Cookbook, experts in the fields of malware, security, and digital forensics bring you a step-by-step guide to memory forensics, now the most sought-after skill in the digital forensics and incident response fields. Beginning with introductory concepts and moving toward the advanced, The Art of Memory Forensics: Detecting Malware and Threats in Windows, Linux, and Mac Memory is based on a five-day training course that the authors have presented to hundreds of students. It is the only book on the market that focuses exclusively on memory forensics and how to deploy such techniques properly.

Discover memory forensics techniques:
- How volatile memory analysis improves digital investigations
- Proper investigative steps for detecting stealth malware and advanced threats
- How to use free, open source tools for conducting thorough memory forensics
- Ways to acquire memory from suspect systems in a forensically sound manner

The next era of malware and security breaches is more sophisticated and targeted, and the volatile memory of a computer is often overlooked or destroyed as part of the incident response process. The Art of Memory Forensics explains the latest technological innovations in digital forensics to help bridge this gap. It covers the most popular and recently released versions of Windows, Linux, and Mac, including both the 32- and 64-bit editions.
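To make the raw material concrete (a deliberately simplified sketch, not a technique from the book): a memory image is just bytes, and even a naive scanner can find artifacts in it, here the "MZ" magic that begins Windows executable images. Real tools such as the open-source Volatility framework reconstruct operating-system structures rather than pattern-matching, but they start from the same kind of capture:

```c
/* Usage: ./scan memory.dmp */
#include <stdio.h>

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <dump>\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    /* Walk the image byte by byte, remembering the previous
     * byte so two-byte signatures can be matched. */
    int prev = EOF, c;
    long offset = 0;
    while ((c = fgetc(f)) != EOF) {
        if (prev == 'M' && c == 'Z')
            printf("possible PE header at offset 0x%lx\n", offset - 1);
        prev = c;
        offset++;
    }
    fclose(f);
    return 0;
}
```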
Device drivers literally drive everything you're interested in--disks, monitors, keyboards, modems--everything outside the computer chip and memory. And writing device drivers is one of the few areas of programming for the Linux operating system that calls for unique, Linux-specific knowledge. For years now, programmers have relied on the classic Linux Device Drivers from O'Reilly to master this critical subject. Now in its third edition, this bestselling guide provides all the information you'll need to write drivers for a wide range of devices.

Over the years the book has helped countless programmers learn:
- how to support computer peripherals under the Linux operating system
- how to develop and write software for new hardware under Linux
- the basics of Linux operation, even if they are not expecting to write a driver

The new edition of Linux Device Drivers is better than ever. The book covers all the significant changes to Version 2.6 of the Linux kernel, which simplifies many activities, and contains subtle new features that can make a driver both more efficient and more flexible. Readers will find new chapters on important types of drivers not covered previously, such as consoles, USB drivers, and more.

Best of all, you don't have to be a kernel hacker to understand and enjoy this book. All you need is an understanding of the C programming language and some background in Unix system calls. And for maximum ease of use, the book uses full-featured examples that you can compile and run without special hardware.

Today Linux holds fast as the most rapidly growing segment of the computer market and continues to win over enthusiastic adherents in many application areas. With this increasing support, Linux is now absolutely mainstream, and viewed as a solid platform for embedded systems. If you're writing device drivers, you'll want this book. In fact, you'll wonder how drivers are ever written without it.
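To show the shape of what the book teaches (a bare-bones sketch for a 2.6-era kernel; the "skel" names are placeholders, not code from the book), a minimal read-only character driver registers a file_operations table under a kernel-assigned major number:

```c
#include <linux/init.h>
#include <linux/module.h>
#include <linux/fs.h>

static const char msg[] = "hello from the kernel\n";

/* Called when a process read()s the device node. */
static ssize_t skel_read(struct file *filp, char __user *buf,
                         size_t count, loff_t *ppos)
{
    /* Copy from the kernel buffer to user space, honoring the
     * file position so repeated reads eventually return 0 (EOF). */
    return simple_read_from_buffer(buf, count, ppos, msg, sizeof(msg) - 1);
}

static const struct file_operations skel_fops = {
    .owner = THIS_MODULE,
    .read  = skel_read,
};

static int major;

static int __init skel_init(void)
{
    /* Passing 0 asks the kernel to pick a free major number. */
    major = register_chrdev(0, "skel", &skel_fops);
    return major < 0 ? major : 0;
}

static void __exit skel_exit(void)
{
    unregister_chrdev(major, "skel");
}

module_init(skel_init);
module_exit(skel_exit);
MODULE_LICENSE("GPL");
```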