Marc Paolella, Sven Christian Steude, Risk Prediction: A DWARF-like Approach, The Journal of Risk Model Validation, Vol. 2 (1), 2008. (Journal Article)
A large proportion of the most viable time series models used in empirical finance for density and value-at-risk forecasting are estimated with maximum likelihood methods. By way of its definition, the likelihood implicitly places equal weight on each of the observations in the sample, but this need not be optimal, depending on the extent to which the model and the true data generating process deviate. For example, in the context of modeling financial asset returns, schemes that place relatively more weight on observations in the recent past result in considerable improvement of out-of-sample density forecasts, compared with the default of equal weights. If, instead of accurate forecasting of the entire density, interest is restricted to just downside risk and risk model validation, then it would seem wise to (also) place more weight on the negative observations in the sample. In this paper, such weighted likelihood schemes are proposed and demonstrated to yield considerable improvements in forecast accuracy using a variety of data sets and different GARCH models. Further improvement is realized by combining the two weighting schemes, giving rise to a doubly weighted asymmetric risk forecasting method or, in short, a DWARF-like method.
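The double weighting idea can be sketched as follows. This is a minimal illustration under a Gaussian likelihood; `rho`, `downside_boost`, and the normalization are assumptions for exposition, not the paper's specification:

```python
import numpy as np

def weighted_loglik(returns, sigma2, rho=0.99, downside_boost=2.0):
    """Doubly weighted Gaussian log-likelihood (toy sketch).

    w_t combines an exponential time-decay weight (recent observations
    count more) with an extra factor on negative returns; rho and
    downside_boost are illustrative choices, not values from the paper.
    """
    T = len(returns)
    time_w = rho ** np.arange(T - 1, -1, -1)   # newest observation gets weight 1
    side_w = np.where(returns < 0, downside_boost, 1.0)
    w = time_w * side_w
    w = w * (T / w.sum())                      # renormalize so weights sum to T
    ll_t = -0.5 * (np.log(2 * np.pi * sigma2) + returns ** 2 / sigma2)
    return float(np.sum(w * ll_t))
```

Setting `rho=1.0` and `downside_boost=1.0` recovers the standard equally weighted log-likelihood.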
Simon Broda, A S.A.F.E. Approach to Risk: Saddlepoint Approximations in Financial Econometrics, University of Zurich, Faculty of Business, Economics and Informatics, 2007. (Dissertation)
Sven Christian Steude, Accurate Risk and Density Prediction of Asset Prices, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2007. (Dissertation)
Simon Broda, Marc Paolella, Kai Carstensen, Bias-adjusted estimation in the ARX(1) model, Computational Statistics and Data Analysis, Vol. 51 (7), 2007. (Journal Article)
A new point estimator for the AR(1) coefficient in the linear regression model with arbitrary exogenous regressors and stationary AR(1) disturbances is developed. Its construction parallels that of the median-unbiased estimator, but uses the mode as a measure of central tendency. The mean-adjusted estimator is also considered, and saddlepoint approximations are used to lower the computational burden of all the estimators. Large-scale simulation studies for assessing their small-sample properties are conducted. Their relative performance depends almost exclusively on the value of the autoregressive parameter, with the new estimator dominating over a large part of the parameter space.
Simon Broda, Marc Paolella, Saddlepoint approximations for the doubly noncentral t distribution, Computational Statistics and Data Analysis, Vol. 51 (6), 2007. (Journal Article)
Closed-form approximations for the density and cumulative distribution function of the doubly noncentral t distribution are developed based on saddlepoint methods. They exhibit remarkable accuracy throughout the entire support of the distribution and are vastly superior to existing approximations. An application in finance is considered which capitalizes on the enormous increase in computational speed.
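To illustrate the saddlepoint machinery in a simpler setting, the sketch below applies the Lugannani-Rice CDF approximation to a Gamma(n, 1) variate (the sum of n iid unit exponentials), whose cumulant generating function is available in closed form. This is a toy case for exposition, not the doubly noncentral t treated in the paper:

```python
from math import log, sqrt
from statistics import NormalDist

def lr_cdf_gamma(n, x):
    """Lugannani-Rice saddlepoint CDF approximation for Gamma(n, 1).

    CGF: K(s) = -n*log(1 - s), so K'(s) = n/(1 - s), K''(s) = n/(1 - s)^2.
    F(x) ~ Phi(w) + phi(w) * (1/w - 1/u), with the usual w and u.
    """
    s_hat = 1.0 - n / x                  # saddlepoint: solves K'(s) = x
    if abs(s_hat) < 1e-12:               # formula is singular at the mean;
        return 0.5                       # a full implementation uses the limit
    K = -n * log(1.0 - s_hat)
    w = (1.0 if s_hat > 0 else -1.0) * sqrt(2.0 * (s_hat * x - K))
    u = s_hat * sqrt(n) / (1.0 - s_hat)  # u = s_hat * sqrt(K''(s_hat))
    nd = NormalDist()
    return nd.cdf(w) + nd.pdf(w) * (1.0 / w - 1.0 / u)
```

For integer n the exact CDF is a finite Poisson sum, which makes the approximation easy to check; the agreement is typically to several decimal places even for small n.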
Marc Paolella, Intermediate Probability: A Computational Approach, John Wiley & Sons, West Sussex, England, 2007. (Book/Research Monograph)
Intermediate Probability is the natural extension of the author's Fundamental Probability. It details several highly important topics, from standard ones such as order statistics, multivariate normal, and convergence concepts, to more advanced ones which are usually not addressed at this mathematical level, or have never previously appeared in textbook form. The author adopts a computational approach throughout, allowing the reader to directly implement the methods, thus greatly enhancing the learning experience and clearly illustrating the applicability, strengths, and weaknesses of the theory.
Marc Paolella, Fundamental Probability: A Computational Approach, John Wiley & Sons, West Sussex, England, 2006. (Book/Research Monograph)
Probability is a vital measure in numerous disciplines, from bioinformatics and econometrics to finance/insurance and computer science. Developed from a successful course, Fundamental Probability provides an engaging and hands-on introduction to this important topic. Whilst the theory is explored in detail, this book also emphasises practical applications, with the presentation of a large variety of examples and exercises, along with generous use of computational tools.
Markus Haas, Stefan Mittnik, Marc Paolella, Modeling and predicting market risk with Laplace-Gaussian mixture distributions, Applied Financial Economics, Vol. 16 (15), 2006. (Journal Article)
Marc Paolella, Christoph Hartz, Stefan Mittnik, Accurate value-at-risk forecasting based on the Normal-GARCH model, Computational Statistics & Data Analysis, Vol. 51 (4), 2006. (Journal Article)
A resampling method based on the bootstrap and a bias-correction step is developed for improving the Value-at-Risk (VaR) forecasting ability of the normal-GARCH model. Compared to the use of more sophisticated GARCH models, the new method is fast, easy to implement, numerically reliable, and, except for having to choose a window length L for the bias-correction step, fully data driven. The results for several different financial asset returns over a long out-of-sample forecasting period, as well as use of simulated data, strongly support use of the new method, and the performance is not sensitive to the choice of L.
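As a point of reference, the plain (uncorrected) one-step-ahead normal-GARCH(1,1) VaR forecast that the paper's bootstrap and bias-correction procedure improves upon can be sketched as follows; the parameter values and the stationarity-based initialization are illustrative assumptions:

```python
from statistics import NormalDist

def garch11_var_forecast(returns, omega, alpha, beta, level=0.01):
    """One-step-ahead VaR under a plain normal-GARCH(1,1) model.

    Variance recursion: sigma2_{t+1} = omega + alpha * r_t^2 + beta * sigma2_t,
    started at the unconditional variance (requires alpha + beta < 1).
    VaR is reported as a positive loss number; the paper's resampling
    and bias-correction steps are deliberately omitted here.
    """
    sigma2 = omega / (1.0 - alpha - beta)   # unconditional variance start
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
    z = NormalDist().inv_cdf(level)         # negative for small levels
    return -z * sigma2 ** 0.5
```

With `alpha = beta = 0` the recursion collapses to a constant-variance model, so the forecast reduces to the familiar normal quantile times the standard deviation.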
Keith Kuester, Stefan Mittnik, Marc Paolella, Value-at-risk prediction: A comparison of alternative strategies, Journal of Financial Econometrics, Vol. 4 (1), 2006. (Journal Article)
Marc Paolella, Markus Haas, Stefan Mittnik, Mixed normal conditional heteroskedasticity, Journal of Financial Econometrics, Vol. 2 (2), 2004. (Journal Article)
Marc Paolella, Modeling higher frequency macroeconomic data: an application to German monthly money demand, Applied Economics Quarterly (Konjunkturpolitik), Vol. 50 (2), 2004. (Journal Article)
Marc Paolella, Markus Haas, Stefan Mittnik, A new approach to markov-switching GARCH models, Journal of Financial Econometrics, Vol. 2 (4), 2004. (Journal Article)
The use of Markov-switching models to capture the volatility dynamics of financial time series has grown considerably during past years, in part because they give rise to a plausible interpretation of nonlinearities. Nevertheless, GARCH-type models remain ubiquitous in order to allow for nonlinearities associated with time-varying volatility. Existing methods of combining the two approaches are unsatisfactory, as they either suffer from severe estimation difficulties or else their dynamic properties are not well understood. In this article we present a new Markov-switching GARCH model that overcomes both of these problems. Dynamic properties are derived and their implications for the volatility process discussed. We argue that the disaggregation of the variance process offered by the new model is more plausible than in the existing variants. The approach is illustrated with several exchange rate return series. The results suggest that a promising volatility model is an independent switching GARCH process with a possibly skewed conditional mixture density.
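The disaggregation idea can be made concrete with a small simulation in the spirit of this model class: each regime carries its own GARCH(1,1) variance process, and every regime's variance is updated each period by the same observed return, which keeps all component processes well defined regardless of the regime path. All parameter values below are illustrative, not estimates from the paper:

```python
import numpy as np

def simulate_ms_garch(T, p_stay=0.95,
                      params=((0.02, 0.05, 0.90), (0.20, 0.10, 0.85)),
                      seed=0):
    """Simulate a toy two-regime Markov-switching GARCH(1,1) return path.

    Each regime k has its own (omega, alpha, beta); the regime follows a
    symmetric two-state Markov chain with stay-probability p_stay.
    """
    rng = np.random.default_rng(seed)
    state = 0
    sigma2 = [om / (1.0 - a - b) for om, a, b in params]  # unconditional start
    out = np.empty(T)
    for t in range(T):
        if rng.random() > p_stay:          # switch with probability 1 - p_stay
            state = 1 - state
        r = rng.standard_normal() * sigma2[state] ** 0.5
        for k, (om, a, b) in enumerate(params):
            sigma2[k] = om + a * r * r + b * sigma2[k]   # update ALL regimes
        out[t] = r
    return out
```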
Marc Paolella, Kai Carstensen, On Median Unbiased Inference for First Order Autoregressive Models, In: Contributions to Modern Econometrics: From Data Analysis to Economic Policy, Kluwer Academic Publishers, New York, p. 23 - 38, 2003. (Book Chapter)
Marc Paolella, Stefan Mittnik, Prediction of Financial Downside-Risk with Heavy-Tailed Conditional Distributions, In: Handbook of Heavy-Tailed Distributions in Finance, Elsevier North–Holland, Amsterdam, p. 387 - 403, 2003. (Book Chapter)
Marc Paolella, Computing moments of ratios of quadratic forms in normal variables, Computational Statistics & Data Analysis, Vol. 42 (3), 2003. (Journal Article)
The accuracy and speed of numerical methods for computing the moments of a ratio of quadratic forms in normal variables is examined, with particular application to the sample autocorrelation function. Methods based on a saddlepoint approximation are demonstrated to be not only superior to existing approximations, but are numerically reliable and virtually as accurate as the method suitable for exact computations, while taking only a fraction of the time to compute. The new method also maintains its accuracy for time series models near the nonstationary border, which is of significant interest for unit-root inference and also a case for which first-order mean and variance approximations break down. As a wide variety of test statistics and their power functions arising in econometric models are expressible in the general form considered, the method should prove very useful for data analysis and model building.
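For context, the quantity in question can always be checked by brute-force simulation. The sketch below estimates E[x'Ax / x'Bx] for x ~ N(0, I) by Monte Carlo; it is a crude baseline against which fast analytical methods like the paper's can be compared, not the paper's algorithm itself:

```python
import numpy as np

def ratio_qf_mean_mc(A, B, n_draws=20000, seed=0):
    """Monte Carlo estimate of E[x'Ax / x'Bx] for x ~ N(0, I).

    Draws n_draws standard normal vectors and averages the ratio of
    the two quadratic forms; slow but model-free.
    """
    rng = np.random.default_rng(seed)
    d = A.shape[0]
    x = rng.standard_normal((n_draws, d))
    num = np.einsum('ij,jk,ik->i', x, A, x)   # x_i' A x_i for each draw
    den = np.einsum('ij,jk,ik->i', x, B, x)   # x_i' B x_i for each draw
    return float(np.mean(num / den))
```

A convenient sanity check: when B = I, the exact mean is tr(A)/d, since x/||x|| is uniform on the sphere.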
Stefan Mittnik, Marc Paolella, Svetlozar T. Rachev, Stationarity of stable power-GARCH processes, Journal of Econometrics, Vol. 106 (1), 2002. (Journal Article)
Ronald W. Butler, Marc Paolella, Calculating the density and distribution function for the singly and doubly noncentral F, Statistics and Computing, Vol. 12 (1), 2002. (Journal Article)
Ronald W. Butler, Marc Paolella, Saddlepoint approximation and bootstrap inference for the Satterthwaite class of ratios, Journal of the American Statistical Association, Vol. 97 (459), 2002. (Journal Article)
Marc Paolella, Testing the stable Paretian assumption, Mathematical and Computer Modelling, Vol. 34 (9-11), 2001. (Journal Article)