Ludovic Mathys, Valuing Tradeability in Exponential Lévy Models, Quantitative Finance and Economics, Vol. 4 (3), 2020. (Journal Article)
The present article provides a novel theoretical way to evaluate tradeability in markets of ordinary exponential Lévy type. We consider non-tradeability as a particular type of market illiquidity and investigate its impact on the prices of assets. Starting from an adaptation of the continuous-time optimal asset replacement problem initiated by McDonald and Siegel (1986), we derive tradeability premiums and subsequently characterize them in terms of free-boundary problems. This provides a simple way to compute non-tradeability values, e.g. by means of standard numerical techniques, and, in particular, to express the price of a non-tradeable asset as a percentage of the price of a tradeable equivalent. Our approach is illustrated via numerical examples where we discuss various properties of the tradeability premiums. |
|
Katarina Kolesarova, XVAs and their impact on the pricing of derivatives, University of Zurich, Faculty of Business, Economics and Informatics, 2020. (Master's Thesis)
XVAs represent a group of valuation adjustments that are calculated on top of the fair value of derivatives in order to account for various costs associated with the derivatives business. (Gregory 2015, 2) The increasing use of XVAs has accumulated to values in the magnitude of several billion USD over the last decade, and XVAs have become a topic of controversial discussion across the industry. (PwC 2015, 2-3) This thesis examines the rationale behind the use of XVAs and provides practical examples of XVA pricing on a sample portfolio of derivatives, with the objective of assessing the materiality of XVA adjustments. The XVAs analysed in detail include CVA, DVA, KVA and FVA. The portfolio consists of three one-year instruments (a forward, a call option and a put option) and two seven-year instruments (an interest rate swap and a cross-currency swap). The necessary inputs into the XVA calculations, including exposure profiles, are simulated via Monte Carlo simulation with 100,000 iterations. The evolution of the simulated input variables is assumed to follow Brownian motion for the majority of instruments. It is shown that materiality varies for each derivative, with higher XVA values observed for instruments with longer maturities. The relative magnitude of the XVAs also differs: CVA is the most material adjustment in the case of the interest rate swap, while FVA plays the most important role for derivatives with only one-sided positive exposure profiles, such as options. Since the materiality of XVAs directly depends on the assumptions regarding the values of the calculation parameters, several input variables, including credit quality and funding spread, are subjected to stress testing. Three stress scenarios of varying severity are applied to the derivatives in the sample portfolio, showing that the size of the XVAs can fluctuate dramatically in times of stress, multiplying to several times the original value.
The results indicate that the values of XVAs are far from negligible, as they have the potential to grow and lead to high losses, especially for institutions with large portfolios of uncollateralised derivatives. |
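The exposure-simulation and CVA mechanics described above can be sketched in a few lines. The following Python snippet is an illustrative reconstruction, not the thesis code: the GBM dynamics for a forward contract, all parameter values, the flat hazard rate, and the 40% recovery assumption are hypothetical choices for the sketch.

```python
import numpy as np

def simulate_epe(s0, mu, sigma, maturity, n_steps, n_paths, seed=0):
    """Simulate forward mark-to-market values under Brownian-driven GBM dynamics
    and return the expected positive exposure (EPE) per time step."""
    rng = np.random.default_rng(seed)
    dt = maturity / n_steps
    z = rng.standard_normal((n_paths, n_steps))
    log_paths = np.cumsum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    values = s0 * np.exp(log_paths) - s0          # forward struck at s0
    return np.mean(np.maximum(values, 0.0), axis=0)

def cva(epe, hazard_rate, r, maturity):
    """Unilateral CVA: LGD times the sum of discounted EPE weighted by the
    marginal default probability in each period (flat hazard rate assumed)."""
    n = len(epe)
    dt = maturity / n
    t = dt * (np.arange(n) + 1)
    d_pd = np.exp(-hazard_rate * (t - dt)) - np.exp(-hazard_rate * t)
    lgd = 0.6  # assumed 40% recovery
    return lgd * np.sum(np.exp(-r * t) * epe * d_pd)

epe = simulate_epe(s0=100.0, mu=0.01, sigma=0.2, maturity=1.0,
                   n_steps=50, n_paths=100_000)
cva_value = cva(epe, hazard_rate=0.02, r=0.01, maturity=1.0)
```

The same EPE profile feeds the other adjustments (DVA, FVA) with different spreads, which is why stressing the credit and funding inputs propagates directly into all XVA values.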
|
Henry Mario Twerenbold, Developing new Merger Arbitrage Strategy using Google Trends Dataset, University of Zurich, Faculty of Business, Economics and Informatics, 2020. (Master's Thesis)
In this thesis, the ability to forecast a merger event is measured by using search frequency from the Google Trends database as a proxy for investor attention. Previous papers have already shown that this publicly available dataset provides the opportunity to detect economic and social trends by analyzing the search patterns of internet users on the Google platform. Finance-related research in this field has expanded as well since the data became accessible in 2004, but the investigation of merger events in combination with Google Trends remains almost unexplored. Thus, I strive to take the analysis of this dataset to the next level by following a novel anticipation methodology.
The sample of the study covers mergers of US targets from 2010 to 2019 with a transaction value of at least 10 billion USD, comprising a total of 103 target firms and 82 acquirer firms. For each individual merger, a 60-day data period prior to the announcement is assessed and averaged with the other mergers into mean values. The analysis examines whether target names (T) and acquirer names (A) were searched on Google with a higher frequency than usual. Furthermore, the same process is performed for the combination of target and acquirer names in a single inquiry (TA) and for the combination of the target or acquirer name with the keyword "Merger" (TM/AM).
The results of the daily internet inquiries show a continuous abnormal rise in the search requests for the names of the target (T) and acquirer (A) and the combination (TA), starting one to five days prior to the merger announcement. In addition, higher volatility of the search intensity is detected, particularly for the search term (TA). Furthermore, the correlation between the search intensities of target and acquirer firm names increases significantly eight days prior to the announcement. In the same manner, the comparison of the search volume of the companies with their traded stock volume on the same day shows a higher level of correlation. After sorting out specific data points for the robustness tests, the outputs did not change substantially, which supports the previous findings. Moreover, I analyzed the search volume data against a comparative setting of 115 benchmark mergers in the same period. This comparison reveals large differences to the original sample in terms of abnormal search volume and correlation coefficient movements. Consequently, when the search intensity differs extensively between actual merger companies and non-involved companies of the same industry sector, this gives evidence of an unconventional search behavior for the merger-involved companies prior to the event. As the final step, I examined a merger arbitrage strategy approach that includes the trading instrument of Bollinger Bands. Diverted from their intended use, the Bollinger Bands are applied as a trigger signal for the correlation coefficient between the search volumes and trading volumes of target and acquirer firms. With this simplified method, it is possible to anticipate a merger seven days prior to the announcement and to profit from abnormal returns.
The findings are very promising, but they have to be examined critically. Small search volumes for specific individual target and acquirer firms are a problematic issue which may distort the mean values of the evaluation. Overall, however, the continuous upward movements across diverse calculation methods prior to the announcement shed light on the potential of this public database to benefit a merger arbitrage strategy. It is a particular novelty that acquirer data (A) and the combination of target and acquirer names (TA) are involved in the analysis; earlier studies assessed only target data (T) from this dataset prior to a merger. In conclusion, the study forms the basis for further research in this field of digital merger traces through Google Trends. |
|
Erich Walter Farkas, Ludovic Mathys, Geometric Step Options with Jumps: Parity Relations, PIDEs, and Semi-Analytical Pricing, In: Swiss Finance Institute Research Paper, No. 20-11, 2020. (Working Paper)
The present article studies geometric step options in exponential Lévy markets. Our contribution is manifold and extends several aspects of the geometric step option pricing literature. First, we provide symmetry and parity relations and derive various characterizations for both European-type and American-type geometric double barrier step options. In particular, we are able to obtain a jump-diffusion disentanglement for the early exercise premium of American-type geometric double barrier step contracts and its maturity-randomized equivalent as well as to characterize the diffusion and jump contributions to these early exercise premiums separately by means of partial integro-differential equations and ordinary integro-differential equations. As an application of our characterizations, we derive semi-analytical pricing results for (regular) European-type and American-type geometric down-and-out step call options under hyper-exponential jump-diffusion models. Lastly, we use the latter results to discuss the early exercise structure of geometric step options once jumps are added and to subsequently provide an analysis of the impact of jumps on the price and hedging parameters of (European-type and American-type) geometric step contracts. |
|
Shiyu Qiu, Developing an Event Based Monitoring Model using information regarding account movements, University of Zurich, Faculty of Business, Economics and Informatics, 2020. (Master's Thesis)
The internal rating models currently used by banks to assess the solvency of small and medium-sized enterprises (SMEs) rely heavily on their yearly financial statements. In order to detect counterparties with looming credit deterioration early, we follow a different approach, which is based on continuous monitoring of the client's transaction data.
We do so by creating transactional features that summarize the operational condition, financial behavior, and liquidity of the counterparty across different time windows. By combining the transactional features with business sector information, we develop both a traditional logistic regression model and a machine learning XGBoost model. It turns out that the XGBoost model performs better at capturing relations between transaction data and yearly rating information, reflected by an area under the ROC curve (AUC) of 0.857.
In addition, we build specific models for clients belonging to different industries, finding that the models for the financial sector and the real estate sector perform weakest, due to insufficient data quantity and weak features, respectively. Therefore, we finally build three XGBoost models: one common model for the raw materials production, industry, construction, commerce, service and restaurant sectors, and two separate models for the two particular sectors mentioned above.
The model for the six sectors exhibits a good and robust performance with an average AUC of 0.874, which illustrates the effectiveness of transaction data when it comes to distinguishing early between clients with good and bad ratings. We also discuss the choice of the threshold used to define “good” and “bad” rating classes and test the robustness of the models for real-world application. |
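To make the model comparison concrete, the following self-contained Python sketch contrasts a logistic regression with a gradient-boosted tree classifier on synthetic data. Note the assumptions: scikit-learn's GradientBoostingClassifier stands in for XGBoost, and the generated features are not the thesis's transactional features, so the resulting AUC values are purely illustrative.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for transactional features and a binary good/bad rating label.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=8,
                           random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

# Baseline: logistic regression on the same feature set.
logit = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
auc_logit = roc_auc_score(y_te, logit.predict_proba(X_te)[:, 1])

# Gradient-boosted trees (stand-in for XGBoost) can capture non-linear relations.
gbt = GradientBoostingClassifier(random_state=42).fit(X_tr, y_tr)
auc_gbt = roc_auc_score(y_te, gbt.predict_proba(X_te)[:, 1])
```

Comparing AUC on a held-out test set, as done here, mirrors the evaluation used in the thesis to rank the two model families.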
|
Ludovic Mathys, Valuing Tradeability in Exponential Lévy Models, In: SSRN, No. 3482080, 2020. (Working Paper)
The present article provides a novel theoretical way to evaluate tradeability in markets of ordinary exponential Lévy type. We consider non-tradeability as a particular type of market illiquidity and investigate its impact on the prices of assets. Starting from an adaptation of the continuous-time optimal asset replacement problem initiated by McDonald and Siegel (1986), we derive tradeability premiums and subsequently characterize them in terms of free-boundary problems. This provides a simple way to compute non-tradeability values, e.g. by means of standard numerical techniques, and, in particular, to express the price of a non-tradeable asset as a percentage of the price of a tradeable equivalent. Our approach is illustrated via numerical examples where we discuss various properties of the tradeability premiums. |
|
Fabio Granato, What drives the swap spreads: the negative spread paradox, University of Zurich, Faculty of Business, Economics and Informatics, 2020. (Master's Thesis)
This work aims to uncover the effects, and assess the magnitude, of the macroeconomic factors and financial indicators which determined U.S. swap spreads from 2004 to 2018. The variation of the drivers is incorporated into the dynamics of the swap spreads in the form of credit spread and liquidity risk premium variations. Taking this into consideration, this work also comprises a qualitative analysis of the factors which pushed and kept swap spreads in negative territory, and of the consequences this had on risk management considerations for the actors of the post-crisis financial environment. To achieve this, the swap spread curve dynamics across all tenors, which constitute the dimension of the term structure itself, are collapsed into three main components (level, slope and curvature movements) via Principal Component Analysis. This dimensionality reduction allows us to operate in a more comfortable three-dimensional space. The candidate macroeconomic factors are then regressed on each of these main components, and the results show a significant influence of the credit spread component (mainly determined by the 3-month LIBOR rate) on the parallel shifts, and at the same time evidence of liquidity factors (Volatility Index, TED spread and 3-month LIBOR-GC repo spread) influencing the slope of the swap spreads over the same period. The results are compared with previous studies; the main contribution to the financial literature is a solid and new analytical approach to the modelling of the swap spread term structure, together with the uncovering of new drivers for each isolated dynamic of the curve. This is coupled with considerations regarding the regulatory requirements for banks' leverage ratios, which disincentivized banks from exploiting the arbitrage opportunity arising from the negative swap spread environment. |
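The level/slope/curvature decomposition used here is the standard PCA treatment of a term-structure panel. A minimal Python sketch, run on synthetic data rather than the thesis's swap spread panel, looks as follows:

```python
import numpy as np

def pca_factors(curve_panel, n_factors=3):
    """PCA of a (time x tenor) panel of spreads: returns the tenor loadings
    (level/slope/curvature shapes), the factor score time series, and the
    fraction of variance explained by each retained component."""
    demeaned = curve_panel - curve_panel.mean(axis=0)
    cov = np.cov(demeaned, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)        # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_factors] # take the largest first
    loadings = eigvecs[:, order]
    scores = demeaned @ loadings                  # factor time series
    explained = eigvals[order] / eigvals.sum()
    return loadings, scores, explained
```

The macroeconomic candidates would then be regressed on each column of `scores`, one regression per principal component, exactly as the abstract describes for the level and slope dynamics.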
|
Urban Ulrych, Nikola Vasiljevic, Optimal Currency Exposure Under Risk and Ambiguity Aversion, In: American Finance Association 2020 Annual Meeting. 2020. (Conference Presentation)
|
|
Erich Walter Farkas, Ludovic Mathys, Nikola Vasiljevic, Intra-Horizon Expected Shortfall and Risk Structure in Models with Jumps, In: Swiss Finance Institute Research Paper, No. 19-76, 2020. (Working Paper)
The present article deals with intra-horizon risk in models with jumps. Our general understanding of intra-horizon risk is along the lines of the approach taken in [BRSW04], [Ro08], [BMK09], [BP10], and [LV19]. In particular, we believe that quantifying market risk by strictly relying on point-in-time measures cannot be deemed a satisfactory approach in general. Instead, we argue that complementing this approach by studying measures of risk that capture the magnitude of losses potentially incurred at any time of a trading horizon is necessary when dealing with (m)any financial position(s). To address this issue, we propose an intra-horizon analogue of the expected shortfall for general profit and loss processes and discuss its key properties. Our intra-horizon expected shortfall is well-defined for (m)any popular class(es) of Lévy processes encountered when modeling market dynamics and constitutes a coherent measure of risk, as introduced in [CDK04]. On the computational side, we provide a simple method to derive the intra-horizon risk inherent to popular Lévy dynamics. Our general technique relies on results for maturity-randomized first-passage probabilities and allows for a derivation of diffusion and single jump risk contributions. These theoretical results are complemented with an empirical analysis, where popular Lévy dynamics are calibrated to S&P 500 index data and an analysis of the resulting intra-horizon risk is presented. |
|
Urban Ulrych, Nikola Vasiljevic, Ambiguity and the Home Currency Bias, In: Swiss Finance Institute Research Paper, No. 20-73, 2020. (Working Paper)
This paper addresses the question of optimal currency exposure for a risk-and-ambiguity-averse international investor. A robust mean-variance model with smooth ambiguity preferences is used to derive the optimal currency exposure. In the theoretical part, we show that the sample-efficient currency demand can be calculated as the solution to a generalized ridge regression. Through the lens of these results, we demonstrate that our ambiguity-based model offers a new explanation of the home currency bias. The investor's dislike for model uncertainty induces a disproportionately high currency hedging demand. The empirical analysis of currency overlay strategies employs the foreign exchange, equity, and bond returns over the period from 1999 to 2018. Our out-of-sample back-tests illustrate that accounting for ambiguity enhances the stability of estimated optimal currency exposures and significantly improves the portfolio performance net of transaction costs. |
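The ridge-regression representation of the currency demand can be illustrated with a stylized snippet. This is a sketch of the mechanics rather than the paper's estimator: a scalar penalty replaces the paper's general ambiguity-induced penalty, and all inputs below are synthetic.

```python
import numpy as np

def ridge_currency_demand(fx_returns, hedgeable_returns, ambiguity_penalty):
    """Sample-based currency demand as a ridge regression,
    w = (X'X + lambda * I)^{-1} X'y.
    A larger ambiguity penalty shrinks the estimated exposures more strongly,
    i.e. the investor hedges more conservatively under model uncertainty."""
    X, y = fx_returns, hedgeable_returns
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + ambiguity_penalty * np.eye(k), X.T @ y)
```

With a zero penalty the formula reduces to ordinary least squares; as the penalty grows, the currency exposures are pulled toward zero, which is the shrinkage mechanism behind the stability gains reported in the out-of-sample back-tests.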
|
Cosima Patrizia Vester, Illiquidity Premia in Private Equity Investments, University of Zurich, Faculty of Business, Economics and Informatics, 2020. (Master's Thesis)
With deal volumes reaching record levels, valuations rising rapidly and dry powder building up, private equity has become an immensely attractive asset class. As reflected by the amplified interest from investors seeking to diversify their portfolios, private equity can offer investors significant excess returns. Private equity investments are highly illiquid, with long investment horizons, a densely concentrated ownership structure and high levels of leverage. These market structures facilitate the formation of a premium arising from liquidity risks and information asymmetries. This paper seeks to quantify this illiquidity premium. By applying similarity measures and unsupervised clustering algorithms, we form a liquid alternative - defined as a public peer benchmark - based on a multidimensional set of attributes. We then extract the illiquidity premium, defined as the excess return over the benchmark, and compute a median of 9.7%. |
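A stylized version of the peer-benchmark construction might look as follows in Python. KMeans is just one possible clustering choice, and the attribute set, cluster count, and data here are synthetic and purely illustrative, so this is a sketch of the idea rather than the paper's procedure.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def peer_benchmark_premium(private_attrs, private_returns,
                           public_attrs, public_returns, n_clusters=5):
    """Cluster public firms by attributes, assign each private deal to its
    nearest cluster, and measure the illiquidity premium as the excess
    return over that cluster's mean public return (median across deals)."""
    scaler = StandardScaler().fit(public_attrs)
    km = KMeans(n_clusters=n_clusters, n_init=10,
                random_state=0).fit(scaler.transform(public_attrs))
    cluster_mean_ret = np.array(
        [public_returns[km.labels_ == c].mean() for c in range(n_clusters)])
    assigned = km.predict(scaler.transform(private_attrs))
    premia = private_returns - cluster_mean_ret[assigned]
    return np.median(premia)
```

The clusters play the role of the "liquid alternative": each private investment is benchmarked only against public firms that resemble it on the chosen attributes.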
|
Yaqi Chen, Expansion Based Methods for Pricing Financial Options, University of Zurich, Faculty of Business, Economics and Informatics, 2020. (Master's Thesis)
This paper focuses on Edgeworth expansions for financial option valuation using Hermite polynomials and logistic polynomials, calibrated on S&P 500 index option data. Our approach expresses the value of an option by replicating it with an infinite series of polynomials, whose coefficients are composed of the variance, skewness, kurtosis, and higher moments of the underlying density. This formula is a computationally convenient alternative to the Fourier transform method. Our analysis establishes two different tail conditions governing the convergence of the series: the Hermite series diverges for fat-tailed distributions, while the logistic series converges. In this paper, the Heston (1993) model is applied and calibrated on S&P 500 index option data. All MATLAB code implementing the model is contained in the appendix. |
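The flavor of such a polynomial expansion can be illustrated with the classical Gram-Charlier/Edgeworth adjustment of the normal density, a textbook special case rather than the thesis's Hermite or logistic series:

```python
import numpy as np

def hermite_expansion_density(x, skew, ex_kurt):
    """Gram-Charlier/Edgeworth-type density: the standard normal density
    adjusted by the third and fourth (probabilists') Hermite polynomials,
    with coefficients given by skewness and excess kurtosis."""
    phi = np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
    he3 = x**3 - 3 * x            # He_3(x)
    he4 = x**4 - 6 * x**2 + 3     # He_4(x)
    return phi * (1 + skew / 6 * he3 + ex_kurt / 24 * he4)
```

With zero skewness and excess kurtosis the expansion collapses to the Gaussian, and the correction terms integrate to zero, so the adjusted function still integrates to one; for strongly fat-tailed inputs, however, such Hermite-based series can fail to converge, which is the convergence issue the thesis addresses with the logistic alternative.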
|
Davide Marchini, Consistent Scenario Generation of Financial Time Series, University of Zurich, Faculty of Business, Economics and Informatics, 2020. (Master's Thesis)
The objective of the thesis is to investigate novel methods for scenario generation beyond classic Monte Carlo simulation of independent samples from a multivariate Gaussian distribution.
One of the main drawbacks of naive Monte Carlo methods is the difficulty of producing "likely" future scenarios, especially in a high-dimensional context. This is due to the poor scalability of Yule-Walker-like methods for multivariate Vector Autoregressive Models and the difficulty of imposing structural constraints on the auto-cross-correlation. Monte Carlo simulations are often performed under very strong assumptions (independent returns, lack of causal structure, etc.) that are likely to be rejected when facing real-world data.
The topic of scenario generation is of primary importance in quantitative finance, where the ability to obtain more likely and accurate scenarios to compute risk figures and potential profits, maintaining computational feasibility, is undoubtedly a significant desire of many risk and asset managers.
Two main approaches are investigated. The first is based on spectral density theory and can generate multivariate scenarios with a specified auto-cross causal correlation structure. The method applies some modifications to the algorithm proposed in Chambers (1995), which is currently not publicly implemented in any software library. The second approach is based on a highly customized deep learning architecture trained in an adversarial setting. This method has the parameter capacity to potentially generate multivariate scenarios with consistent non-linear structure, generalizing the properties of the first algorithm.
The analysis is conducted over multivariate time series of financial assets, comprising stock indices, commodity futures and foreign exchange pairs at various frequencies.
The algorithms are evaluated in terms of their generative performance, the fidelity of synthetic samples to real-world data, and their computational feasibility in a production environment. |
|
Nicolas Ettlin, Erich Walter Farkas, Andreas Kull, Alexander Smirnow, Optimal risk-sharing across a network of insurance companies, In: Swiss Finance Institute Research Paper, No. 20-52, 2020. (Working Paper)
Risk transfer is a key risk and capital management tool for insurance companies. Transferring risk between insurers is used to mitigate risk and manage capital requirements. We investigate risk transfer in the context of a network environment of insurers and consider capital costs and capital constraints at the level of individual insurance companies. We demonstrate that the optimisation of profitability across the network can be achieved through risk transfer. Considering only individual insurance companies, there is no unique optimal solution and, a priori, it is not clear which solutions are fair. However, from a network perspective, we derive a unique fair solution in the sense of cooperative game theory. Implications for systemic risk are briefly discussed. |
|
Ludovic Mathys, On Extensions of the Barone-Adesi & Whaley Method to Price American-Type Options, In: SSRN, No. 3482064, 2019. (Working Paper)
The present article provides an efficient and accurate hybrid method to price American standard options in certain jump-diffusion models as well as American barrier-type options under the Black & Scholes framework. Our method generalizes the quadratic approximation scheme of Barone-Adesi & Whaley (1987) and several of its extensions. Using perturbative arguments, we decompose the early exercise pricing problem into sub-problems of different orders and solve these sub-problems successively. The obtained solutions are combined to recover approximations to the original pricing problem of multiple orders, with the 0-th order version matching the general Barone-Adesi & Whaley ansatz. We test the accuracy and efficiency of the approximations via numerical simulations. The results show a clear dominance of higher order approximations over their respective 0-th order version and reveal that significantly more pricing accuracy can be obtained by relying on approximations of the first few orders. Additionally, they suggest that increasing the order of any approximation by one generally refines the pricing precision, though this happens at the expense of greater computational costs. |
|
Redaktion, Erich Walter Farkas, Die Realität ist viel komplexer als die Modell, In: SAV Bulletin, 2 December 2019. (Media Coverage)
The world is evolving at a rapid pace and confronts humanity with ever new challenges. Risk management is particularly affected: how should risks be calculated when no relevant data from the past are available? Prof. Dr. Erich Walter Farkas, Co-Chair of the Board of the Swiss Risk Association and Professor of Quantitative Finance at the University of Zurich, advocates holistic approaches and cross-border initiatives. |
|
Yiji He, Deep calibration of Financial Market Risk Models, University of Zurich, Faculty of Business, Economics and Informatics, 2019. (Master's Thesis)
In this thesis, we develop an extreme value model for four asset classes: stocks, real estate, commodities, and bond spreads.
Standard techniques in the academic field use extreme value theory, while industry typically relies only on data from the last one or two crisis and non-crisis periods; both approaches have their own issues. This thesis analyzes information from crises further in the past (e.g. 120 years) and uses it to estimate the severity of potential future crises. To address the criticism that financial market risk models always underestimate the severity of the next crisis, we use the peaks-over-threshold method to screen the crises and the jackknife to recreate the extreme loss distribution, and finally generate the model with a resampling method.
The results of this thesis can be used in capital adequacy risk models for regulatory purposes and internal steering. |
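The peaks-over-threshold step can be sketched with SciPy's generalized Pareto distribution. This is an illustrative fit on simulated losses, not the thesis's data, and it omits the jackknife and resampling steps; the VaR formula also assumes a nonzero fitted shape parameter.

```python
import numpy as np
from scipy.stats import genpareto

def pot_tail_fit(losses, threshold):
    """Peaks-over-threshold: fit a generalized Pareto distribution (GPD)
    to the exceedances of losses above the chosen threshold."""
    exceedances = losses[losses > threshold] - threshold
    shape, _, scale = genpareto.fit(exceedances, floc=0.0)
    return shape, scale, len(exceedances)

def pot_var(losses, threshold, alpha):
    """Tail VaR at level alpha (e.g. 0.99) from the fitted GPD, using the
    standard POT quantile formula (nonzero shape assumed)."""
    shape, scale, n_exc = pot_tail_fit(losses, threshold)
    n = len(losses)
    return threshold + scale / shape * (
        (n / n_exc * (1 - alpha)) ** (-shape) - 1)
```

Extrapolating beyond the observed sample in this way is what lets a long crisis history inform the severity of a potential future crisis that is worse than anything in the recent data.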
|
Alexander Smirnow, Jana Hlavinová, Systemic intrinsic risk measures in financial networks, In: Vienna Congress on Mathematical Finance (VCMF 2019). 2019. (Conference Presentation)
|
|
Florin Onder, The Cost of Hedging with Options, University of Zurich, Faculty of Business, Economics and Informatics, 2019. (Master's Thesis)
Looking at the publications on the topic of hedging a portfolio with options, one immediately realises that this is not only a heavily discussed topic but also one that remains relevant today, perhaps especially now, when uncertainty dominates the global markets. Buying put options to protect a portfolio against heavy losses seems appealing in times in which the worldwide economy is disrupted by political and geopolitical disputes. For conservative investors, option-related strategies that aim at protecting the portfolio against heavy losses can be expected to outperform the benchmark on a risk-adjusted return basis, as assessed by risk measures such as Value-at-Risk and Expected Shortfall.
This thesis concentrates on the analysis of the risk-return figures of various hedging strategies for different types of investors, each with their own level of risk aversion. The hedging techniques range from buying protective put options on different global equity indices, as well as on a bond index, at various levels of option moneyness, to selling covered call options on the same indices and at the same levels of moneyness. One of the central aims of this thesis is to construct downside-protected trading strategies on heavily traded indices using static hedging techniques, resulting in strategies that are directly implementable by any type of risk-averse investor, not only institutional investors. All trading strategies are compared on a risk-return level to a benchmark index, which consists of each respective underlying index, bought at the beginning of the backtesting period and held until its end.
The results are presented and tested for their statistical significance, as well as for their ability to outperform the benchmark index on a risk-adjusted basis. We find that for most strategies there is at most weak statistical evidence that the returns of the individual strategies differ from the benchmark's returns, and that most hedging strategies involving the buying of put options are unable to outperform the benchmark index, even on a risk-adjusted return basis. Strategies that incorporate the selling of call options, on the other hand, most often generate superior risk-return figures relative to the benchmark index, but likewise show little to no statistical significance. |
|
Burak Er, Anwendung eines Stresstests auf das Hypothekar-Portfolio der Basler Kantonalbank bestehend aus Wohnrenditeliegenschaften, University of Zurich, Faculty of Business, Economics and Informatics, 2019. (Bachelor's Thesis)
In this thesis, Basler Kantonalbank is subjected to a mortgage stress test. The mortgage portfolio, consisting exclusively of residential income properties, is tested against macroeconomic shocks. The stressed macroeconomic risk drivers are GDP, the interest rate, rental apartment prices and the vacancy rate. A stress scenario is developed that is modelled on the Swiss real estate crisis of the 1990s. The stress-testing methodology builds on the theoretical approach of the model of Risk Solution Network AG, which is applied in the stress tests on mortgage risks mandated by FINMA, and extends it by the risk driver vacancy rate. The model is based on a top-down as well as a one-factor approach. The shocks result in stressed loan-to-value ratios and affordability measures. Depending on the loan-to-value and affordability interval, different values for the LGDs and PDs result. The expected loss is determined as the risk measure. Finally, the results are discussed. |
|