Johannes Brumm, Felix Kübler, Michael Grill, Karl Schmedders, Re-use of collateral: Leverage, volatility, and welfare, In: Swiss Finance Institute Research Paper, No. 17-04, 2017. (Working Paper)
We assess the quantitative implications of the re-use of collateral for financial market leverage, volatility, and welfare within an infinite-horizon asset-pricing model with heterogeneous agents. In our model, the ability of agents to re-use collateral frees it up to back additional transactions. Re-use thus contributes to the build-up of leverage and significantly increases volatility in financial markets. When introducing limits on re-use, we find that volatility is strictly decreasing as these limits become tighter, yet the impact on welfare is non-monotone. In the model, allowing for some re-use can improve welfare, as it enables agents to share risk more effectively. Allowing re-use beyond intermediate levels, however, can lead to excessive leverage and lower welfare. The analysis in this paper thus provides a rationale for limiting, yet not banning, re-use in financial markets.
Santiago Moreno-Bromberg, Guillaume Roger, Scale Effects in Dynamic Contracting, In: Swiss Finance Institute Research Paper, No. 15-49, 2016. (Working Paper)
We study a continuous-time contracting problem in which size plays a role. The agent may take on excessive risk to enhance short-term gains; doing so exposes the principal to large, infrequent losses. The optimal contract includes size as an instrument: downsizing along the equilibrium path may be necessary to preserve incentive compatibility. We characterize the principal's value function and the downsizing process, both of which depend on the nature of the liquidation value. When the latter has fixed and size-dependent components, there is an optimal (endogenous) liquidation size. In the special case where the liquidation value is linear in size, one may describe the solution in size-adjusted terms, which allows for the study of re-investment. The optimal contract is implemented using the full array of financial securities plus debt covenants; holding equity is essential to curb risk taking. Conflicts emerge between classes of security holders and explain phenomena like seniority of claims. Firms for which risk taking is less attractive can afford higher leverage.
Jorge Abad, Marco D'Errico, Neill Killeen, Vera Luz, Tuomas Peltonen, Richard Portes, Teresa Urbano, Mapping the interconnectedness between EU banks and shadow banking entities, In: NBER Working Paper Series, No. 23280, 2017. (Working Paper)
This paper provides a unique snapshot of the exposures of EU banks to shadow banking entities within the global financial system. Drawing on a rich and novel dataset, the paper documents the cross-sector and cross-border linkages and considers which are the most relevant for systemic risk monitoring. From a macroprudential perspective, the identification of potential feedback and contagion channels arising from the linkages of banks and shadow banking entities is particularly challenging when shadow banking entities are domiciled in different jurisdictions. The analysis shows that many of the EU banks’ exposures are towards non-EU entities, particularly US-domiciled shadow banking entities. At the individual level, banks’ exposures are diversified, although this diversification leads to high overlap across different types of shadow banking entities.
Quan Zhang, Recovery is Never Easy - Dynamics and Multiple Equilibria with Financial Arbitrage, Production and Collateral Constraints, In: Swiss Finance Institute Research Paper, No. 17-02, 2017. (Working Paper)
We develop a simple general equilibrium model to study the interactions between financial arbitrage and the real economy under collateral constraints. In good times, arbitrage activities help boost the production sectors by providing external funds for capital investment. However, when exposed to adverse shocks and panic market reactions, arbitrage also amplifies the financial distress and makes it easier for the economy to fall into self-fulfilling crises. Moreover, the possibility of regime switches triggered by exogenous shocks also complicates the path to recovery. The combination of financial distress and pessimistic market anticipation not only slows down the recovery process, but can also trap the economy in a less healthy steady state.
Mathias Beck, Cindy Lopes Bento, Innovation outcomes and partner-type selection in R&D Alliances: The role of simultaneous diversification and sequential adaptation, In: UZH Business Working Paper Series, No. 363, 2016. (Working Paper)
This study focuses on how firms form and sequentially adapt their inter-organizational knowledge sourcing structures within research and development (R&D) alliances and how this process impacts their innovation performance. In contrast to previous literature, which largely ignores the dynamic aspects of how firms adapt their search strategies, our approach accounts for sequential adaptation. Our proposed framework explores the role of simultaneous diversification and sequential adaptation of collaboration partners within R&D alliances according to specific innovation outcomes. The results emphasize that firms should not remain within the same search activities indefinitely, as non-adapting inter-organizational knowledge-transfer structures lead to inferior performance. Notably, this study highlights important partner-type selectivity and identifies appropriate simultaneous diversification and sequential adaptation strategies in relation to specific innovation outcomes and firm sizes.
Marco D'Errico, Tarik Roukny, Compressing over-the-counter markets, In: European Systemic Risk Board Working Paper Series (ESRB), No. 44, 2017. (Working Paper)
In this paper, we show both theoretically and empirically that the size of over-the-counter (OTC) markets can be reduced without affecting individual net positions. First, we find that the networked nature of these markets generates an excess of notional obligations between the aggregate gross amount and the minimum amount required to satisfy each individual net position. Second, we show conditions under which such excess can be removed. We refer to this netting operation as compression and identify feasibility and efficiency criteria, highlighting intermediation as the key element for excess levels. We show that a trade-off exists between the amount of notional that can be eliminated from the system and the conservation of original trading relationships. Third, we apply our framework to a unique and comprehensive transaction-level dataset on OTC derivatives including all firms based in the European Union. On average, we find that around 75% of market gross notional relates to excess. While around 50% can in general be removed via bilateral compression, more sophisticated multilateral compression approaches are substantially more efficient. In particular, we find that even the most conservative multilateral approach which satisfies relationship constraints can eliminate up to 98% of excess in the markets.
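The netting logic behind compression can be illustrated with a toy example. The sketch below is not the paper's algorithm; it implements only the simplest bilateral variant on a hypothetical three-bank intermediation chain, where the excess is the gap between gross notional and the minimum notional needed to preserve every net position.

```python
def bilateral_compress(obligations):
    """Net offsetting obligations within each counterparty pair.

    `obligations` maps (payer, receiver) -> notional amount. Bilateral
    compression replaces the two directed contracts of a pair with a
    single contract for their difference, leaving every net position
    unchanged.
    """
    compressed = {}
    for (i, j), x in obligations.items():
        if (j, i) in compressed or (i, j) in compressed:
            continue  # this pair was already netted
        y = obligations.get((j, i), 0.0)
        if x > y:
            compressed[(i, j)] = x - y
        elif y > x:
            compressed[(j, i)] = y - x
        # perfectly offsetting notionals cancel entirely
    return compressed

# Hypothetical intermediation chain: A owes B 100, B owes A 40, B owes C 60.
trades = {("A", "B"): 100.0, ("B", "A"): 40.0, ("B", "C"): 60.0}
net = bilateral_compress(trades)
gross_before = sum(trades.values())  # 200.0
gross_after = sum(net.values())      # 120.0
```

Multilateral compression could go further here: the net positions admit a single A-to-C contract of 60, but that creates a trading relationship absent from the original market, which is exactly the trade-off between eliminated notional and conserved relationships described above.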
Marco D'Errico, Stefano Battiston, Tuomas Peltonen, Martin Scheicher, How does risk flow in the credit default swap market?, In: European Systemic Risk Board Working Paper Series (ESRB), No. 33, 2016. (Working Paper)
Nataliya Klimenko, Jean-Charles Rochet, Gianni De Nicolo, Sebastian Pfeil, Aggregate Bank Capital and Credit Dynamics, In: Swiss Finance Institute Research Paper, No. 16-42, 2016. (Working Paper)
We develop a novel dynamic model of banking showing that aggregate bank capital is an important determinant of bank lending. In our model, commercial banks finance their loans with deposits and equity, while facing equity issuance costs. Because of this financial friction, banks build equity buffers to absorb negative shocks. Aggregate bank capital determines the dynamics of lending. Notably, the equilibrium loan rate is a decreasing function of aggregate capitalization. The competitive equilibrium is constrained inefficient, because banks do not internalize the consequences of individual lending decisions for the future loss-absorbing capacity of the banking sector. In particular, we find that unregulated banks lend too much. Imposing a minimum capital ratio helps tame excessive lending, which enhances stability of the banking system.
Steven Ongena, Olivier De Jonghe, Hans Dewachter, Klaas Mulier, Glenn Schepens, Funding shocks and banks' credit reallocation, In: SFI Practitioner Roundups December 2016, No. 12/16, 2016. (Working Paper)
This paper provides evidence on the strategic lending decisions made by banks facing a negative funding shock. Using bank-firm level credit data, we show that banks reallocate credit within their domestic loan portfolio in at least three different ways. First, banks reallocate to sectors where they have high sector presence. Second, they also reallocate to sectors in which they are heavily specialized. Third, they reallocate credit towards low-risk firms. These reallocation effects are economically large. A standard deviation improvement in sector presence, sector specialization or firm risk reduces the transmission of the funding shock to credit supply by 20%, 13%, and 10%, respectively. We also provide insight into the timing of these reallocation decisions. Reallocation to sectors in which a bank has a high sector presence is almost instantaneous, while sector specialization starts playing a role four to five months after the shock.
Gregor Philipp Reich, Divide and Conquer: Recursive Likelihood Function Integration for Hidden Markov Models with Continuous Latent Variables, In: SSRN, No. 2794884, 2016. (Working Paper)
This paper develops a method to efficiently estimate hidden Markov models with continuous latent variables using maximum likelihood estimation. To evaluate the (marginal) likelihood function, I decompose the integral over the unobserved state variables into a series of lower dimensional integrals, and recursively approximate them using numerical quadrature and interpolation. I show that this procedure has very favorable numerical properties:
First, the computational complexity grows linearly in time, which makes integration over hundreds or thousands of periods feasible.
Second, I prove that the numerical error accumulates sub-linearly over time; consequently, using highly efficient and fast-converging numerical quadrature and interpolation methods for low and medium dimensions, such as Gaussian quadrature and Chebyshev polynomials, the numerical error can be well controlled even for very large numbers of periods.
Lastly, I show that the numerical convergence rates of the quadrature and interpolation methods are preserved up to a factor of at least 0.5 under appropriate assumptions.
I apply this method to the bus engine replacement model of Rust: first, I verify the algorithm’s ability to recover the parameters in an extensive Monte Carlo study with simulated datasets; second, I estimate the model using the original dataset.
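The recursive decomposition above can be sketched for the simplest possible case, a scalar linear-Gaussian state-space model (an illustrative choice, not the bus engine model, and with made-up parameter values). Each period contributes one low-dimensional integral, evaluated here on a fixed Gauss-Legendre grid, so the cost grows linearly in the number of periods.

```python
import numpy as np

def normal_pdf(z, s):
    return np.exp(-0.5 * (z / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

def recursive_likelihood(y, rho, sig, tau, n_nodes=64, bound=8.0):
    """Likelihood of a scalar linear-Gaussian state-space model via
    recursive Gauss-Legendre quadrature over the latent state.

    State:       x_t = rho * x_{t-1} + N(0, sig^2), stationary start
    Observation: y_t = x_t + N(0, tau^2)
    Each period adds one low-dimensional integral, so the cost grows
    linearly in len(y).
    """
    nodes, weights = np.polynomial.legendre.leggauss(n_nodes)
    x = bound * nodes    # quadrature nodes for the latent state
    w = bound * weights  # weights rescaled from [-1, 1] to [-bound, bound]
    s0 = sig / np.sqrt(1.0 - rho ** 2)
    # filter values at the nodes after observing y[0]
    f = normal_pdf(x, s0) * normal_pdf(y[0] - x, tau)
    for t in range(1, len(y)):
        # f_t(x') = p(y_t | x') * integral of p(x' | x) f_{t-1}(x) dx
        trans = normal_pdf(x[:, None] - rho * x[None, :], sig)
        f = normal_pdf(y[t] - x, tau) * (trans @ (w * f))
    return float(w @ f)
```

In this sketch the filter is only ever needed at the fixed quadrature nodes, so the interpolation step of the full method stays implicit; with period-dependent node placement one would carry, say, a Chebyshev interpolant of `f` between periods instead.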
Robert Göx, Relative Performance Evaluation in Presence of Exposure Risk, In: SSRN, No. 2478554, 2016. (Working Paper)
I study the consequences of a random exposure to common risk for the purpose of relative performance evaluation (RPE) and find that it significantly affects the usefulness and the empirical measurement of RPE. According to my analysis, the magnitude of the exposure risk determines not only how firms aggregate measures of common risk with measures of firm performance but also the extent to which firms can control the impact of common risk on their own performance. Simulated regressions of my theoretical model indicate that a high exposure risk can prevent the correct identification of informative performance signals and cause a biased composition of customized peer groups. A high exposure risk also increases the likelihood of a type II error in implicit RPE tests. I evaluate two empirical strategies to control for the magnitude of the exposure risk and find that they significantly reduce the likelihood of a type II error.
Sabine Elmiger, A Heterogeneous-Agent Foundation of the Representative-Agent Approach, In: Swiss Finance Institute Research Paper, No. 16-58, 2016. (Working Paper)
The representative-agent approach is widely used in consumption-based asset pricing. From a theoretical point of view, quite restrictive assumptions on the underlying economy are needed for asset prices to depend only on aggregate consumption. We present a heterogeneous-agent financial market model in which all investors follow simple rebalancing rules and aggregation already fails. A meaningful specification of a representative agent is still possible: instead of asset prices per se, the representative agent indicates the direction in which relative asset prices tend, in expectation, from one time period to the next. The objective function of the representative agent is independent of the set of rebalancing rules participating in the market.
Maximilian Adelmann, Karl Schmedders, János Mayer, A Large-Scale Optimization Model for Replicating Portfolios in the Life Insurance Industry, In: Swiss Finance Institute Research Paper, No. 16-04. (Working Paper)
Replicating portfolios have recently emerged as an important tool in the life insurance industry, where they are used for the valuation of companies' liabilities. This paper presents a replicating portfolio (RP) model for approximating life insurance liabilities as closely as possible. We minimize the L1 error between the discounted life insurance liability cash flows and the discounted RP cash flows over a multi-period time horizon for a broad range of future economic scenarios. We apply two different linear reformulations of the L1 problem to solve large-scale RP optimization problems and also present several out-of-sample tests for assessing the quality of RPs. A numerical application of our RP model to empirical data sets demonstrates that it delivers RPs that match the liabilities closely and exhibit excellent practical properties, all within a reasonable amount of time. We complete the paper with a description of an implementation of the RP model at a global insurance company.
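One standard linear reformulation of an L1 fitting problem (plausibly among the two the paper applies, though that is an assumption here) introduces an auxiliary error variable per scenario and solves a linear program. A minimal sketch with hypothetical data, using SciPy's `linprog`:

```python
import numpy as np
from scipy.optimize import linprog

def fit_replicating_portfolio(cash_flows, liabilities):
    """L1-optimal replicating weights via an LP reformulation.

    cash_flows:  (scenarios x instruments) discounted cash flows of the
                 candidate assets
    liabilities: (scenarios,) discounted liability cash flows
    Introduce e_s >= |r_s| for each scenario residual r_s and minimize
    sum_s e_s subject to  A w - e <= L  and  -A w - e <= -L.
    """
    m, n = cash_flows.shape
    c = np.concatenate([np.zeros(n), np.ones(m)])  # objective: sum of errors
    A_ub = np.block([[cash_flows, -np.eye(m)],
                     [-cash_flows, -np.eye(m)]])
    b_ub = np.concatenate([liabilities, -liabilities])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * n + [(0.0, None)] * m,
                  method="highs")
    return res.x[:n], res.fun  # portfolio weights, total L1 error

# Hypothetical example: four scenarios, two candidate instruments,
# liabilities that happen to be exactly replicable with weights (2, 3).
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
L = A @ np.array([2.0, 3.0])
weights, l1_error = fit_replicating_portfolio(A, L)
```

The LP has n + m variables, so for large scenario sets a modern LP solver is essential; this is only a sketch of the reformulation, not the paper's large-scale implementation.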
Katharina Jaik, Stefan C Wolter, Lost in transition: the influence of locus of control on delaying educational decisions, In: Swiss Leading House "Economics of Education" Working Paper, No. 118, 2016. (Working Paper)
The transition from compulsory schooling to upper-secondary education is a crucial and frequently difficult step in the educational career of young people. In this study, we analyze the impact of one non-cognitive skill, locus of control, on the intention and the decision to delay the transition into post-compulsory education in Switzerland. We find that locus of control, measured at ages 13–14, has a significant impact on the intention to delay the transition into upper-secondary education. Furthermore, we find that the intention to delay the transition is strongly correlated with the actual delay, measured one and a half years after the intention. Finally, students who initially intended to delay but nevertheless continued directly into upper-secondary education show a stronger internal locus of control than comparable students who did delay their transition.
Markus Leippold, Nikola Vasiljevic, Option-Implied Intra-Horizon Risk and First-Passage Disentanglement, In: SSRN, No. 2804702, 2016. (Working Paper)
We study the intra-horizon value at risk (iVaR) in a general jump-diffusion setup and propose a new model of asset returns called the displaced mixed-exponential model, which can arbitrarily closely approximate finite-activity jump-diffusions and completely monotone Lévy processes. We derive analytical results for the iVaR and disentangle the risk contribution of jumps from diffusion. Estimating the iVaR for several popular jump models using S&P 100 option data, we find that option-implied estimates are much more responsive to market changes than their historical counterparts. Moreover, disentangling jumps from diffusion, we find that jumps account for about 90 percent of the iVaR on average.
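The distinction between end-of-horizon VaR and iVaR can be illustrated by Monte Carlo under a plain Merton-style jump-diffusion (not the paper's displaced mixed-exponential model, and with made-up parameters): iVaR is read off the running minimum of each path, so it also counts loss levels breached before the horizon.

```python
import numpy as np

def intra_horizon_var(mu, sigma, lam, jump_mu, jump_sig, horizon=1.0,
                      steps=100, n_paths=20_000, alpha=0.01, seed=0):
    """Compare end-of-horizon VaR with intra-horizon VaR (iVaR) for
    log-returns under a Merton jump-diffusion, by Monte Carlo.
    iVaR is based on the running minimum of each path, so it also
    captures losses incurred before the end of the horizon."""
    rng = np.random.default_rng(seed)
    dt = horizon / steps
    z = rng.standard_normal((n_paths, steps))
    diffusion = (mu - 0.5 * sigma ** 2) * dt + sigma * np.sqrt(dt) * z
    # jump part: n_jumps Poisson arrivals per step, normal jump sizes
    n_jumps = rng.poisson(lam * dt, (n_paths, steps))
    jumps = (n_jumps * jump_mu
             + np.sqrt(n_jumps) * jump_sig
             * rng.standard_normal((n_paths, steps)))
    paths = np.cumsum(diffusion + jumps, axis=1)    # log-return paths
    var_end = -np.quantile(paths[:, -1], alpha)     # terminal loss quantile
    ivar = -np.quantile(paths.min(axis=1), alpha)   # worst-point quantile
    return var_end, ivar
```

Since the running minimum of a path never exceeds its terminal value, the iVaR estimate is at least as large as the end-of-horizon VaR by construction, mirroring the paper's point that intra-horizon risk exceeds what terminal-distribution measures report.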
Gabriele Visentin, Stefano Battiston, Marco D'Errico, Rethinking Financial Contagion, In: SSRN, No. 2831143, 2015. (Working Paper)
How, and to what extent, does an interconnected financial system endogenously amplify external shocks? This paper attempts to reconcile some apparently divergent views that emerged after the 2008 crisis regarding the nature and relevance of contagion in financial networks. We develop a common framework encompassing several network contagion models and show that, regardless of the shock distribution and the network topology, precise ordering relationships on the level of aggregate systemic losses hold among models.
We argue that the extent of contagion crucially depends on the amount of information that each model assumes to be available to agents. Under no uncertainty about the network structure and values of external assets, the well-known Eisenberg and Noe (2001) model applies, which delivers the lowest level of contagion. This is due to a property of loss conservation: aggregate losses after contagion are equal to the losses incurred by those institutions initially hit by a shock. This property implies that many contagion analyses rule out by construction any loss amplification, treating de facto an interconnected system as a single aggregate entity, where losses are simply mutualised. Under higher levels of uncertainty, as captured for instance by the DebtRank model, losses become non-conservative and get compounded through the network. This has important policy implications: by reducing the levels of uncertainty in times of distress (e.g. by obtaining specific data on the network), policymakers would be able to move towards more conservative scenarios. Empirically, we compare the magnitude of contagion across models on a sample of the largest European banks during the years 2006–2016. In particular, we analyse contagion effects as a function of the size of the shock and the type of external assets shocked.
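The loss-conservation property of the Eisenberg-Noe model can be checked directly on a toy three-bank chain (all numbers hypothetical): shocking one bank's external assets pushes defaults down the chain, yet aggregate equity falls by exactly the size of the shock.

```python
import numpy as np

def eisenberg_noe_clearing(liabilities, external, tol=1e-12):
    """Eisenberg-Noe (2001) clearing payment vector by fixed-point
    iteration. liabilities[i, j] is the nominal amount bank i owes
    bank j; iteration starts at full payment and decreases monotonically."""
    p_bar = liabilities.sum(axis=1)  # total nominal obligations
    pi = np.divide(liabilities, p_bar[:, None],
                   out=np.zeros_like(liabilities),
                   where=p_bar[:, None] > 0)
    p = p_bar.copy()
    while True:
        p_new = np.minimum(p_bar, external + pi.T @ p)  # limited liability
        if np.max(np.abs(p_new - p)) < tol:
            return p_new, pi
        p = p_new

def total_equity(liabilities, external):
    p, pi = eisenberg_noe_clearing(liabilities, external)
    return float(np.sum(external + pi.T @ p - p))

# Chain: bank 0 owes bank 1 ten, bank 1 owes bank 2 ten.
liab = np.array([[0.0, 10.0, 0.0],
                 [0.0, 0.0, 10.0],
                 [0.0, 0.0, 0.0]])
baseline = total_equity(liab, np.array([5.0, 2.0, 1.0]))  # 8.0
shocked = total_equity(liab, np.array([2.0, 2.0, 1.0]))   # 5.0
# Aggregate equity falls by exactly the 3.0 shock: defaults propagate
# down the chain, but total losses are never amplified.
```

This is precisely why, as the abstract notes, the full-information benchmark rules out amplification by construction; non-conservative models such as DebtRank break this accounting identity.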
Elena Carletti, Steven Ongena, Jan-Peter Siedlarek, Giancarlo Spagnolo, The impact of merger legislation on bank mergers, In: Swiss Finance Institute Research Paper, No. 16-33, 2016. (Working Paper)
We find that stricter merger control legislation increases abnormal announcement returns of targets in bank mergers by 7 percentage points. Analyzing potential explanations for this result, we document an increase in the pre-merger profitability of targets, a decrease in the size of acquirers and a decreasing share of transactions in which banks are acquired by other banks. Other merger properties, including the size and risk profile of targets, the geographic overlap of merging banks and the stock market response of rivals appear unaffected. The evidence suggests that the strengthening of merger control leads to more efficient and more competitive transactions.
Jakub Rojcek, Ramazan Gençay, Soheil Mahmoodzadeh, Michael C Tseng, Price Impact of Aggressive Liquidity Provision, In: Swiss Finance Institute Research Paper, No. 16-21, 2016. (Working Paper)
This paper analyzes brief episodes of high-intensity quote turnover and revision ("bursts" in quotes) in the U.S. equity market. Such events occur very frequently, around 400 times a day for actively traded stocks. We find significant price impact associated with these market-maker-initiated events, about five times higher than during non-burst periods. Bursts in quotes are concurrent with a short-lived structural break in the informational relationship between market makers and market takers. During bursts, market makers no longer passively impound information from order flow into quotes, a departure from the traditional market microstructure paradigm. Rather, market makers significantly impact prices during bursts in quotes. Further analysis shows that there is asymmetry in adverse selection between the bid and ask sides of the limit order book and that only a sub-population of market makers enjoys an informational advantage during bursts. Our results call attention to the need for a new microstructure perspective in understanding modern high-frequency limit order book markets.
Jakub Rojcek, Alexandre Ziegler, High-Frequency Trading in Limit Order Markets: Equilibrium Impact and Regulation, In: Swiss Finance Institute Research Paper, No. 15-23, 2016. (Working Paper)
We investigate the impact of high-frequency trading (HFT) on market quality and investor welfare using a general limit order book model. We find that while the presence of HFT always improves market quality under symmetric information, under asymmetric information this is the case only if competition between high-frequency traders is sufficiently strong. While HFT does not negatively impact investor welfare, it reduces the welfare of slow speculators. The flexibility of the model allows us to investigate the effect on market quality and investor welfare of the main recent regulatory initiatives designed to curb HFT. We consider time-in-force rules, cancellation fees, transaction taxes, rebate fee structures, and speed bumps. While some of these regulations lead to improvements in a number of market quality measures, this generally does not translate into higher welfare for long-term investors. Rather, the main effect of such regulations is to generate wealth transfers from high-frequency traders to slow speculators. These regulations therefore appear inadequate to enhance investor welfare in the presence of HFTs. Of the different measures, transaction taxes are the least harmful; while they reduce welfare roughly by the amount of the tax, they do not significantly worsen market quality. The common practice by exchanges of granting rebates to limit orders is detrimental to market quality and investor welfare, causing both higher effective spreads and longer execution times.
Paolo Barucca, Marco Bardoscia, Fabio Caccioli, Marco D'Errico, Gabriele Visentin, Stefano Battiston, Guido Caldarelli, Network Valuation in Financial Systems, In: SSRN, No. 2795583, 2016. (Working Paper)
We introduce a network valuation model (hereafter NEVA) for the ex-ante valuation of claims among financial institutions connected in a network of liabilities. Similar to previous work, the new framework allows the recovery rate on all claims to be determined endogenously upon the default of some institutions. In addition, it accounts for ex-ante uncertainty about asset values, in particular the uncertainty arising when the valuation is carried out some time before the maturity of the claims. The framework encompasses as special cases both the ex-post approaches of Eisenberg and Noe and its previous extensions, as well as the ex-ante approaches, in the sense that each of these models can be recovered exactly for special values of the parameters. We characterize the existence and uniqueness of the solutions of the valuation problem under general conditions on how the value of each claim depends on the equity of the counterparty. Further, we define an algorithm to carry out the network valuation and provide sufficient conditions for convergence to the maximal solution.
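The fixed-point structure of such a network valuation can be sketched as follows. The valuation function `linear_writedown` and all numbers are illustrative assumptions, not the NEVA specification: the iteration starts from face values (the maximal solution) and revalues each claim as a function of the counterparty's equity until equities stop changing.

```python
import numpy as np

def network_valuation(external, claims, nominal_liabilities, v, max_iter=500):
    """Fixed-point valuation of interbank claims (NEVA-style sketch).

    external:            banks' external asset values
    claims:              matrix A, A[i, j] = face value of i's claim on j
    nominal_liabilities: each bank's total nominal obligations
    v:                   valuation function mapping counterparty equity
                         to a discount factor in [0, 1]
    """
    # maximal solution: all interbank claims valued at face
    E = external + claims.sum(axis=1) - nominal_liabilities
    for _ in range(max_iter):
        E_new = external + claims @ v(E) - nominal_liabilities
        if np.allclose(E_new, E, atol=1e-12):
            break
        E = E_new
    return E

def linear_writedown(E, scale=10.0):
    # claims on solvent banks pay in full; claims on defaulted banks
    # are written down linearly in the equity shortfall (illustrative)
    return np.where(E >= 0.0, 1.0, np.clip(1.0 + E / scale, 0.0, 1.0))

# Bank 0 holds a claim of 5 on bank 1; bank 1 is insolvent (equity -3),
# so bank 0 values the claim at 70 cents on the euro.
claims = np.array([[0.0, 5.0], [0.0, 0.0]])
E = network_valuation(np.array([1.0, 2.0]), claims,
                      np.array([0.0, 5.0]), linear_writedown)
```

With `v` identically one, face values are recovered; other choices of `v` would move the sketch towards ex-post clearing or ex-ante expected-value treatments, which is the sense in which a single fixed-point map can nest several models.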