Steve Cicala, David Hémous, Morten G Olsen, Adverse selection as a policy instrument: unraveling climate change, In: NBER Working Paper Series, No. 30283, 2023. (Working Paper)
This paper applies principles of adverse selection to overcome obstacles that prevent the implementation of Pigouvian policies to internalize externalities. Focusing on negative externalities from production (such as pollution), we consider settings in which aggregate emissions are known, but individual contributions are unobserved by the government. We evaluate a policy that gives firms the option to pay a tax on their voluntarily and verifiably disclosed emissions, or pay an output tax based on the average rate of emissions among the undisclosed firms. The certification of relatively clean firms raises the output-based tax, setting off a process of unraveling in favor of disclosure. We derive sufficient statistics formulas to calculate the welfare of such a program relative to mandatory output or emissions taxes. We find that the voluntary certification mechanism would deliver significant gains over output-based taxation in two empirical applications: methane emissions from oil and gas fields, and carbon emissions from imported steel.

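The unraveling dynamic described in the abstract can be illustrated with a stylized simulation. This is my own sketch, not the authors' model: the emission rates and the uniform tax below are invented for illustration.

```python
# Stylized simulation of the unraveling mechanism: each firm either
# discloses its emissions rate and pays tax on that rate, or stays
# undisclosed and pays an output tax based on the average rate among
# the undisclosed firms. Clean firms certify first, which raises the
# pool average and induces further disclosure. All numbers illustrative.

def unravel(rates, tax=1.0):
    """Iterate disclosure decisions until no undisclosed firm prefers to certify.

    rates: emissions per unit of output for each firm.
    Returns the set of indices of firms that end up disclosing.
    """
    disclosed = set()
    while True:
        pool = [r for i, r in enumerate(rates) if i not in disclosed]
        if not pool:
            break
        avg = sum(pool) / len(pool)
        # A firm discloses if paying tax on its own rate is cheaper than
        # paying the output tax based on the undisclosed pool's average.
        switchers = {i for i, r in enumerate(rates)
                     if i not in disclosed and r < avg}
        if not switchers:
            break
        disclosed |= switchers
    return disclosed

# With heterogeneous rates, everyone strictly below the current pool
# average peels off each round until only the dirtiest type remains.
disclosed_firms = unravel([1.0, 2.0, 3.0, 4.0])
```

In this toy example all firms except the dirtiest end up certifying, mirroring the unraveling-in-favor-of-disclosure logic the abstract describes.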
Jacob Goeree, Alexey Kushnir, A geometric approach to mechanism design, In: Working paper series / Department of Economics, No. 56, 2013. (Working Paper)
We develop a novel geometric approach to mechanism design using an important result in convex analysis: the duality between a closed convex set and its support function. By deriving the support function for the set of feasible interim values we extend the well-known Maskin-Riley-Matthews-Border conditions for reduced-form auctions to social choice environments. We next refine the support function to include incentive constraints using a geometric characterization of incentive compatibility. Borrowing results from majorization theory that date back to the work of Hardy, Littlewood, and Pólya (1929), we elucidate the "ironing" procedure introduced by Myerson (1981) and Mussa and Rosen (1978). The inclusion of Bayesian and dominant strategy incentive constraints results in the same support function, which establishes equivalence between these implementation concepts. Using Hotelling's lemma we next derive the optimal mechanism for any social choice problem and any linear objective, including revenue and surplus maximization. We extend the approach to general concave objectives by providing a fixed-point condition characterizing the optimal mechanism. We generalize reduced-form implementation to environments with multi-dimensional, correlated types, non-linear utilities, and interdependent values. When value interdependencies are linear we are able to include incentive constraints in the support function and provide a condition under which the second-best allocation is ex post incentive compatible.

Philippe K Widmer, Peter Zweifel, Mehdi Farsi, Accounting for heterogeneity in the measurement of hospital performance, In: Working paper series / Department of Economics, No. 52, 2011. (Working Paper)
With prospective payment of hospitals becoming more common, measuring their performance is gaining in importance. However, the standard cost frontier model yields biased efficiency scores because it ignores technological heterogeneity between hospitals. In this paper, efficiency scores are derived from a random intercept and an extended random parameter frontier model, designed to overcome the problem of unobserved heterogeneity in stochastic frontier analysis. Using a sample of 100 Swiss hospitals covering the years 2004 to 2007 and applying Bayesian inference, significant heterogeneity is found, suggesting rejection of the standard cost frontier model. Estimated inefficiency falls below the 14 percent reported by Hollingsworth (2008) for European countries. Accounting for unobserved heterogeneity raises the efficiency scores of hospitals rated below 85 percent under the standard model by up to 12 percentage points, highlighting the importance of heterogeneity correction in the estimation of hospital performance.

Christoph Winter, Accounting for the Changing Role of Family Income in Determining College Entry, In: Working paper series / Institute for Empirical Research in Economics, No. 402, 2011. (Working Paper)
In recent decades, the US has experienced a widening of the college enrolment gap between rich and poor families. This is commonly interpreted as evidence for a tightening of borrowing constraints. This paper asks whether this is indeed the case. I present an incomplete-markets overlapping-generations model with college enrolment, in which altruistic parents provide transfers to their children. In the model, the rise in earnings inequality observed between 1980 and 2000 acts as the driving force generating the trends in the data. With the help of counterfactual experiments, I find that the fraction of constrained households is much higher (24 instead of 8 percent) than indicated by the narrow enrolment gap in 1980. Contrary to what the development of the enrolment gap in the data suggests, the share of constrained households actually fell (to 18 percent) between 1980 and 2000. I show that altruism is important for explaining these findings.

Philippe K Widmer, Does prospective payment increase hospital (in)efficiency? Evidence from the Swiss hospital sector, In: Working paper series / Department of Economics, No. 53, 2011. (Working Paper)
Several European countries have followed the United States in introducing prospective payment for hospitals with the expectation of achieving cost efficiency gains. This article examines whether theoretical expectations of cost efficiency gains can be empirically confirmed. In contrast to previous studies, the analysis of Switzerland provides a comparison of a retrospective per diem payment system with a prospective global budget and a payment per patient case system. Using a sample of approximately 90 publicly financed Swiss hospitals during the years 2004 to 2009 and Bayesian inference of a standard and a random parameter frontier model, cost efficiency gains are found, particularly with a payment per patient case system. Payment systems designed to put hospitals at operating risk are more effective than retrospective payment systems. However, hospitals are heterogeneous with respect to their production technologies, making a random parameter frontier model the superior specification for Switzerland.

Boris Krey, Philippe K Widmer, Peter Zweifel, Efficient Provision of Electricity for the United States and Switzerland, In: Working paper series / Socioeconomic Institute, No. 0812, 2011. (Working Paper)
This study applies financial portfolio theory to determine efficient frontiers in the provision of electricity for the United States and Switzerland. Expected returns are defined by the rate of productivity increase of power generation (adjusted for external costs); volatility, by its standard deviation. Since unobserved productivity shocks are found to be correlated, Seemingly Unrelated Regression Estimation (SURE) is used to filter out the systematic component of the covariance matrix of the productivity changes. Results suggest that as of 2003, the feasible maximum expected return (MER) electricity portfolio for the United States contains more Coal, Nuclear, and Wind than actual but markedly less Gas and Oil. The minimum variance (MV) portfolio contains markedly more Oil, again more Coal, Nuclear, and Wind, but almost no Gas. Regardless of the choice between MER and MV, U.S. utilities are found to lie substantially inside the efficient frontier. This is even more true of their Swiss counterparts, likely due to continuing regulation of electricity markets.

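The minimum-variance (MV) portfolio mentioned above has a simple closed form in the two-asset case. The sketch below uses invented variance and covariance numbers, not the paper's SURE-filtered estimates of productivity changes.

```python
# Minimum-variance portfolio weights for two assets:
# w1 = (var2 - cov) / (var1 + var2 - 2*cov), with w1 + w2 = 1.
# This is standard portfolio theory; the numbers are illustrative only.

def min_variance_weights(var1, var2, cov):
    """Weights (w1, w2) minimizing portfolio variance, summing to one."""
    denom = var1 + var2 - 2.0 * cov
    w1 = (var2 - cov) / denom
    return w1, 1.0 - w1

def portfolio_variance(w1, w2, var1, var2, cov):
    return w1 * w1 * var1 + w2 * w2 * var2 + 2.0 * w1 * w2 * cov

# Example: a volatile technology (variance 0.09) vs a stable one (0.01),
# mildly positively correlated. Most weight goes to the stable asset.
w1, w2 = min_variance_weights(0.09, 0.01, 0.006)
```

The MV portfolio's variance is below that of either single technology held alone, which is the diversification logic behind the efficient frontiers the paper computes.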
Mathias Hoffmann, Ulrich Woitek, Emerging from the war: gold standard mentality, current accounts and the international business cycle 1885-1939, In: Working paper series / Department of Economics, No. 57, 2011. (Working Paper)
We study international business cycles and capital flows in the UK, the United States and the Emerging Periphery in the period 1885-1939. Based on the same set of parameters, our model explains current account dynamics under both the Classical Gold Standard and during the Interwar period. We interpret this as evidence for Gold Standard mentality: the expectation formation mechanism with respect to major macroeconomic variables driving the current account – output, exchange rates and interest rates – has remained fundamentally stable between the two periods. Nonetheless, the macroeconomic environment changed: Volatility increased generally, but less so for international capital flows than for GDP. This pattern is consistent with shocks in the Interwar period becoming more persistent and more global.

Olivier Ledoit, Michael Wolf, Nonlinear Shrinkage Estimation of Large-Dimensional Covariance Matrices, In: Working paper series / Institute for Empirical Research in Economics, No. 515, 2011. (Working Paper)
Many statistical applications require an estimate of a covariance matrix and/or its inverse. When the matrix dimension is large compared to the sample size, which happens frequently, the sample covariance matrix is known to perform poorly and may suffer from ill-conditioning. There already exists an extensive literature concerning improved estimators in such situations. In the absence of further knowledge about the structure of the true covariance matrix, the most successful approach so far, arguably, has been shrinkage estimation. Shrinking the sample covariance matrix to a multiple of the identity, by taking a weighted average of the two, turns out to be equivalent to linearly shrinking the sample eigenvalues to their grand mean, while retaining the sample eigenvectors. Our paper extends this approach by considering nonlinear transformations of the sample eigenvalues. We show how to construct an estimator that is asymptotically equivalent to an oracle estimator suggested in previous work. As demonstrated in extensive Monte Carlo simulations, the resulting bona fide estimator can result in sizeable improvements over the sample covariance matrix and also over linear shrinkage.

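The linear shrinkage benchmark that the paper generalizes can be written down directly: a weighted average of the sample covariance matrix S and mu*I, where mu = trace(S)/p is the grand mean of the sample eigenvalues. The sketch below uses a fixed, illustrative shrinkage weight delta; choosing the optimal data-driven intensity (and the paper's nonlinear transformation of eigenvalues) is beyond this sketch.

```python
# Linear shrinkage toward a multiple of the identity:
# Sigma_hat = (1 - delta) * S + delta * mu * I, mu = trace(S) / p.
# Equivalently, each sample eigenvalue is pulled linearly toward the
# grand mean mu while the sample eigenvectors are kept unchanged.

def linear_shrinkage(S, delta):
    p = len(S)
    mu = sum(S[i][i] for i in range(p)) / p  # grand mean of eigenvalues
    return [[(1.0 - delta) * S[i][j] + (delta * mu if i == j else 0.0)
             for j in range(p)]
            for i in range(p)]

# Illustrative 3x3 sample covariance matrix (invented numbers).
S = [[4.0, 1.2, 0.5],
     [1.2, 2.0, 0.3],
     [0.5, 0.3, 1.0]]
S_shrunk = linear_shrinkage(S, 0.5)
# The trace (sum of eigenvalues) is preserved, diagonal entries move
# toward the grand mean 7/3, and off-diagonals shrink toward zero,
# which improves conditioning.
```

Note that the transformation never changes the total variance (the trace), only how it is distributed across eigenvalues.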
Alexander Rathke, Tobias Straumann, Ulrich Woitek, Overvalued: Swedish monetary policy in the 1930s, In: Working paper series / Department of Economics, No. 58, 2011. (Working Paper)
This paper reconsiders the role of monetary policy in Sweden’s strong recovery from the Great Depression. The Riksbank in the 1930s is sometimes seen as an example of a central bank that was relatively innovative in terms of the conduct of monetary policy. To consider this analytically, we estimate a small-scale, structural general equilibrium model of a small open economy using Bayesian methods. We find that the model captures the key dynamics of the period surprisingly well. Importantly, our findings suggest that Sweden avoided the worst excesses of the depression by conducting conservative rather than innovative monetary policy. We find that, by keeping the Swedish krona undervalued to replenish foreign reserves, Sweden’s exchange rate policy unintentionally contributed to the Swedish growth miracle of the 1930s, avoiding a major slump in 1932 and enabling the country to benefit quickly from the eventual recovery of world demand.

Dominic Rohner, Mathias Thoenig, Fabrizio Zilibotti, Seeds of distrust: conflict in Uganda, In: Working paper series / Department of Economics, No. 54, 2011. (Working Paper)
We study the effect of civil conflict on social capital, focusing on the experience of Uganda during the last decade. Using individual and county-level data, we document causal effects on trust and ethnic identity of an exogenous outburst of ethnic conflicts in 2002-04. We exploit two waves of survey data from Afrobarometer 2000 and 2008, including information on socioeconomic characteristics at the individual level, and geo-referenced measures of fighting events from ACLED. Our identification strategy exploits variations in the intensity of fighting both in the spatial and cross-ethnic dimensions. We find that more intense fighting decreases generalized trust and increases ethnic identity. The effects are quantitatively large and robust to a number of control variables, alternative measures of violence, and different statistical techniques involving ethnic and county fixed effects and instrumental variables. We also document that the post-war effects of ethnic violence depend on the degree of ethnic fractionalization. Fighting has a negative effect on the economic situation in highly fractionalized counties, but has no effect in less fractionalized counties. Our findings are consistent with the existence of a self-reinforcing process between conflicts and ethnic cleavages.

Maria Saez Marti, Siesta: a theory of freelancing, In: Working paper series / Department of Economics, No. 55, 2011. (Working Paper)
I study the effect of fatigue and innate ability on performance in a model with incomplete contracts, lumpy tasks requiring multiple periods of work, and stochastic productivity shocks. I find that increasing ability or reducing fatigue does not necessarily lead to more productive efficiency, since it may exacerbate the incentive for agents to take "too much" on-the-job leisure. In a world with heterogeneous agents, the problem may be ameliorated by the introduction of a dual labour market with freelancers (who can take breaks at their discretion) and regular workers (who work on a fixed schedule).

Sigrid Röhrs, Christoph Winter, Wealth inequality and the optimal level of government debt, In: Working paper series / Department of Economics, No. 51, 2011. (Working Paper)
In this paper, we quantitatively analyze to what extent a benevolent government should issue debt in a model where households are subject to idiosyncratic productivity shocks, insurance markets are missing and borrowing is restricted. In this environment, issuing government bonds facilitates saving for self-insurance. Despite this, we find that in a calibrated version of the model that is consistent with the skewed wealth and earnings distribution observed in the U.S., the government should buy private bonds, and not issue public debt in the long run. The reason is that in the U.S., a large fraction of the population has almost no wealth or is even in debt. The wealth-poor, however, do not profit from an increase in the interest rate following an increase in public debt. Instead, they gain from higher wages that result from a reduction in debt. We show that even when the short run costs of higher capital taxation are taken into account, it still pays off to reduce government debt overall. Moreover, we find that endogenizing households’ borrowing constraints by assuming limited commitment leads to even higher asset levels being optimal in the long run.

Claudia M Buch, Cathérine Koch, Michael Koetter, Margins of international banking: Is there a productivity pecking order in banking, too?, In: Series 2: Banking and Financial Studies, No. 12/2009, 2009. (Working Paper)
Modern trade theory emphasizes firm-level productivity differentials to explain the cross-border activities of non-financial firms. This study tests whether a productivity pecking order also determines international banking activities. Using a novel dataset that contains all German banks’ international activities, we estimate the ordered probability of a presence abroad (extensive margin) and the volume of international assets (intensive margin). Methodologically, we enrich the conventional Heckman selection model to account for the self-selection of banks into different modes of foreign activities using an ordered probit. Four main findings emerge. First, similar to results for non-financial firms, a productivity pecking order drives bank internationalization. Second, only a few non-financial firms engage in international trade, but many banks hold international assets, and only a few large banks engage in foreign direct investment. Third, in addition to productivity, risk factors matter for international banking. Fourth, gravity-type variables have an important impact on international banking activities.

Elisabetta Fiorentino, Cathérine Koch, Winfried Rudek, Microdatabase: External position reports of German banks, In: Technical documentation, No. -, 2010. (Working Paper)
Gregori Baetschmann, Rainer Winkelmann, Modelling zero-inflated count data when exposure varies: with an application to sick leave, In: Working paper series / Department of Economics, No. 61, 2012. (Working Paper)
This paper is concerned with the analysis of zero-inflated count data when time of exposure varies. It proposes a new zero-inflated count data model that is based on two homogeneous Poisson processes and accounts for exposure time in a theory-consistent way. The new model is used in an application to the effect of insurance generosity on the number of absent days.

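For intuition, a generic zero-inflated Poisson probability with an exposure offset can be written down directly. This is the textbook formulation, not necessarily the authors' two-process model: with probability pi the observation is a structural zero; otherwise the count is Poisson with mean lam * t, where t is exposure time.

```python
import math

# Zero-inflated Poisson pmf with exposure offset t (illustrative sketch):
# P(Y = 0) = pi + (1 - pi) * exp(-lam * t)
# P(Y = k) = (1 - pi) * exp(-lam * t) * (lam * t)^k / k!   for k >= 1

def zip_pmf(k, lam, t, pi):
    base = math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
    return pi * (k == 0) + (1.0 - pi) * base

# Doubling exposure lowers the probability of observing zero events,
# but never below the structural-zero share pi. Parameters are invented.
p_short = zip_pmf(0, lam=0.5, t=1.0, pi=0.2)
p_long = zip_pmf(0, lam=0.5, t=2.0, pi=0.2)
```

The key point the abstract makes is visible here: only the Poisson component responds to exposure time, so ignoring exposure conflates structural zeros with short observation windows.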
Carlos Alos-Ferrer, Nick Netzer, Robust stochastic stability, In: Working paper series / Department of Economics, No. 63, 2014. (Working Paper)
A strategy profile of a game is called robustly stochastically stable if it is stochastically stable for a given behavioral model independently of the specification of revision opportunities and tie-breaking assumptions in the dynamics. We provide a simple radius-coradius result for robust stochastic stability and examine several applications. For the logit-response dynamics, the selection of potential maximizers is robust for the subclass of supermodular symmetric binary-action games. For the mistakes model, the weaker property of strategic complementarity suffices for robustness in this class of games. We also investigate the robustness of the selection of risk-dominant strategies in coordination games under best-reply and the selection of Walrasian strategies in aggregative games under imitation.

Dan Wunderli, Controlling the danger of false discoveries in estimating multiple treatment effects, In: Working paper series / Department of Economics, No. 60, 2012. (Working Paper)
I expose the risk of false discoveries in the context of multiple treatment effects. A false discovery is a nonexistent effect that is falsely labeled as statistically significant by its individual t-value. Labeling nonexistent effects as statistically significant has wide-ranging academic and policy-related implications, such as costly false conclusions from policy evaluations. I examine an empirical labor market model using state-of-the-art multiple testing methods and provide simulation evidence. When individual t-values are merely used at conventional significance levels, the risk of labeling probably nonexistent treatment effects as statistically significant is unacceptably high. Individual t-values even label a number of treatment effects as significant where multiple testing indicates false discoveries. Tests of a joint null hypothesis such as the well-known F-test control the risk of false discoveries only to a limited extent and do not optimally allow for rejecting individual hypotheses. Multiple testing methods control the risk of false discoveries in general while allowing for individual decisions in the sense of rejecting individual hypotheses.

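One standard correction of the kind the abstract refers to is Holm's step-down procedure, which controls the familywise error rate while still permitting rejections of individual hypotheses, unlike a single joint F-test. The p-values below are invented for illustration; the paper's specific methods may differ.

```python
# Holm's step-down multiple testing procedure (illustrative sketch):
# sort p-values, compare the k-th smallest to alpha / (m - k), and stop
# at the first failure. Controls the familywise error rate at alpha.

def holm_reject(pvalues, alpha=0.05):
    """Return a list of booleans: which hypotheses are rejected."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if pvalues[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break  # step down stops at the first non-rejection
    return reject

# Ten hypothetical treatment effects: naive individual t-tests at the
# 5% level flag four as significant; Holm's correction flags only one,
# guarding against false discoveries.
pvals = [0.001, 0.011, 0.02, 0.04, 0.2, 0.3, 0.5, 0.6, 0.7, 0.9]
naive = [p <= 0.05 for p in pvals]
holm = holm_reject(pvals)
```

The gap between the naive and corrected rejection counts is exactly the phenomenon the paper documents: individual t-values overstate the number of real effects.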
Giovanna D'Adda, Leadership and influence: Evidence from an artefactual field experiment on local public good provision, In: Working paper series / Department of Economics, No. 59, 2012. (Working Paper)
This paper studies the effect of leadership on the level and evolution of pro-social behavior using an artefactual field experiment on local public good provision. Participants decide how much to contribute to an actual conservation project. They can then revise their donations after being randomly matched in pairs on the basis of their authority and having observed each other’s contributions. Authority is measured through a social ranking exercise identifying formal and moral leaders within the community. I find that giving by a pair is higher and shows a lower tendency to decrease over time when a leader is part of a pair. This is because higher-ranked pair members in general, and leaders in particular, donate more and are less likely to revise contributions downwards after giving more than their counterparts. Leadership effects are stronger when moral authority is made salient within the experiment, in line with the ethical nature of the decision under study. These findings highlight the importance of identifying different forms of leadership and targeting the relevant leaders in projects aimed at local public good provision.

Gregori Baetschmann, Kevin E Staub, Rainer Winkelmann, Reconsidering the analysis of longitudinal happiness data - with an application to the effect of unemployment, In: Working paper series / Department of Economics, No. 4, 2011. (Working Paper)
The paper reconsiders existing estimators for the panel data fixed effects ordered logit model, including one that has not been used in econometric studies before, and studies the small sample properties of these estimators in a series of Monte Carlo simulations. There are two main findings. First, we show that some of the estimators used in the literature are inconsistent. Second, the new estimator seems to be more immune to small sample bias than other consistent estimators and is easy to implement. The empirical relevance is illustrated in an application to the effect of unemployment on happiness. Choosing the right estimator avoids a bias of up to 30 percent in key parameters.

Michael Wolf, Dan Wunderli, Bootstrap joint prediction regions, In: Working paper series / Department of Economics, No. 64, 2013. (Working Paper)
Many statistical applications require the forecast of a random variable of interest over several periods into the future. The sequence of individual forecasts, one period at a time, is called a path forecast, where the term path refers to the sequence of individual future realizations of the random variable. The problem of constructing a corresponding joint prediction region has been rather neglected in the literature so far: such a region is supposed to contain the entire future path with a prespecified probability. We develop bootstrap methods to construct joint prediction regions. The resulting regions are proven to be asymptotically consistent under a mild high-level assumption. We compare the finite-sample performance of our joint prediction regions to some previous proposals via Monte Carlo simulations. An empirical application to a real data set is also provided.

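To see what "joint" means here, the sketch below builds a sup-t style band from simulated paths: per-horizon intervals are widened by a common factor until the desired share of entire paths lies inside the band at all horizons simultaneously. This is a simplified illustration under invented data, not the authors' exact construction.

```python
import random
import statistics

# Sup-t joint prediction band from a collection of simulated paths
# (illustrative sketch): the band covers a whole path only if the path
# stays inside the interval at EVERY horizon, which is a stronger
# requirement than pointwise coverage at each horizon separately.

def joint_band(paths, alpha=0.1):
    H = len(paths[0])
    centers = [statistics.mean(p[h] for p in paths) for h in range(H)]
    scales = [statistics.stdev([p[h] for p in paths]) for h in range(H)]
    # Maximum standardized deviation of each path from the center.
    sup_t = sorted(max(abs(p[h] - centers[h]) / scales[h] for h in range(H))
                   for p in paths)
    q = sup_t[int((1 - alpha) * len(sup_t)) - 1]
    return [(centers[h] - q * scales[h], centers[h] + q * scales[h])
            for h in range(H)]

# Simulate 500 random-walk paths of length 4 and build a 90% joint band.
random.seed(0)
paths = []
for _ in range(500):
    x, path = 0.0, []
    for _ in range(4):
        x += random.gauss(0, 1)
        path.append(x)
    paths.append(path)
band = joint_band(paths, alpha=0.1)
```

Because the random walk's variance grows with the horizon, the band fans out over time, and by construction about 90 percent of the simulated paths lie entirely inside it.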