Jacob Goeree, Charles A Holt, Karen Palmer, William Shobe, Dallas Burtraw, An Experimental Study of Auctions versus Grandfathering to Assign Pollution Permits, In: Working paper series / Institute for Empirical Research in Economics, No. 429, 2009. (Working Paper)
 
We experimentally study auctions versus grandfathering in the initial assignment of pollution permits that can be traded in a secondary spot market. Low and high emitters compete for permits in the auction, while permits are assigned for free under grandfathering. In theory, trading in the spot market should erase inefficiencies due to initial mis-allocations. In the experiment, high emitters exercise market power in the spot market and permit holdings under grandfathering remain skewed towards high emitters. Furthermore, the opportunity costs of “free” permits are fully “passed through.” In the auction, the majority of permits are won by low emitters, reducing the need for spot-market trading. Auctions generate higher consumer surplus and slightly lower product prices in the laboratory markets. Moreover, auctions eliminate the large “windfall profits” that are observed in the treatment with free, grandfathered permit allocations.

Jacob Goeree, Leeat Yariv, An Experimental Study of Jury Deliberation, In: Working paper series / Institute for Empirical Research in Economics, No. 438, 2009. (Working Paper)
 
We study the effects of deliberation on collective decisions. In a series of experiments, we vary groups' preference distributions (between common and conflicting interests) and the institutions by which decisions are reached (simple majority, two-thirds majority, and unanimity). When deliberation is prohibited, different institutions generate significantly different outcomes, tracking the theoretical comparative statics. Deliberation, however, significantly diminishes institutional differences and uniformly improves efficiency. Furthermore, communication protocols exhibit an array of stable attributes: messages are public, consistently reveal private information, provide a good predictor for ultimate group choices, and follow particular (endogenous) sequencing.

Christoph Brunner, Jacob Goeree, Charles A Holt, John O Ledyard, An Experimental Test of Flexible Combinatorial Spectrum Auction Formats, In: Working paper series / Institute for Empirical Research in Economics, No. 431, 2009. (Working Paper)
 
"This paper reports laboratory experiments that evaluate the performance of a flexible packagebidding format developed by the FCC, in comparison with other combinatorial formats. In general, the interest of policy makers in combinatorial auctions is justified by the laboratory data; when value complementarities are present, package bidding yields improved performance. We find clear differences among the combinatorial auction formats, however, both in terms of efficiency and seller revenue. Notably, the combinatorial clock provides the highest revenue. The FCC’s flexible package bidding format performed worse than the alternatives, which is one of the main reasons why it was not implemented." |
|
Patrick Eugster, Michèle Sennhauser, Peter Zweifel, Capping Risk Adjustment?, In: Working paper series / Socioeconomic Institute, No. 915, 2009. (Working Paper)
 
When premiums are community-rated, risk adjustment (RA) serves to mitigate competitive insurers’ incentive to select favorable risks. However, unless fully prospective, it also undermines their incentives for efficiency. By capping its volume, one may try to counteract this tendency, exposing insurers to some financial risk. This in turn runs counter to the quest to refine the RA formula, which would increase RA volume. Specifically, the adjuster “Hospitalization or living in a nursing home during the previous year” will be added in Switzerland starting in 2012. This paper investigates how to minimize the opportunity cost of capping RA in terms of increased incentives for risk selection.

Michelle S. Sovinsky, John C Ham, Daniela Iorio, Caught in the bulimic trap? Persistence and state dependence of bulimia among young women, In: Working paper series / Institute for Empirical Research in Economics, No. 447, 2012. (Working Paper)
 
Eating disorders are an important and growing health concern, and bulimia nervosa (BN) accounts for the largest fraction of eating disorders. Health consequences of BN are substantial and especially serious given the increasingly compulsive nature of the disorder. However, remarkably little is known about the mechanisms underlying the persistent nature of BN. Using a unique panel data set on young women and instrumental variable techniques, we document that unobserved heterogeneity plays a role in the persistence of BN, but strikingly up to two thirds is due to true state dependence. Our results, together with support from the medical literature, provide evidence that bulimia should be considered an addiction. Our findings have important implications for public policy since they suggest that the timing of the policy is crucial: preventive educational programs should be coupled with more intense (rehabilitation) treatment at the early stages of bingeing and purging behaviors. Our results are robust to different model specifications and identifying assumptions.

Dallas Burtraw, Jacob Goeree, Charles A Holt, Erica Myers, Karen Palmer, William Shobe, Collusion in Auctions for Emissions Permits: An Experimental Study, In: Working paper series / Institute for Empirical Research in Economics, No. 434, 2009. (Working Paper)
 
Environmental markets have several institutional features that provide a new context for the use of auctions and that have not been studied previously. This paper reports on laboratory experiments testing three auction forms: uniform and discriminatory price sealed-bid auctions and an ascending clock auction. We test the ability of subjects to tacitly or explicitly collude in order to maximize profits. Our main result is that the discriminatory and uniform price auctions produce greater revenues than the clock auction, both with and without explicit communication. The clock appears to facilitate successful collusion, both because of its sequential structure and because it allows bidders to focus on one dimension of cooperation (quantity) rather than two (price and quantity).

Joseph P Romano, Azeem M Shaikh, Michael Wolf, Consonance and the Closure Method in Multiple Testing, In: Working paper series / Institute for Empirical Research in Economics, No. 446, 2009. (Working Paper)
 
Consider the problem of testing s hypotheses simultaneously. In order to deal with the multiplicity problem, the classical approach is to restrict attention to procedures that control the familywise error rate (FWE). Typically, it is known how to construct tests of the individual hypotheses, and the problem is how to combine them into a multiple testing procedure that controls the FWE. The closure method of Marcus et al. (1976), in fact, reduces the problem of constructing multiple test procedures which control the FWE to the construction of single tests which control the usual probability of a Type I error. The purpose of this paper is to examine the closure method with emphasis on the concepts of coherence and consonance. It was shown by Sonnemann and Finner (1988) that any incoherent procedure can be replaced by a coherent one which is at least as good. The main point of this paper is to show a similar result for dissonant and consonant procedures. We illustrate the idea of how a dissonant procedure can be strictly improved by a consonant procedure in the sense of increasing the probability of detecting a false null hypothesis while maintaining control of the FWE. We then show how consonance can be used in the construction of some optimal maximin procedures.

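For context, the closure method underlies familiar stepwise tests: Holm's step-down procedure, for example, can be derived as the consonant closure of Bonferroni tests. The sketch below is standard textbook material, not the paper's construction, and is included only to illustrate FWE control in code:

```python
def holm(pvals, alpha=0.05):
    """Holm step-down procedure (the consonant closure of Bonferroni tests).

    Given the p-values of s individual tests, returns the indices of the
    rejected hypotheses while controlling the familywise error rate at alpha.
    """
    s = len(pvals)
    order = sorted(range(s), key=lambda i: pvals[i])  # smallest p-value first
    rejected = []
    for step, i in enumerate(order):
        # Compare the (step+1)-th smallest p-value to alpha / (s - step);
        # stop at the first failure (step-down structure).
        if pvals[i] <= alpha / (s - step):
            rejected.append(i)
        else:
            break
    return sorted(rejected)

# With p-values (0.001, 0.02, 0.04, 0.30), only the first hypothesis is
# rejected at alpha = 0.05: 0.001 <= 0.05/4, but 0.02 > 0.05/3.
decisions = holm([0.001, 0.02, 0.04, 0.30])
```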
Christian Ewerhart, Cournot Oligopoly and Concavo-Concave Demand, In: Working paper series / Institute for Empirical Research in Economics, No. 427, 2009. (Working Paper)
 
The N-firm Cournot model with general technologies is reviewed to derive generalized and unified conditions for existence of a pure strategy Nash equilibrium. Tight conditions are formulated alternatively (i) in terms of concavity of two-sided transforms of inverse demand, or (ii) as linear constraints on the elasticities of inverse demand and its first derivative. These conditions hold, in particular, if a firm’s marginal revenue decreases in other firms’ aggregate output, or if inverse demand is logconcave. The analysis relies on lattice-theoretic methods, engaging both cardinal and ordinal notions of supermodularity. As a byproduct, a powerful test for strict quasiconcavity is obtained.

Jacob Goeree, Theo Offermann, Randolph Sloof, Demand Reduction and Preemptive Bidding in Multi-Unit License Auctions, In: Working paper series / Institute for Empirical Research in Economics, No. 430, 2009. (Working Paper)
 
Multi-unit ascending auctions allow for equilibria in which bidders strategically reduce their demand and split the market at low prices. At the same time, they allow for preemptive bidding by incumbent bidders in a coordinated attempt to exclude entrants from the market. We consider an environment where both demand reduction and preemptive bidding are supported as equilibrium phenomena of the ascending auction. In a series of experiments, we compare its performance to that of the discriminatory auction. Strategic demand reduction is quite prevalent in the ascending auction even when entry imposes a (large) negative externality on incumbents. As a result, the ascending auction performs worse than the discriminatory auction both in terms of revenue and efficiency, while entrants’ chances are similar across the two formats.

Andreas Kuhn, Demand for Redistribution, Support for the Welfare State, and Party Identification in Austria, In: Working paper series / Institute for Empirical Research in Economics, No. 440, 2009. (Working Paper)
 
This paper describes subjective wage inequality and the demand for redistribution in Austria using individuals' estimates of occupational wages from the International Social Survey Program. Although these estimates differ widely across individuals, the data clearly show that most individuals would like to decrease wage inequality, relative to the level of inequality which they perceive to exist. The empirical analysis also shows that the demand for redistribution is strongly associated not only with variables describing self-interested motives for redistribution, but also with perceptions of and social norms with respect to inequality. Further, the demand for redistribution is a strong predictor for whether an individual is supportive of redistribution by the state. On the other hand, however, I find almost no evidence for an empirical association between the demand for redistribution and individuals' party identification.

Michelle S. Sovinsky, Eric Helland, Do research joint ventures serve a collusive function?, In: Working paper series / Institute for Empirical Research in Economics, No. 448, 2012. (Working Paper)
 
Every year thousands of firms are engaged in research joint ventures (RJV), where all knowledge gained through R&D is shared among members. Most of the empirical literature assumes members are non-cooperative in the product market. But many RJV members are rivals, leaving open the possibility that firms may form RJVs to facilitate collusion. We examine this by exploiting variation in RJV formation generated by a policy change that affects the collusive benefits but not the research synergies associated with an RJV. We use data on RJVs formed between 1986 and 2001 together with firm-level information from Compustat to estimate an RJV participation equation. After correcting for the endogeneity of R&D and controlling for RJV characteristics and firm attributes, we find the decision to join is impacted by the policy change. We also find the magnitude is significant: the policy change resulted in an average drop in the probability of joining an RJV of 34% among telecommunications firms, 33% among computer and semiconductor manufacturers, and 27% among petroleum refining firms. Our results are consistent with research joint ventures serving a collusive function.

Kenneth C Wilbur, Michelle S. Sovinsky, Geert Ridder, Effects of Advertising and Product Placement on Television Audiences, In: Working paper series / Institute for Empirical Research in Economics, No. 449, 2009. (Working Paper)
 
Digital video recorder proliferation and new commercial audience metrics are making television networks’ revenues more sensitive to audience losses from advertising. There is currently limited understanding of how traditional advertising and product placement affect television audiences. We estimate a random coefficients logit model of viewing demand for television programs, wherein time given to advertising and product placement plays a role akin to the “price” of consuming a program. Our data include audience, advertising, and program characteristics from more than 10,000 network-hours of prime-time broadcast television from 2004 to 2007. We find that the median effect of a 10% rise in advertising time is a 15% reduction in audience size. We find evidence that creative strategy and product category are important determinants of viewer response to advertising. When we control for program episode quality, we find that product placement time decreases viewer utility. In sum, our results imply that networks should give price discounts to those advertisers whose ads are most likely to retain viewers’ interest throughout the commercial break.

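The mechanism described here, advertising time entering viewer utility like a price, can be illustrated with a plain logit model. This is a drastic simplification of the paper's random-coefficients specification; the functional form, the disutility weight `alpha`, and all numbers below are made-up assumptions, not estimates:

```python
import math

def logit_shares(ad_minutes, quality, alpha=0.1):
    """Viewing shares in a simple logit model where commercial time acts as
    the 'price' of watching a program.

    alpha is the (assumed) disutility per minute of advertising; the outside
    option of not watching has utility normalized to zero.
    """
    utils = [q - alpha * a for q, a in zip(quality, ad_minutes)]
    denom = 1.0 + sum(math.exp(u) for u in utils)  # 1.0 = outside option
    return [math.exp(u) / denom for u in utils]

# Raising ad time on program 0 shifts viewers to program 1 and to not watching.
base = logit_shares([10.0, 10.0], [1.0, 1.0])
more_ads = logit_shares([12.0, 10.0], [1.0, 1.0])
```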
Johannes Schoder, Michèle Sennhauser, Peter Zweifel, Fine Tuning of Health Insurance Regulation: Unhealthy Consequences for an Individual Insurer, In: Working paper series / Socioeconomic Institute, No. 916, 2009. (Working Paper)
 
This paper sheds light on some unexpected consequences of health insurance regulation that may pose a big challenge to insurers’ risk management. Because mandated uniform contributions to health insurance trigger risk selection efforts, risk adjustment (RA) schemes become necessary. A good deal of research into the optimal RA formula has been performed (Ellis and Van de Ven [2000]). A recent proposal has been to add “Hospitalization exceeding three days during the previous year” as an indicator of high risk (Beck et al. [2006]). Applying the new formula to an individual Swiss health insurer, its payments into the RA scheme are postdicted to explode, reaching up to 13 percent of premium income. Its mistake had been to successfully implement Managed Care, resulting in low rates of hospitalization. The predicted risk management response is to extend hospital stays beyond three days, contrary to stated policy objectives, also those of the United States.

Michael Wolf, Dan Wunderli, Fund-of-Funds Construction by Statistical Multiple Testing Methods, In: Working paper series / Institute for Empirical Research in Economics, No. 445, 2009. (Working Paper)
 
Fund-of-funds (FoF) managers face the task of selecting a (relatively) small number of hedge funds from a large universe of candidate funds. We analyse whether such a selection can be successfully achieved by looking at the track records of the available funds alone, using advanced statistical techniques. In particular, at a given point in time, we determine which funds significantly outperform a given benchmark while, crucially, accounting for the fact that a large number of funds are examined at the same time. This is achieved by employing so-called multiple testing methods. Then, the equal-weighted or the global minimum variance portfolio of the outperforming funds is held for one year, after which the selection process is repeated. When backtesting this strategy on two particular hedge fund universes, we find that the resulting FoF portfolios have attractive return properties compared to the 1/N portfolio (that is, simply equal-weighting all the available funds) but also when compared to two investable hedge fund indices.

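The selection step can be sketched in a few lines. The paper uses advanced multiple testing methods; the Bonferroni adjustment below is only a simple stand-in for those, and the fund names and excess-return series are invented for illustration:

```python
from statistics import NormalDist, mean, stdev

def select_outperformers(track_records, alpha=0.05):
    """Keep funds whose mean excess return over the benchmark is significantly
    positive after a Bonferroni adjustment for testing many funds at once.

    (Simple stand-in for the paper's multiple testing methods.)
    """
    s = len(track_records)
    crit = NormalDist().inv_cdf(1 - alpha / s)  # multiplicity-adjusted cutoff
    chosen = []
    for name, excess in track_records.items():
        # t-statistic of the mean excess return against zero
        t = mean(excess) / (stdev(excess) / len(excess) ** 0.5)
        if t > crit:
            chosen.append(name)
    return chosen

# Invented monthly excess returns: only fund "A" has a clear positive drift.
funds = {
    "A": [0.01, 0.02, 0.015, 0.025, 0.02, 0.01,
          0.03, 0.02, 0.015, 0.025, 0.02, 0.01],
    "B": [0.01, -0.01] * 6,
    "C": [0.005, -0.005] * 6,
}
chosen = select_outperformers(funds)
weights = {f: 1 / len(chosen) for f in chosen}  # equal-weighted FoF portfolio
```

In the paper's strategy this portfolio would then be held for one year before the selection is repeated on updated track records.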
Jacob Goeree, Charles A Holt, Hierarchical Package Bidding: A Paper & Pencil Combinatorial Auction, In: Working paper series / Institute for Empirical Research in Economics, No. 436, 2009. (Working Paper)
 
We introduce a new combinatorial auction format based on a simple, transparent pricing mechanism tailored for the hierarchical package structure proposed by Rothkopf, Pekec, and Harstad (1998) to avoid computational complexity. This combination provides the feedback necessary for bidders in multi-round auctions to discern winning bidding strategies for subsequent rounds and to coordinate responses to aggressive package bids. The resulting mechanism is compared to two leading alternatives in a series of laboratory experiments involving varying degrees of value synergies. Based on these 'wind tunnel' tests, the FCC has decided to use hierarchical package bidding in the major upcoming 700 MHz auction.

Pavlo R Blavatskyy, How to Extend a Model of Probabilistic Choice from Binary Choices to Choices among More Than Two Alternatives, In: Working paper series / Institute for Empirical Research in Economics, No. 426, 2009. (Working Paper)
 
This note presents an algorithm that extends a binary choice model to choice among multiple alternatives. Both neoclassical microeconomic theory and the Luce choice model are consistent with the proposed algorithm. The algorithm is compatible with several empirical findings (asymmetric dominance and attraction effects) that cannot be explained within standard models.

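As background, the Luce model assigns each alternative a strength and chooses in proportion to it; the binary choice model is its two-element special case. The sketch below shows this standard baseline rule only, not the note's extension algorithm:

```python
def luce_choice_probs(strengths):
    """Luce choice rule: P(a | S) = v(a) / sum of v(b) over all b in S.

    Restricting S to two alternatives recovers the binary choice model
    P(a over b) = v(a) / (v(a) + v(b)).
    """
    total = sum(strengths.values())
    return {a: v / total for a, v in strengths.items()}

# Illustrative strengths: "a" is chosen half the time from {a, b, c}.
probs = luce_choice_probs({"a": 2.0, "b": 1.0, "c": 1.0})
```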
Joseph P Romano, Azeem M Shaikh, Michael Wolf, Hypothesis Testing in Econometrics, In: Working paper series / Institute for Empirical Research in Economics, No. 444, 2009. (Working Paper)
 
This paper reviews important concepts and methods that are useful for hypothesis testing. First, we discuss the Neyman-Pearson framework. Various approaches to optimality are presented, including finite-sample and large-sample optimality. Then, some of the most important methods are summarized, as well as resampling methodology, which is useful for setting critical values. Finally, we consider the problem of multiple testing, which has witnessed a burgeoning literature in recent years. Along the way, we incorporate some examples that are current in the econometrics literature. While we include many problems with well-known successful solutions, we also include open problems that are not easily handled with current technology, stemming from issues like lack of optimality or poor asymptotic approximations.

Jacob Goeree, Arno Riedl, Aljaz Ule, In Search of Stars: Network Formation among Heterogeneous Agents, In: Working paper series / Institute for Empirical Research in Economics, No. 435, 2009. (Working Paper)
 
This paper reports results from a laboratory experiment on network formation among heterogeneous agents. The experimental design extends the Bala-Goyal (2000) model of network formation with decay and two-way flow of benefits by introducing agents with lower linking costs or higher benefits to others. Furthermore, agents' types may be common knowledge or private information. In all treatments, the (efficient) equilibrium network has a 'star' structure. While equilibrium predictions fail completely with homogeneous agents, star networks frequently occur with heterogeneous agents. Stars are not born but rather develop: with a high-value agent, the network's centrality, stability, and efficiency all increase over time. A structural econometric model based on best response dynamics and other-regarding preferences is used to analyze individual linking behavior. Maximum-likelihood estimates of the underlying structural parameters, obtained by pooling data from several treatments, allow us to explain the main treatment effects.

Aleksander Berentsen, Guido Menzio, Randall Wright, Inflation and Unemployment in the Long Run, In: Working paper series / Institute for Empirical Research in Economics, No. 442, 2009. (Working Paper)
 
We study the long-run relation between money, measured by inflation or interest rates, and unemployment. We first document in the data a positive relation between these variables at low frequencies. We then develop a framework where unemployment and money are both modeled using microfoundations based on search and bargaining theory, providing a unified theory for analyzing labor and goods markets. The calibrated model shows that money can account for a sizable fraction of trends in unemployment. We argue it matters, qualitatively and quantitatively, whether one uses monetary theory based on search and bargaining, or an alternative ad hoc specification.

Aleksander Berentsen, Mariana Rojas Breu, Shouyong Shi, Liquidity, Innovation and Growth, In: Working paper series / Institute for Empirical Research in Economics, No. 441, 2009. (Working Paper)
 
Many countries simultaneously suffer from high rates of inflation, low growth rates of per capita income and poorly developed financial sectors. In this paper, we integrate a microfounded model of money and finance into a model of endogenous growth to examine the effects of inflation and financial development. A novel feature of the model is that the market for innovation goods is decentralized. Financial intermediaries arise endogenously to provide liquid funds to the innovation sector. We calibrate the model to address two quantitative issues. One is the effects of an exogenous improvement in the productivity of the financial sector on welfare and per capita growth. The other is the effects of inflation on welfare and growth. Consistent with the data but in contrast to previous work, reducing inflation generates large gains in the growth rate of per capita income as well as in welfare. Relative to reducing inflation, improving the efficiency of the financial market increases growth and welfare by much smaller amounts.