Tobias Straumann, Ulrich Woitek, A pioneer of a new monetary policy? Sweden's price-level targeting of the 1930s revisited, European Review of Economic History, Vol. 13 (02), 2009. (Journal Article)
The paper re-examines Sweden’s price-level targeting of the 1930s, which is regarded as a precursor of today’s inflation targeting. According to conventional wisdom, the Riksbank was the first central bank to adopt price-level targeting, although in practice it gave priority to exchange-rate stabilisation. Based on Bayesian econometric techniques and the evaluation of new archival sources, we conclude that defending a fixed exchange rate is hard to reconcile with the claim of adopting price-level targeting. This finding has implications for the prevailing view of the 1930s as a decade of great policy innovations. |
|
Bruno Frey, Susanne Neckermann, Abundant but neglected: awards as incentives, The Economists' Voice, Vol. 6 (2), 2009. (Journal Article)
Economists traditionally focus on monetary compensation when examining incentives, but awards are of immense practical relevance, as can be inferred from their prevalence in the form of state orders, decorations, and prizes, according to Bruno Frey and Susanne Neckermann. |
|
Ernst Fehr, Christian Zehnder, Altruism, In: The Oxford companion to emotion and the affective sciences, Oxford University Press, Oxford, p. 24 - 26, 2009. (Book Chapter)
|
|
Christoph Winter, Altruism, education and inequality in the United States, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Dissertation)
This thesis contains several lines of research conducted during my four years at the European University Institute. It consists of three chapters that analyze the link between parental inter-vivos transfers, education and inequality. In the first chapter, "Accounting for the Changing Role of Family Income in Determining College Entry", I present a computable dynamic general equilibrium model with overlapping generations and incomplete markets and I use this model to measure the fraction of households constrained in their college entry decision. College education is financed by family transfers and public subsidies, where transfers are generated through altruism on the part of the parents. Parents face a trade-off between making transfers to their children and their own savings. Ceteris paribus, parents who expect lower future earnings transfer less and save more. Data from the 1986 Survey of Consumer Finances support this mechanism. I show that this trade-off leads to substantially higher estimates of the fraction of constrained households compared to the results in the empirical literature (18 instead of 8 percent). The model also predicts that an increase in parents' earnings uncertainty decreases their willingness to provide transfers. In combination with rising returns to education, which make college-going more attractive, this boosts the number of constrained youths and explains why family income has become more important for college access over recent decades in the U.S. economy. In chapter two, titled "Why Do the Rich Save More: The Role of Housing", I analyze the determinants of the wealth-income gradient with educational attainment that is observable in the data. This gradient is very steep: using the 1989 wave of the Survey of Consumer Finances (SCF), I find that the median college graduate near retirement age holds twice as much wealth as the median high school dropout. 
In this chapter, I argue that housing plays an important role in explaining the wealth-income gradient that is observable in the data. The last chapter, "Parental Transfers and Parental Income: Does the Future Matter More Than the Present?" adds to the results that were derived in the first chapter. More precisely, I present a model of parental transfers that is based on the assumption of one-sided altruism. I use this model to analytically study the link between parental expectations about their future resources and their present transfer behavior. In the context of my model, I show that parents with brighter earnings prospects are willing to transfer more to their offspring today, all other things being equal. |
|
Rainer Winkelmann, Stefan Boes, Analysis of microdata, Springer, Berlin, 2009. (Book/Research Monograph)
The book provides a simple, intuitive introduction to regression models for qualitative and discrete dependent variables, to sample selection models, and to event history models, all in the context of maximum likelihood estimation. It presents a wide range of commonly used models. The book thereby enables the reader to become a critical consumer of current empirical social science research and to conduct their own empirical analyses. The book includes numerous examples, illustrations, and exercises. It can be used as a textbook for an advanced undergraduate, a Master's, or a first-year Ph.D. course in microdata analysis, and as a reference for practitioners and researchers. |
|
T D Griffiths, S Kumar, K von Kriegstein, T Overath, Klaas Enno Stephan, K J Friston, Auditory object analysis, In: The Cognitive Neurosciences, MIT Press, Cambridge, MA, p. 367 - 382, 2009. (Book Chapter)
|
|
Bruno Frey, Susanne Neckermann, Awards: a view from economics, In: The economics of ethics and the ethics of economics, Edward Elgar, Cheltenham, UK, p. 73 - 88, 2009. (Book Chapter)
|
|
Dirk Sven Björn Drechsel, Banks and the Swiss economy, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Dissertation)
|
|
Klaas Enno Stephan, W D Penny, Jean Daunizeau, R J Moran, K J Friston, Bayesian model selection for group studies, NeuroImage, Vol. 46 (4), 2009. (Journal Article)
Bayesian model selection (BMS) is a powerful method for determining the most likely among a set of competing hypotheses about the mechanisms that generated observed data. BMS has recently found widespread application in neuroimaging, particularly in the context of dynamic causal modelling (DCM). However, so far, combining BMS results from several subjects has relied on simple (fixed effects) metrics, e.g. the group Bayes factor (GBF), that do not account for group heterogeneity or outliers. In this paper, we compare the GBF with two random effects methods for BMS at the between-subject or group level. These methods provide inference on model-space using a classical and Bayesian perspective respectively. First, a classical (frequentist) approach uses the log model evidence as a subject-specific summary statistic. This enables one to use analysis of variance to test for differences in log-evidences over models, relative to inter-subject differences. We then consider the same problem in Bayesian terms and describe a novel hierarchical model, which is optimised to furnish a probability density on the models themselves. This new variational Bayes method rests on treating the model as a random variable and estimating the parameters of a Dirichlet distribution which describes the probabilities for all models considered. These probabilities then define a multinomial distribution over model space, allowing one to compute how likely it is that a specific model generated the data of a randomly chosen subject as well as the exceedance probability of one model being more likely than any other model. Using empirical and synthetic data, we show that optimising a conditional density of the model probabilities, given the log-evidences for each model over subjects, is more informative and appropriate than both the GBF and frequentist tests of the log-evidences. 
In particular, we found that the hierarchical Bayesian approach is considerably more robust than either of the other approaches in the presence of outliers. We expect that this new random effects method will prove useful for a wide range of group studies, not only in the context of DCM, but also for other modelling endeavours, e.g. comparing different source reconstruction methods for EEG/MEG or selecting among competing computational models of learning and decision-making. |
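The exceedance probabilities described in this abstract can be estimated by straightforward Monte Carlo sampling once the Dirichlet parameters have been obtained. The sketch below is a minimal illustration of that final step only (the variational scheme that fits the Dirichlet is described in the paper itself); the function name and the example parameter vector are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def exceedance_probabilities(alpha, n_samples=100_000, seed=0):
    """Monte Carlo estimate of exceedance probabilities for a Dirichlet
    density over model probabilities: for each model k, the probability
    that k is more likely than any other model considered."""
    rng = np.random.default_rng(seed)
    # Draw model-probability vectors r ~ Dirichlet(alpha), one row per draw.
    samples = rng.dirichlet(alpha, size=n_samples)
    # For each draw, record which model has the largest probability.
    winners = samples.argmax(axis=1)
    return np.bincount(winners, minlength=len(alpha)) / n_samples

# Illustrative Dirichlet parameters for three competing models,
# e.g. as returned by a variational fit to subject-wise log-evidences.
xp = exceedance_probabilities([8.0, 3.0, 1.0])
```

The resulting vector sums to one across models, which is what makes exceedance probabilities a convenient summary for group-level model comparison.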
|
Pavlo R Blavatskyy, Betting on own knowledge: Experimental test of overconfidence, Journal of Risk and Uncertainty, Vol. 38 (1), 2009. (Journal Article)
This paper presents a new incentive compatible method for measuring confidence in own knowledge. This method consists of two parts. First, an individual answers several general knowledge questions. Second, the individual chooses among three alternatives: (1) one question is selected at random and the individual receives a payoff if he or she has answered this question correctly; (2) the individual receives the same payoff with a probability equal to the percentage of correctly answered questions; (3) either the first or the second alternative is selected. The choice of the first (second) alternative reveals overconfidence (underconfidence). The individual is well calibrated if he or she chooses the third alternative. Experimental results show that subjects, on average, exhibit underconfidence about their own knowledge when the incentive compatible mechanism is used. Their confidence in own knowledge does not depend on their attitude towards risk/ambiguity. |
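The three payment alternatives in this mechanism are simple enough to encode directly. The sketch below is an illustrative simulation of the payoff rules as the abstract states them; the stake size, function name, and random selection rule in alternative 3 are assumptions for illustration.

```python
import random

def payoff(alternative, answers_correct, stake=10.0, rng=None):
    """One realization of the three payment alternatives:
    1: one question drawn at random; paid if it was answered correctly.
    2: paid with probability equal to the fraction answered correctly.
    3: alternative 1 or 2 selected at random."""
    rng = rng or random.Random()
    if alternative == 1:
        return stake if rng.choice(answers_correct) else 0.0
    if alternative == 2:
        p = sum(answers_correct) / len(answers_correct)
        return stake if rng.random() < p else 0.0
    return payoff(rng.choice([1, 2]), answers_correct, stake, rng)
```

Objectively, alternatives 1 and 2 have the same expected payoff, so a risk-neutral, well-calibrated subject is indifferent; a strict preference for betting on one's own answers (alternative 1) therefore reveals overconfidence, and a strict preference for alternative 2 reveals underconfidence.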
|
P Collier, A Hoeffler, D Rohner, Beyond greed and grievance: feasibility and civil war, Oxford Economic Papers, Vol. 61 (1), 2009. (Journal Article)
A key distinction among theories of civil war is between those that are built upon motivation and those that are built upon feasibility. We analyze a comprehensive global sample of civil wars for the period 1965-2004 and subject the results to a range of robustness tests. The data constitute a substantial advance on previous work. We find that variables that are close proxies for feasibility have powerful consequences for the risk of a civil war. Our results substantiate the 'feasibility hypothesis' that where civil war is feasible it will occur without reference to motivation. |
|
Julia Casutt-Schneeberger, Business cycles and strike activity in Austria, Germany and Switzerland, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Dissertation)
|
|
T M Schofield, P Iverson, S J Kiebel, Klaas Enno Stephan, J M Kilner, K J Friston, J T Crinion, C J Price, A P Leff, Changing meaning causes coupling changes within higher levels of the cortical hierarchy, Proceedings of the National Academy of Sciences of the United States of America (PNAS), Vol. 106 (28), 2009. (Journal Article)
Processing of speech and nonspeech sounds occurs bilaterally within primary auditory cortex and surrounding regions of the superior temporal gyrus; however, the manner in which these regions interact during speech and nonspeech processing is not well understood. Here, we investigate the underlying neuronal architecture of the auditory system with magnetoencephalography and a mismatch paradigm. We used a spoken word as a repeating "standard" and periodically introduced 3 "oddball" stimuli that differed in the frequency spectrum of the word's vowel. The closest deviant was perceived as the same vowel as the standard, whereas the other 2 deviants were perceived as belonging to different vowel categories. The neuronal responses to these vowel stimuli were compared with responses elicited by perceptually matched tone stimuli under the same paradigm. For both speech and tones, deviant stimuli induced coupling changes within the same bilateral temporal lobe system. However, vowel oddball effects increased coupling within the left posterior superior temporal gyrus, whereas perceptually equivalent nonspeech oddball effects increased coupling within the right primary auditory cortex. Thus, we show a dissociation in neuronal interactions, occurring at both different hierarchical levels of the auditory system (superior temporal versus primary auditory cortex) and in different hemispheres (left versus right). This hierarchical specificity depends on whether auditory stimuli are embedded in a perceptual context (i.e., a word). Furthermore, our lateralization results suggest left hemisphere specificity for the processing of phonological stimuli, regardless of their elemental (i.e., spectrotemporal) characteristics. |
|
Matthias Doepke, Fabrizio Zilibotti, Child labour: is international activism the solution or the problem?, Vox, 2009. (Journal Article)
Through actions such as product boycotts or the imposition of international labour standards, governments and consumer groups in rich countries put pressure on poor countries to discourage the use of child labour. But the child-labour problem in developing countries shows no sign of abating. Our research suggests that international activism may be partially to blame, because it can thwart regulation of child labour within developing countries. |
|
Dallas Burtraw, Jacob Goeree, Charles A Holt, Erica Myers, Karen L Palmer, William Shobe, Collusion in auctions for emission permits: An experimental analysis, Journal of Policy Analysis and Management, Vol. 28 (4), 2009. (Journal Article)
Environmental markets have several institutional features that provide a new context for the use of auctions and that have not been studied previously. This paper reports on laboratory experiments testing three auction forms: uniform and discriminatory price sealed-bid auctions and an ascending clock auction. We test the ability of subjects to tacitly or explicitly collude in order to maximize profits. Our main result is that the discriminatory and uniform price auctions produce greater revenues than the clock auction, both with and without explicit communication. The clock appears to facilitate successful collusion, both because of its sequential structure and because it allows bidders to focus on one dimension of cooperation (quantity) rather than two (price and quantity). |
|
Christian Ruff, J Driver, S Bestmann, Combining TMS and fMRI: From ‘virtual lesions’ to functional-network accounts of cognition, Cortex, Vol. 45 (9), 2009. (Journal Article)
|
|
J N Tegnér, A Compte, C Auffray, G An, G Cedersund, G Clermont, B Gutkin, Z Oltvai, Klaas Enno Stephan, R Thomas, P Villoslada, Computational disease modeling - fact or fiction?, BMC Systems Biology, Vol. 3, 2009. (Journal Article)
BACKGROUND: Biomedical research is changing due to the rapid accumulation of experimental data at an unprecedented scale, revealing increasing degrees of complexity of biological processes. Life Sciences are facing a transition from a descriptive to a mechanistic approach that reveals principles of cells, cellular networks, organs, and their interactions across several spatial and temporal scales. There are two conceptual traditions in computational modeling of biology. The bottom-up approach emphasizes complex intracellular molecular models and is well represented within the systems biology community. On the other hand, the physics-inspired top-down modeling strategy identifies and selects features of (presumably) essential relevance to the phenomena of interest and combines available data in models of modest complexity. RESULTS: The workshop, "ESF Exploratory Workshop on Computational disease Modeling", examined the challenges that computational modeling faces in contributing to the understanding and treatment of complex multi-factorial diseases. Participants at the meeting agreed on two general conclusions. First, we identified the critical importance of developing analytical tools for dealing with model and parameter uncertainty. Second, the development of predictive hierarchical models spanning several scales beyond intracellular molecular networks was identified as a major objective. This contrasts with the current focus within the systems biology community on complex molecular modeling. CONCLUSION: During the workshop it became obvious that diverse scientific modeling cultures (from computational neuroscience, theory, data-driven machine-learning approaches, agent-based modeling, network modeling and stochastic-molecular simulations) would benefit from intense cross-talk on shared theoretical issues in order to make progress on clinically relevant problems. |
|
Ernst Fehr, Oliver Hart, Christian Zehnder, Contracts, reference points, and competition—behavioral effects of the fundamental transformation, Journal of the European Economic Association, Vol. 7 (2-3), 2009. (Journal Article)
In this paper we study the role of incomplete ex ante contracts for ex post trade. Previous experimental evidence indicates that a contract provides a reference point for entitlements when the terms are negotiated in a competitive market. We show that this finding no longer holds when the terms are determined in a non-competitive way. Our results imply that the presence of a “fundamental transformation” (i.e., the transition from a competitive market to a bilateral relationship) is important for a contract to become a reference point. To the best of our knowledge this behavioral aspect of the fundamental transformation has not been shown before. |
|
Ganna Pogrebna, Pavlo R Blavatskyy, Coordination, focal points and voting in strategic situations: a natural experiment, Public Choice, Vol. 140 (1-2), 2009. (Journal Article)
This paper studies coordination in a multi-stage elimination tournament with large monetary incentives and a diversified subject pool drawn from the adult British population. In the tournament, members of an ad hoc team earn money by answering general knowledge questions and then eliminate one contestant by plurality voting without prior communication. We find that in the early rounds of the tournament, contestants use a focal principle and coordinate on one of the multiple Nash equilibria in pure strategies by eliminating the weakest member of the team. However, in the later rounds, contestants switch to playing a mixed strategy Nash equilibrium. |
|
Peter Zweifel, Harry Telser, Cost–benefit analysis for health, In: Handbook of research on cost–benefit analysis, Edward Elgar, Cheltenham, p. 31 - 54, 2009. (Book Chapter)
|
|