P Moog, Uschi Backes-Gellner, The impact of labour market regulations on (potential) entrepreneurs: the case of Germany, International Journal of Entrepreneurship and Innovation Management, Vol. 10 (1), 2009. (Journal Article)
This paper explores the impact of German labour market regulations on the willingness of labour market entrants to start their own business. We study the legal situation, the actual and perceived constraints imposed on businesses, as well as the actual flexibility available to start-ups. We find strong evidence that labour market regulations are often misperceived in Germany. Furthermore, these misperceptions distort the willingness to become self-employed. Start-ups are de jure hardly affected by labour market regulations because of a large number of exemptions. They are able to use a number of flexibility measures and thereby avoid regulatory restrictions. However, perceptions are quite the opposite, particularly in legal areas with high media coverage. This leads to a strong reluctance among labour market entrants to consider a start-up. Thus, measures aimed at increasing entrepreneurship should place strong emphasis on accurate knowledge of regulatory constraints and, particularly in Germany, on less sceptical media coverage. |
|
Stephan Nüesch, Egon Franck, The role of patriotism in explaining the TV audience of national team games - Evidence from four international tournaments, Journal of Media Economics, Vol. 22 (1), 2009. (Journal Article)
In the literature addressing the determinants of TV audiences in sports, both the absolute and relative playing strength of the opponents play a prominent role. Regarding national team competitions, however, this study conjectures that patriotism matters as well. Analyzing the Swiss TV audience at two World Cups and two European Football Championships, this study finds strong evidence that TV ratings are strongly affected by the sizes of the groups of foreign residents affiliated with the teams playing on the field. |
|
Conrad Meyer, Umsatzerfassung als Instrument des Earnings Management: eine zentrale Problemstellung aussagekräftiger Abschlüsse, In: Finanz- und Rechnungswesen: Jahrbuch 2009, WEKA Business Media, Zürich, p. 57 - 73, 2009. (Book Chapter)
Due to growing public interest, the annual financial statement has developed into an ever more closely watched medium of information. This brings with it the temptation for companies to actively shape their financial statements. Criticism in this context targets in particular the deliberate manipulation of reported results, so-called earnings management. Attention focuses above all on revenue recognition. A long-term study shows that half of all accounting crises at listed US corporations were caused by earnings management. The American securities regulator SEC even investigates suspected premature or inflated revenue recognition in 55% of all enforcement proceedings. For a long time, revenue recognition was considered unproblematic and largely free of discretionary leeway. Prominent, highly publicized accounting scandals such as WorldCom underscore that this is history. The deliberate manipulation of revenues carries enormous potential for damage, not only for the companies concerned themselves but also for investors, audit firms, and employees. The question therefore arises why manipulating results through problematic revenue recognition finds such strong appeal. This contribution aims to provide possible answers and to point out potential consequences, citing primarily examples of large companies. The problem, however, certainly also affects medium-sized and small companies; unfortunately, no empirical information is available in this respect. |
|
Helmut Max Dietl, S Royer, U Stratmann, Value creation architectures and competitive advantage: lessons from the European automobile industry, California Management Review, Vol. 51 (3), 2009. (Journal Article)
|
|
Egon Franck, Helmut Max Dietl, Urs Meister, Wie die Grundversorgung mit dem Wettbewerb zusammenhängt, Die Volkswirtschaft, Vol. 82 (9), 2009. (Journal Article)
The liberalization of the postal market has raised concerns, not least that minimum service public standards will no longer be met. A central element of the new postal legislation proposed by the Federal Council is therefore the mandate given to Swiss Post to provide universal service. In practical implementation, however, difficulties arise: the definition, scope, and mode of financing of the mandate substantially influence the functioning of competition. |
|
Andrea Schenker-Wicki, Maria Olivares, Wie haben die Schweizer Universitäten die Hochschulreformen der letzten zehn Jahre gemeistert?, Die Volkswirtschaft (9), 2009. (Journal Article)
Using data envelopment analysis (DEA), this article examines how the efficiency of the individual Swiss universities developed over the years 1999-2007 and which universities show inefficiencies relative to others. Most Swiss universities coped well with the demanding reform processes of the last ten years, after certain initial difficulties that were reflected in efficiency declines. Universities that additionally had to manage internal restructurings suffered larger efficiency losses and needed longer to complete the adjustment process successfully. |
|
Margit Osterloh, Michael Zollinger, Wir brauchen Fixlöhne und Gewinnbeteiligungen, SKO-Leader, Vol. 2009 (4), 2009. (Journal Article)
Bonus payments were harmful not only for the financial sector, argues Margit Osterloh of the University of Zurich, who opposes them in general. In this interview with SKO-Leader, the renowned professor of business administration explains why she holds this view and which pay systems she considers superior. |
|
Jim Malley, Apostolis Philippopoulos, Ulrich Woitek, To react or not? Technology shocks, fiscal policy and welfare in the EU-3, European Economic Review, Vol. 53 (6), 2009. (Journal Article)
This paper develops a DSGE model to examine the quantitative macroeconomic implications of countercyclical fiscal policy for France, Germany and the UK. The model incorporates real wage rigidity and consumption habits as the particular market failures justifying policy intervention. We subject the model to productivity shocks and allow policy instruments to react to the output gap and the debt-to-output ratio. A welfare analysis reveals that the most effective instrument-target combination is to use public consumption to stabilize the output gap. Moreover, welfare gains from countercyclical fiscal policy are much stronger in the presence of wage rigidities than in the presence of consumption habits. Finally, since active policy and automatic stabilizers are substitutes, it is possible that relatively undistorted economies may be in need of countercyclical fiscal action due to inadequate automatic stabilizers. |
|
G Hein, A Alink, A Kleinschmidt, N G Müller, The attentional blink modulates activity in the early visual cortex, Journal of Cognitive Neuroscience, Vol. 21 (1), 2009. (Journal Article)
The attentional blink (AB) documents a particularly strong case of visual attentional competition, in which subjects' ability to identify a second target (T2) is significantly impaired when it is presented at a short stimulus onset asynchrony (SOA) after a first target (T1). We used functional magnetic resonance imaging to investigate the impact of the AB on visual activity in individually defined retinotopic representations of the target stimuli. Our results show a reduction of the neural response in V3, and marginally in V2 and V1, paralleling the behavioral AB effect. This reduction of visual activity was accompanied by a reduced neural response in the inferior parietal cortex. This indicates that attentional competition modulates activity in higher-order parietal regions and the early visual cortex, providing a plausible neural basis of the behavioral AB effect. |
|
Björn Bartling, Andreas Park, What determines the level of IPO gross spreads? Underwriter profits and the cost of going public, International Review of Economics and Finance, Vol. 18 (1), 2009. (Journal Article)
This paper addresses three empirical findings of the literature on initial public offerings. (i) Why do investment banks earn positive profits in a competitive market? (ii) Why do banks receive lower gross spreads in venture capitalist (VC) backed than in non-VC backed IPOs? (iii) Why is underpricing more pronounced in VC backed than in non-VC backed IPOs? While each phenomenon can be explained by itself, there is no explanation yet for why all three occur simultaneously. We propose an integrated theoretical framework to address this issue. The IPO procedure is modeled as a two-stage signaling game: in the second stage, banks set offer prices given their private information and the level of the spread. Issuing firms anticipate their bank’s pricing decision and, in the first stage, set spreads to maximize expected revenue. Investors are aware of this process and subscribe only if their expected profits are non-negative. Firms’ equilibrium spreads are large so as to induce banks to set high prices, allowing banks to make profits. Better informed VC backed firms impose smaller spreads but face larger underpricing than non-VC backed firms. |
|
Tobias Straumann, Ulrich Woitek, A pioneer of a new monetary policy? Sweden's price-level targeting of the 1930s revisited, European Review of Economic History, Vol. 13 (2), 2009. (Journal Article)
The paper re-examines Sweden’s price level targeting during the 1930s, which is regarded as a precursor of today’s inflation targeting. According to conventional wisdom, the Riksbank was the first central bank to adopt price level targeting, although in practice it gave priority to exchange rate stabilisation. Based on Bayesian econometric techniques and the evaluation of new archival sources, we come to the conclusion that defending a fixed exchange rate is hard to reconcile with the claim of adopting price level targeting. This finding has implications for the prevailing view of the 1930s as a decade of great policy innovations. |
|
Bruno Frey, Susanne Neckermann, Abundant but neglected: awards as incentives, The Economists' Voice, Vol. 6 (2), 2009. (Journal Article)
Economists traditionally focus on monetary compensation when examining incentives, but awards are of immense practical relevance, as can be inferred from their prevalence in the form of state orders, decorations, and prizes, argue Bruno Frey and Susanne Neckermann. |
|
Ernst Fehr, Christian Zehnder, Altruism, In: The Oxford companion to emotion and the affective sciences, Oxford University Press, Oxford, p. 24 - 26, 2009. (Book Chapter)
|
|
Christoph Winter, Altruism, education and inequality in the United States, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Dissertation)
This thesis contains several lines of research conducted during my four years at the European University Institute. It consists of three chapters that analyze the link between parental inter-vivos transfers, education and inequality. In the first chapter, "Accounting for the Changing Role of Family Income in Determining College Entry", I present a computable dynamic general equilibrium model with overlapping generations and incomplete markets and I use this model to measure the fraction of households constrained in their college entry decision. College education is financed by family transfers and public subsidies, where transfers are generated through altruism on part of the parents. Parents face a trade-off between making transfers to their children and own savings. Ceteris paribus, parents who expect lower future earnings transfer less and save more. Data from the 1986 Survey of Consumer Finances give support to this mechanism. I show that this trade-off leads to substantially higher estimates of the fraction of constrained households compared to the results in the empirical literature (18 instead of 8 percent). The model also predicts that an increment in parents' earnings uncertainty decreases their willingness to provide transfers. In combination with rising returns to education, which makes college going more attractive, this boosts the number of constrained youths and explains why family income has become more important for college access over the last decades in the U.S. economy. In chapter two, titled "Why Do the Rich Save More: The Role of Housing", I analyze the determinants of the wealth-income gradient with educational attainment that is observable in the data. This gradient is very steep: using the 1989 wave of the Survey of Consumer Finances (SCF), I find the median college graduate near retirement age holds twice as much wealth as the median high school dropout. 
In this paper, I argue that housing plays an important role in explaining this gradient. The last chapter, "Parental Transfers and Parental Income: Does the Future Matter More Than the Present?", adds to the results derived in the first chapter. More precisely, I present a model of parental transfers that is based on the assumption of one-sided altruism. I use this model to analytically study the link between parental expectations about their future resources and their present transfer behavior. In the context of my model, I show that parents with brighter earnings prospects are willing to transfer more to their offspring already today, all other things being equal. |
|
Rainer Winkelmann, Stefan Boes, Analysis of microdata, Springer, Berlin, 2009. (Book/Research Monograph)
The book provides a simple, intuitive introduction to regression models for qualitative and discrete dependent variables, to sample selection models, and to event history models, all in the context of maximum likelihood estimation. It presents a wide range of commonly used models and thereby enables readers to become critical consumers of current empirical social science research and to conduct their own empirical analyses. The book includes numerous examples, illustrations, and exercises. It can be used as a textbook for an advanced undergraduate, Master's, or first-year Ph.D. course in microdata analysis, and as a reference for practitioners and researchers. |
|
T D Griffiths, S Kumar, K von Kriegstein, T Overath, Klaas Enno Stephan, K J Friston, Auditory object analysis, In: The Cognitive Neurosciences, MIT Press, Cambridge, MA, p. 367 - 382, 2009. (Book Chapter)
|
|
Bruno Frey, Susanne Neckermann, Awards: a view from economics, In: The economics of ethics and the ethics of economics, Edward Elgar, Cheltenham, UK, p. 73 - 88, 2009. (Book Chapter)
|
|
Dirk Sven Björn Drechsel, Banks and the Swiss economy, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Dissertation)
|
|
Klaas Enno Stephan, W D Penny, Jean Daunizeau, R J Moran, K J Friston, Bayesian model selection for group studies, NeuroImage, Vol. 46 (4), 2009. (Journal Article)
Bayesian model selection (BMS) is a powerful method for determining the most likely among a set of competing hypotheses about the mechanisms that generated observed data. BMS has recently found widespread application in neuroimaging, particularly in the context of dynamic causal modelling (DCM). However, so far, combining BMS results from several subjects has relied on simple (fixed effects) metrics, e.g. the group Bayes factor (GBF), that do not account for group heterogeneity or outliers. In this paper, we compare the GBF with two random effects methods for BMS at the between-subject or group level. These methods provide inference on model-space using a classical and Bayesian perspective respectively. First, a classical (frequentist) approach uses the log model evidence as a subject-specific summary statistic. This enables one to use analysis of variance to test for differences in log-evidences over models, relative to inter-subject differences. We then consider the same problem in Bayesian terms and describe a novel hierarchical model, which is optimised to furnish a probability density on the models themselves. This new variational Bayes method rests on treating the model as a random variable and estimating the parameters of a Dirichlet distribution which describes the probabilities for all models considered. These probabilities then define a multinomial distribution over model space, allowing one to compute how likely it is that a specific model generated the data of a randomly chosen subject as well as the exceedance probability of one model being more likely than any other model. Using empirical and synthetic data, we show that optimising a conditional density of the model probabilities, given the log-evidences for each model over subjects, is more informative and appropriate than both the GBF and frequentist tests of the log-evidences. 
In particular, we found that the hierarchical Bayesian approach is considerably more robust than either of the other approaches in the presence of outliers. We expect that this new random effects method will prove useful for a wide range of group studies, not only in the context of DCM, but also for other modelling endeavours, e.g. comparing different source reconstruction methods for EEG/MEG or selecting among competing computational models of learning and decision-making. |
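The exceedance probability described in this abstract (the probability that one model is more likely than any other, given a Dirichlet density over model probabilities) can be illustrated with a minimal Monte Carlo sketch. This is a hypothetical example under assumed Dirichlet parameters, not the authors' implementation: sample model-probability vectors from the Dirichlet posterior and count how often each model comes out on top.

```python
import numpy as np

def exceedance_probabilities(alpha, n_samples=100_000, seed=0):
    """Monte Carlo estimate of exceedance probabilities for a
    Dirichlet density over model probabilities.

    alpha: Dirichlet parameters, one per model (here assumed given,
    e.g. as output of a variational Bayes estimation step).
    Returns, for each model, the estimated probability that it is
    more likely than any other model.
    """
    rng = np.random.default_rng(seed)
    # Each row is one sampled vector of model probabilities.
    samples = rng.dirichlet(alpha, size=n_samples)
    # Index of the most probable model in each draw.
    winners = samples.argmax(axis=1)
    return np.bincount(winners, minlength=len(alpha)) / n_samples

# Illustrative parameters for three competing models; model 1 has
# accumulated the most evidence across subjects in this toy setting.
xp = exceedance_probabilities([8.0, 3.0, 1.0])
```

The exceedance probabilities sum to one over the model space, so they can be read as a ranking of how likely each model is to be the best among those considered.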
|
Pavlo R Blavatskyy, Betting on own knowledge: Experimental test of overconfidence, Journal of Risk and Uncertainty, Vol. 38 (1), 2009. (Journal Article)
This paper presents a new incentive-compatible method for measuring confidence in one's own knowledge. The method consists of two parts. First, an individual answers several general knowledge questions. Second, the individual chooses among three alternatives: (1) one question is selected at random and the individual receives a payoff if he or she has answered this question correctly; (2) the individual receives the same payoff with a probability equal to the percentage of correctly answered questions; (3) either the first or the second alternative is selected at random. The choice of the first (second) alternative reveals overconfidence (underconfidence). The individual is well calibrated if he or she chooses the third alternative. Experimental results show that subjects, on average, exhibit underconfidence about their own knowledge when the incentive-compatible mechanism is used. Their confidence in their own knowledge does not depend on their attitude towards risk or ambiguity. |
|