Wen Denoth-Xu, The Financial Risk Management in Corporate Treasury of a Swiss Multinational Industry Group, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Bachelor's Thesis)
|
|
Philipp Langenegger, Analyse der Performance im Private Banking Geschäft der Jahre 2007 - 2008, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Bachelor's Thesis)
|
|
Patrick Miserez, Vergleich Total Returns bei ETFs und Tracker-Zertifikaten, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Bachelor's Thesis)
|
|
Oliver Franscella, Einflussfaktoren auf die strategische Asset Allocation von Pensionskassen, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
|
|
Philippe Hungerbuehler, Analysis of a momentum strategy to control the equity exposure, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
|
|
Ralf Vetsch, Fixed Income Arbitrage, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Bachelor's Thesis)
|
|
Rahel Schneider, Valuation of CO2 Option Contracts in the European Emission Trading Scheme, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
|
|
Thorsten Hens, Keine Panik in Krisenzeiten, In: Global Investor, p. 27 - 30, 1 September 2009. (Newspaper Article)
|
|
George Loewenstein, Don A. Moore, Roberto A. Weber, Misperceiving the value of information in predicting the performance of others, Experimental Economics, Vol. 9 (3), 2009. (Journal Article)
Economic models typically allow for “free disposal” or “reversibility” of information, which implies non-negative value. Building on previous research on the “curse of knowledge”, we explore situations where this might not be so. In three experiments, we document situations in which participants place positive value on information in attempting to predict the performance of uninformed others, even when acquiring that information diminishes their earnings. In the first experiment, a majority of participants choose to hire informed—rather than uninformed—agents, leading to lower earnings. In the second experiment, a significant number of participants pay for information—the solution to a puzzle—that hurts their ability to predict how many others will solve the puzzle. In the third experiment, we find that the effect is reduced with experience and feedback on the actual performance to be predicted. We discuss implications of our results for the role of information and informed decision making in economic situations. |
|
Reinhard Stoiber, Martin Glinz, Modeling and managing tacit product line requirements knowledge, In: 2nd International Workshop on Managing Requirements Engineering Knowledge (MaRK'09), 2009-09-01. (Conference or Workshop Paper published in Proceedings)
The success of very large product line systems with globally distributed stakeholders often builds significantly on the implicit knowledge of individuals. Final products are typically built by integrating numerous detailed specifications of subsystems. But how exactly all these parts can and need to be integrated to build valid end products is often left unspecified, relying instead on numerous discussions, reviews, and the expertise of senior architects and product managers. Building a high-level product line requirements model that explicitly and formally specifies common and variable requirements, their precise integration semantics, and the constraints for selecting variable features helps significantly to manage this crucial and often tacit requirements knowledge. Based on an industrial exemplar, we motivate and demonstrate such an approach and discuss our early findings regarding knowledge and rationale management in product line requirements engineering. |
|
Jörg-Uwe Kietz, Floarea Serban, Abraham Bernstein, S Fischer, Towards cooperative planning of data mining workflows, In: Proc of the ECML/PKDD09 Workshop on Third Generation Data Mining: Towards Service-oriented Knowledge Discovery (SoKD-09), 2009-09. (Conference or Workshop Paper published in Proceedings)
A major challenge for third-generation data mining and knowledge discovery systems is the integration of different data mining tools and services for data understanding, data integration, data preprocessing, data mining, evaluation, and deployment, which are distributed across a network of computer systems. In this paper we outline how an intelligent assistant can be built that supports end-users in the difficult and time-consuming task of designing KDD workflows from these distributed services. The assistant should support the user in checking the correctness of workflows, understanding the goals behind given workflows, enumerating workflow completions generated by an AI planner, and storing, retrieving, adapting, and repairing previous workflows. It should also be an open, easily extendable system. This is achieved by basing the system on a data mining ontology (DMO) in which all services (operators) are described together with their inputs/outputs and pre-/postconditions. This description is compatible with OWL-S, and new operators can be added by importing their OWL-S specification and classifying it into the operator ontology. |
|
C Jaag, U Trinkner, Bottleneck regulation in telecommunications, railways and post, Network Industries Quarterly, Vol. 11 (3), 2009. (Journal Article)
Various regulatory remedies deal with monopolistic bottlenecks in network industries. In telecommunications, railways and post, the European regulatory provisions seem appropriate. |
|
S Amstutz, Früh übt sich, was ein Meister werden will, HR Today: das Schweizer Human Resource Management-Journal, Vol. 9, 2009. (Journal Article)
|
|
T F Ruud, P Friebe, Daniela Schmitz, Internationales Rahmenwerk der beruflichen Praxis des Internen Audits: Überblick über die überarbeiteten Bestimmungen des Institute of Internal Auditors, Der Schweizer Treuhänder, Vol. 83 (9), 2009. (Journal Article)
The Institute of Internal Auditors (IIA) has revised the professional practices framework for internal auditing. The new provisions came into force on 1 January 2009. Although the revised framework largely corresponds to the previous one, it contains several substantial changes, which this article presents and explains. |
|
Rolf Heusser, Laura Beccari, Andrea Schenker-Wicki, The Swiss external QA system: lessons learned over the past five years, In: EUA Bologna Handbook, Dr. Josef Raabe Verlags GmbH, Berlin, p. B 4.6 - 3, 2009-09. (Book Chapter)
The external quality assurance system in Switzerland focuses on institutional assessments. Periodic assessments of the internal quality assurance systems of the Swiss universities are mandatory and linked to the financing of the institutions. The OAQ successfully carried out the first cycle of such audits in 2003-2004 and repeated it in 2007-2008. These audits have been well accepted by the institutions and were perceived as contributing to institutional quality enhancement. Changes at universities following the first audit have been noticed, which clearly shows the success of an institutional approach to external QA. Bologna, and particularly the ESG, served to accelerate internal developments in Switzerland. Project-based international partnerships proved to be an essential ingredient in improving the agency’s daily work, while memberships in relevant international networks increased the agency’s level of accountability and credibility, with positive national spill-over effects. |
|
Bruno Staffelbach, Gudela Grote, Von der Mitarbeiter- zur Vorgesetztenbeurteilung?, In: Führen durch Vorbild – Persönlichkeiten im Gespräch, Stier Communications, Weiningen, p. 119 - 122, 2009-09. (Book Chapter)
|
|
Ernst Fehr, Lorenz Goette, Christian Zehnder, A behavioral account of the labor market: the role of fairness concerns, Annual Review of Economics, Vol. 1 (1), 2009. (Journal Article)
In this paper, we argue that important labor market phenomena can be better understood if one takes (a) the inherent incompleteness and relational nature of most employment contracts and (b) the existence of reference-dependent fairness concerns among a substantial share of the population into account. Theory shows and experiments confirm that, even if fairness concerns were to exert only weak effects in one-shot interactions, repeated interactions greatly magnify the relevance of such concerns on economic outcomes. We also review evidence from laboratory and field experiments examining the role of wages and fairness on effort, derive predictions from our approach for entry-level wages and incumbent workers' wages, confront these predictions with the evidence, and show that reference-dependent fairness concerns may have important consequences for the effects of economic policies such as minimum wage laws. |
|
D Schunk, Behavioral heterogeneity in dynamic search situations: Theory and experimental evidence, Journal of Economic Dynamics and Control, Vol. 33 (9), 2009. (Journal Article)
This paper presents models for search behavior and provides experimental evidence that behavioral heterogeneity in search is linked to heterogeneity in individual preferences. Observed search behavior is more consistent with a new model that assumes dynamic updating of utility reference points than with models that are based on expected-utility maximization. Specifically, reference point updating and loss aversion play a role for more than a third of the population. The findings are of practical relevance as well as of interest for researchers who incorporate behavioral heterogeneity into models of dynamic choice behavior in, for example, consumer economics, labor economics, finance, and decision theory. |
|
Pavlo R Blavatskyy, Wolfgang R Köhler, Range effects and lottery pricing, Experimental Economics, Vol. 12 (3), 2009. (Journal Article)
A standard method to elicit certainty equivalents is the Becker-DeGroot-Marschak (BDM) procedure. We compare the standard BDM procedure and a BDM procedure with a restricted range of minimum selling prices that an individual can state. We find that elicited prices are systematically affected by the range of feasible minimum selling prices. Expected utility theory cannot explain these results. Nonexpected utility theories can only explain the results if subjects consider compound lotteries generated by the BDM procedure. We present an alternative explanation where subjects sequentially compare the lottery to monetary amounts in order to determine their minimum selling price. The model offers a formal explanation for range effects and for the underweighting of small and the overweighting of large probabilities. |
|
Marcus Hagedorn, The value of information for auctioneers, Journal of Economic Theory, Vol. 144 (5), 2009. (Journal Article)
An auctioneer wants to sell an indivisible object to one of multiple bidders, who have private information about their valuations of the object. A bidder's information structure determines the accuracy with which the bidder knows her private valuation. The main result of the paper is that the auctioneer's revenue is a convex function of bidders' information structures. One implication is that assigning asymmetric information structures instead of symmetric information structures to bidders is always revenue-enhancing. This paper generalizes a result of Bergemann and Pesendorfer [3], who show that revenue-maximizing information structures are asymmetric. |
|