Damianov Peter, Credit Default Swaps (CDSs) - Entstehung, Funktion und ihre Rolle in der Finanzkrise, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
|
|
Beat Meier, Gibt es effiziente Fusionen?, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
|
|
Nicolas Karl, Der Zusammenhang des Consumption-Based Capital Asset Pricing Model und der Faktormodelle aus der Perspektive eines Marktgleichgewichts, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
|
|
Alexander Tobler, The Impact of Property Investments for Institutional Investors, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
|
|
Michel Fellmann, Leading Indicators of Past Financial Crises - Do they Apply to the Current Financial Crisis?, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Master's Thesis)
|
|
Reto Bolliger, Analysis of the determinants of yield spreads in European prime office real estate markets, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
|
|
Romain Paratte, Die Stellung des Aktionärs in Theorie und Praxis, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
|
|
Rebekka Rüegg, The Impact of CDS Liquidity on Credit Risk, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
|
|
Marc Bourquin, Der Einfluss der Marktliquidität auf die Verzinsung von Pfandbriefen, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
|
|
Thorsten Hens, Finanzplatz Schweiz - quo vadis?, In: Unijournal: Die Zeitung der Universität Zürich, 1, p. 8, 1 February 2010. (Newspaper Article)
|
|
Susanne Suter, C P E Zollikofer, Renato Pajarola, Multiscale Tensor Approximation for Volume Data, Department of Informatics, University of Zurich, Zurich, February 2010. (Book/Research Monograph)
Advanced 3D microstructural analysis in natural sciences and engineering depends ever more on modern data acquisition and imaging technologies such as micro-computed or synchrotron tomography and interactive visualization. The acquired high-resolution volume data sets have sizes in the order of tens to hundreds of GBs, and typically exhibit spatially complex internal structures. Such large structural volume data sets represent a grand challenge to be explored, analyzed and interpreted by means of interactive visualization, since the amount of data to be rendered is typically far beyond the current performance limits of interactive graphics systems. As a new approach to tackle this bottleneck problem, we employ higher-order tensor approximations (TAs). We demonstrate the power of TA to represent, and focus on, structural features in volume data. We show that TA yields a high data reduction at competitive rate distortion and that, at the same time, it provides a natural means for multiscale volume feature representation. |
|
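The abstract above describes higher-order tensor approximation (TA) for compressing large volume data. As an illustrative sketch only, not the authors' implementation, the general idea can be shown with a truncated higher-order SVD, one basic Tucker-type approximation; the function names and the choice of plain HOSVD are assumptions for illustration:

```python
import numpy as np

def hosvd_truncate(X, ranks):
    """Truncated higher-order SVD: a basic Tucker-type tensor approximation."""
    factors = []
    for mode, r in enumerate(ranks):
        # Unfold X along `mode` into a matrix and keep the top-r left singular vectors.
        Xm = np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)
        U, _, _ = np.linalg.svd(Xm, full_matrices=False)
        factors.append(U[:, :r])
    # Project X onto the factor bases to obtain the small core tensor.
    core = X
    for mode, U in enumerate(factors):
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, mode, 0), axes=1), 0, mode)
    return core, factors

def reconstruct(core, factors):
    """Expand the core tensor back to the full (approximated) volume."""
    X = core
    for mode, U in enumerate(factors):
        X = np.moveaxis(np.tensordot(U, np.moveaxis(X, mode, 0), axes=1), 0, mode)
    return X
```

Storing only the small core plus the thin factor matrices, rather than the full tensor, is what yields the data reduction the abstract refers to.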
N Augsten, Michael Hanspeter Böhlen, J Gamper, The pq-gram distance between ordered labeled trees, ACM Transactions on Database Systems (TODS), Vol. 35 (1), 2010. (Journal Article)
When integrating data from autonomous sources, exact matches of data items that represent the same real-world object often fail due to a lack of common keys. Yet in many cases structural information is available and can be used to match such data. Typically the matching must be approximate since the representations in the sources differ. We propose pq-grams to approximately match hierarchical data from autonomous sources and define the pq-gram distance between ordered labeled trees as an effective and efficient approximation of the fanout weighted tree edit distance. We prove that the pq-gram distance is a lower bound of the fanout weighted tree edit distance and give a normalization of the pq-gram distance for which the triangle inequality holds. Experiments on synthetic and real-world data (residential addresses and XML) confirm the scalability of our approach and show the effectiveness of pq-grams. |
|
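The abstract above defines pq-grams informally. The following sketch illustrates the general idea in Python; the tree encoding as (label, children) tuples, the dummy-node padding, and the normalization used here are simplifications for illustration and need not match the paper's precise definitions:

```python
from collections import Counter

def pq_grams(tree, p=2, q=3, star="*"):
    """Bag of pq-grams of an ordered labeled tree given as (label, [children])."""
    grams = []
    def rec(node, anc):
        label, children = node
        # Stem: the node plus up to p-1 ancestors, padded with dummy labels.
        stem = (anc + (label,))[-p:]
        stem = (star,) * (p - len(stem)) + stem
        # Children window, padded with q-1 dummy labels on each side.
        kids = [star] * (q - 1) + [c[0] for c in children] + [star] * (q - 1)
        for i in range(len(kids) - q + 1):
            grams.append(stem + tuple(kids[i:i + q]))
        for c in children:
            rec(c, anc + (label,))
    rec(tree, ())
    return Counter(grams)

def pq_gram_distance(t1, t2, p=2, q=3):
    """1 - 2*|bag intersection| / (|P1| + |P2|); 0 for identical trees."""
    g1, g2 = pq_grams(t1, p, q), pq_grams(t2, p, q)
    inter = sum((g1 & g2).values())
    return 1.0 - 2.0 * inter / (sum(g1.values()) + sum(g2.values()))
```

Two trees that share most of their structure share most of their pq-grams, so small edits produce small distances, which is what makes the measure an approximation of tree edit distance.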
Helmut Max Dietl, M Grossmann, M Lang, Revenue sharing and competitive balance in a dynamic contest model, Review of Industrial Organization, Vol. 36 (1), 2010. (Journal Article)
This paper presents a dynamic model of talent investments in a team sports league with an infinite time horizon. We show that the clubs’ investment decisions and the effects of revenue sharing on competitive balance depend on the following three factors: (i) the cost function of talent investments, (ii) the clubs’ market sizes, and (iii) the initial endowments of talent stock. We analyze how these factors interact in the transition to the steady state as well as in the steady state itself. |
|
Helmut Max Dietl, M Lang, S Werner, The effect of luxury taxes on competitive balance, club profits, and social welfare in sports leagues, International Journal of Sport Finance, Vol. 5 (1), 2010. (Journal Article)
This paper presents a model of a professional sports league and analyzes the effect of luxury taxes on competitive balance, club profits, and social welfare. It shows that a luxury tax increases aggregate salary payments in the league and produces a more balanced league. Moreover, a higher tax rate increases the profits of large-market clubs, whereas the profits of small-market clubs only increase if the tax rate is not set inadequately high. Finally, we show that social welfare increases with a luxury tax. |
|
M Guitart-Masip, N Bunzeck, Klaas Enno Stephan, R J Dolan, E Düzel, Contextual novelty changes reward representations in the striatum, Journal of Neuroscience, Vol. 30 (5), 2010. (Journal Article)
Reward representation in ventral striatum is boosted by perceptual novelty, although the mechanism of this effect remains elusive. Animal studies indicate a functional loop (Lisman and Grace, 2005) that includes hippocampus, ventral striatum, and midbrain as being important in regulating salience attribution within the context of novel stimuli. According to this model, reward responses in ventral striatum or midbrain should be enhanced in the context of novelty even if reward and novelty constitute unrelated, independent events. Using fMRI, we show that trials with reward-predictive cues and subsequent outcomes elicit higher responses in the striatum if preceded by an unrelated novel picture, indicating that reward representation is enhanced in the context of novelty. Notably, this effect was observed solely when reward occurrence, and hence reward-related salience, was low. These findings support a view that contextual novelty enhances neural responses underlying reward representation in the striatum and concur with the effects of novelty processing as predicted by the model of Lisman and Grace (2005). |
|
C Lamm, A N Meltzoff, J Decety, How do we empathize with someone who is not like us? A functional magnetic resonance imaging study, Journal of Cognitive Neuroscience, Vol. 22 (2), 2010. (Journal Article)
Previous research on the neural underpinnings of empathy has been limited to affective situations experienced in a similar way by an observer and a target individual. In daily life we also interact with people whose responses to affective stimuli can be very different from our own. How do we understand the affective states of these individuals? We used functional magnetic resonance imaging to assess how participants empathize with the feelings of patients who reacted with no pain to surgical procedures but with pain to a soft touch. Empathy for pain of these patients activated the same areas (insula, medial/anterior cingulate cortex) as empathy for persons who responded to painful stimuli in the same way as the observer. Empathy in a situation that was aversive only for the observer but neutral for the patient recruited areas involved in self-other distinction (dorsomedial prefrontal cortex) and cognitive control (right inferior frontal cortex). In addition, effective connectivity between the latter and areas implicated in affective processing was enhanced. This suggests that inferring the affective state of someone who is not like us can rely upon the same neural structures as empathy for someone who is similar to us. When strong emotional response tendencies exist though, these tendencies have to be overcome by executive functions. Our results demonstrate that the fronto-cortical attention network is crucially involved in this process, corroborating that empathy is a flexible phenomenon which involves both automatic and controlled cognitive mechanisms. Our findings have important implications for the understanding and promotion of empathy, demonstrating that regulation of one's egocentric perspective is crucial for understanding others. |
|
C H Kasess, Klaas Enno Stephan, A Weissenbacher, L Pezawas, E Moser, C Windischberger, Multi-subject analyses with dynamic causal modeling, NeuroImage, Vol. 49 (4), 2010. (Journal Article)
Currently, most studies that employ dynamic causal modeling (DCM) use random-effects (RFX) analysis to make group inferences, applying a second-level frequentist test to subjects' parameter estimates. In some instances, however, fixed-effects (FFX) analysis can be more appropriate. Such analyses can be implemented by combining the subjects' posterior densities according to Bayes' theorem either on a multivariate (Bayesian parameter averaging or BPA) or univariate basis (posterior variance weighted averaging or PVWA), or by applying DCM to time-series averaged across subjects beforehand (temporal averaging or TA). While all these FFX approaches have the advantage of allowing for Bayesian inferences on parameters, a systematic comparison of their statistical properties has been lacking so far. Based on simulated data generated from a two-region network we examined the effects of signal-to-noise ratio (SNR) and population heterogeneity on group-level parameter estimates. Data sets were simulated assuming either a homogeneous large population (N=60) with constant connectivities across subjects or a heterogeneous population with varying parameters. TA showed advantages at lower SNR but is limited in its applicability. Because BPA and PVWA take into account posterior (co)variance structure, they can yield non-intuitive results when only considering posterior means. This problem is relevant for high SNR data, pronounced parameter interdependencies and when FFX assumptions are violated (i.e. inhomogeneous groups). It diminishes with decreasing SNR and is absent for models with independent parameters or when FFX assumptions are appropriate. Group results obtained with these FFX approaches should therefore be interpreted carefully by considering estimates of dependencies among model parameters. |
|
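The abstract above mentions posterior variance weighted averaging (PVWA) of subjects' univariate posteriors. As a minimal sketch of the underlying idea, assuming Gaussian posteriors and a flat prior (so that precisions simply add and means are precision-weighted; the function name `pvwa` and the flat-prior simplification are assumptions, not the paper's exact formulation):

```python
def pvwa(means, variances):
    """Fixed-effects combination of univariate Gaussian posteriors.

    Under a flat prior, Bayes' theorem says the group posterior precision is
    the sum of the subject precisions, and the group mean is the
    precision-weighted average of the subject means.
    """
    precisions = [1.0 / v for v in variances]
    total = sum(precisions)
    mean = sum(p * m for p, m in zip(precisions, means)) / total
    return mean, 1.0 / total
```

Note how a subject with a tight posterior (small variance) dominates the group estimate, which is one reason the abstract warns that such FFX averages can look non-intuitive when subjects are heterogeneous.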
Jacob Goeree, Margaret A McConnell, Tiffany Mitchell, Tracey Tromp, Leeat Yariv, The 1/d law of giving, American Economic Journal: Microeconomics, Vol. 2 (1), 2010. (Journal Article)
We combine survey data on friendship networks and individual characteristics with experimental observations from dictator games. Dictator offers are primarily explained by social distance - giving follows a simple inverse distance law. While student demographics play a minor role in explaining offer amounts, individual heterogeneity is important for network formation. In particular, we detect significant homophilous behavior - students connect to others similar to them. Moreover, the network data reveal a strong preference for cliques - students connect to those already close. The study is one of the first to identify network architecture with individual behavior in a strategic context. |
|
Boris Glavic, Formal Foundation of Contribution Semantics and Provenance Computation through Query Rewrite in TRAMP, February 2010. (Other Publication)
In this report we present the theoretical foundation of TRAMP, a schema-mapping debugging system that provides provenance and query support as debugging functionality for schema-mapping scenarios. TRAMP is an extension of Perm, a relational provenance management system developed at the University of Zurich. We do not focus on the debugging functionality added by TRAMP, but rather on the theoretical foundation of the provenance types provided by the system. In chapter 2 we present the contribution semantics for data provenance, transformation provenance, and mapping provenance used by TRAMP. Contribution semantics define which parts of the input (in the case of data provenance) and which operators of a transformation (in the case of transformation provenance) belong to the provenance of an output of a transformation; in other words, contribution semantics define "what provenance actually is". Based on the presented contribution semantics, we demonstrate in chapter 3 how provenance of these types can be computed using algebraic rewrite techniques, and we prove the correctness and completeness of the algorithms used to compute provenance. |
|
Alexander Schäfer, Investigation and Evaluation of Distributed Storage Strategies for Video Streaming Application, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
Video peer-to-peer streaming and video-on-demand applications pose new challenges for data storage. One such application is LiveShift, currently being developed at the Communication Systems Group of the University of Zurich, which includes retrieval and long-term storage of video data but does not yet remove data. In this thesis, storage strategies for video streaming and video on demand in peer-to-peer environments are compared, and cache replacement policies are selected, modified, and extended to work optimally with LiveShift. Changes were made to LiveShift, e.g. introducing the notion of intervals to avoid fragmentation. The thesis was intended to conclude by selecting an optimal cache algorithm offering the best trade-off between hit/miss rate and delay, but due to statistical errors the comparison did not reveal a clear favorite. Instead, the characteristics of the algorithms were confirmed in specifically tailored scenarios that demonstrate their respective advantages. |
|