Markus Leippold, Institutionelle Investoren und Aktienrenditen: Der steigende Einfluss von Benchmarks auf Risikoprämien, 2011. (Other Publication)
|
|
Markus Leippold, CoCo Bonds als regulatorisches Kapital, 2011. (Other Publication)
|
|
Ziwei Yang, Shen Gao, Jianliang Xu, Byron Choi, Authentication of range query results in MapReduce environments, In: Proceedings of the third international workshop on Cloud data management, ACM, New York, NY, USA, 2011. (Conference or Workshop Paper published in Proceedings)
|
|
Shen Gao, Jianliang Xu, Bingsheng He, Byron Choi, Haibo Hu, PCMLogging: reducing transaction logging overhead with PCM, In: 20th ACM international conference on Information and knowledge management, ACM, New York, NY, USA, 2011. (Conference or Workshop Paper published in Proceedings)
|
|
Robert Göx, Innerbetriebliche Verrechnungspreise zur Koordination von Handels- und Investitionsanreizen: Kommentar zum Beitrag von Clemens Löffler, Thomas Pfeiffer, Ulf Schiller, Schmalenbachs Zeitschrift für betriebswirtschaftliche Forschung, Vol. 2011 (63), 2011. (Journal Article)
|
|
Robert Göx, Die Höhe der Managerlöhne in grossen Schweizer Publikumsaktiengesellschaften: Problemfall oder drohende Überregulierung?, Die Unternehmung, Vol. 2011 (1), 2011. (Journal Article)
|
|
Oliver M Dürr, Robert Göx, Strategic incentives for keeping one set of books in international transfer pricing, Journal of Economics and Management Strategy, Vol. 20 (1), 2011. (Journal Article)
|
|
Thomas Keil, Tomi Laamanen, When rivals merge, think before you follow suit, Harvard Business Review, Vol. 89 (2), 2011. (Journal Article)
Many companies react to competitors’ acquisition sprees reflexively, by launching bids of their own. Smart managers should consider other moves. |
|
Yuval Deutsch, Thomas Keil, Tomi Laamanen, A dual agency view of board compensation: the joint effects of outside director and CEO stock options on firm risk, Strategic Management Journal, Vol. 32 (2), 2011. (Journal Article)
|
|
M. Lang, M. Grossmann, P. Theiler, The Sugar Daddy Game: How Wealthy Investors Change Competition in Professional Team Sports, Journal of Institutional and Theoretical Economics, 2011. (Journal Article)
|
|
Taichi Haruna, Kohei Nakajima, Permutation complexity via duality between values and orderings, Physica D: Nonlinear Phenomena, 2011. (Journal Article)
|
|
Kohei Nakajima, Taichi Haruna, Self-organized perturbations enhance class IV behavior and 1/f power spectrum in elementary cellular automata, Biosystems, Vol. 105 (3), 2011. (Journal Article)
|
|
Jonas Tappolet, Managing Temporal Graph Data While Preserving Semantics, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2011. (Dissertation)
This thesis investigates the introduction of time as a first-class citizen to RDF-based knowledge bases as used by the Linked Data movement. By presenting EvoOnt, a use-case scenario from the field of software comprehension, we demonstrate a particular field that (1) benefits from the Semantic Web’s tools and techniques, (2) has a high update rate and (3) is a candidate dataset for Linked Data. EvoOnt is a set of OWL ontologies that cover three aspects of the software development process: a source code ontology that abstracts the elements of object-oriented code, a defect tracker ontology that models the contents of a defect database (a.k.a. bug tracker) and finally a version ontology that allows the expression of multiple versions of a source code file. In multiple experiments we demonstrate how Semantic Web tools and techniques can be used to perform common tasks known from software comprehension. Derived from this use case, we show how the temporal dimension can be leveraged in RDF data. Firstly, we present a representation format for the annotation of RDF triples with temporal validity intervals. We propose a special usage of named graphs in order to encode temporal triples. Secondly, we demonstrate how such a knowledge base can be queried using a temporal syntax extension of the SPARQL query language. Next, we present two indexing structures that speed up the processing and querying of temporally annotated data. Furthermore, we demonstrate how additional knowledge can be extracted from the temporal dimension by matching patterns that contain temporal constraints. All these elements put together outline a method that can be used to make datasets published as Linked Data more robust to possible invalidations through updates of linked datasets. Additionally, processing and querying can be improved through sophisticated index structures while deriving additional information from the history of a dataset. |
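The named-graph encoding of temporal validity described in the abstract can be sketched minimally as follows. This is an illustrative toy (plain Python instead of an RDF store; the `Interval`, `named_graphs` and `valid_at` names and the `ex:` identifiers are hypothetical), not the thesis's actual representation format: each named graph holds the triples valid during one interval, and a snapshot query collects the graphs whose interval covers a time point.

```python
# Hypothetical sketch: temporal annotation of RDF-style triples via named
# graphs, one graph per validity interval, queried by time point.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    start: int  # e.g. revision number or timestamp
    end: int    # exclusive; use a large sentinel for "still valid"

    def contains(self, t: int) -> bool:
        return self.start <= t < self.end

# Each named graph holds the triples valid during its interval.
named_graphs: dict[Interval, set[tuple[str, str, str]]] = {
    Interval(1, 5): {("ex:Foo", "ex:hasMethod", "ex:bar")},
    Interval(5, 99): {("ex:Foo", "ex:hasMethod", "ex:baz")},
}

def valid_at(t: int) -> set[tuple[str, str, str]]:
    """Snapshot query: all triples valid at time point t."""
    result: set[tuple[str, str, str]] = set()
    for interval, triples in named_graphs.items():
        if interval.contains(t):
            result |= triples
    return result

print(valid_at(3))  # {("ex:Foo", "ex:hasMethod", "ex:bar")}
```

A temporal SPARQL extension would then translate interval constraints into selections over these graphs.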
|
Helmut Hauser, Auke J Ijspeert, Rudolf M Füchslin, Rolf Pfeifer, Wolfgang Maass, Towards a theoretical foundation for morphological computation with compliant bodies, Biological Cybernetics, Vol. 105 (5-6), 2011. (Journal Article)
The control of compliant robots is, due to their often nonlinear and complex dynamics, inherently difficult. The vision of morphological computation proposes to view these aspects not only as problems, but also as parts of the solution. Non-rigid body parts are no longer seen as imperfect realizations of rigid body parts, but rather as potential computational resources. The applicability of this vision has already been demonstrated for a variety of complex robot control problems. Nevertheless, a theoretical basis for understanding the capabilities and limitations of morphological computation has been missing so far. We present a model for morphological computation with compliant bodies, where a precise mathematical characterization of the potential computational contribution of a complex physical body is feasible. The theory suggests that complexity and nonlinearity, typically unwanted properties of robots, are desired features in order to provide computational power. We demonstrate that simple generic models of physical bodies, based on mass-spring systems, can be used to implement complex nonlinear operators. By adding a simple readout (which is static and linear) to the morphology, such devices are able to emulate complex mappings of input to output streams in continuous time. Hence, by outsourcing parts of the computation to the physical body, the difficult problem of learning to control a complex body could be reduced to a simple and perspicuous learning task, which cannot get stuck in local minima of an error function. |
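The core construction in the abstract (a fixed compliant "body" plus a trained static linear readout) can be sketched in a few lines. This is a hedged toy under assumed parameters, not the paper's model: it uses a bank of damped springs with an ad-hoc cubic nonlinearity, semi-implicit Euler integration, an arbitrary squared-input target, and least squares for the readout; only the readout weights are learned, which is why the training problem stays convex.

```python
# Illustrative sketch: mass-spring "body" + static linear readout.
# All parameters, the cubic spring term and the target map are assumptions.
import numpy as np

rng = np.random.default_rng(0)
T, dt, n = 2000, 0.01, 20

# Input stream: a slow sine with additive noise (illustrative).
u = np.sin(0.5 * dt * np.arange(T)) + 0.3 * rng.standard_normal(T)

# Heterogeneous nonlinear springs: random stiffness and damping, cubic term.
k = rng.uniform(1.0, 10.0, n)
d = rng.uniform(0.1, 1.0, n)
x = np.zeros(n)
v = np.zeros(n)
states = np.empty((T, n))
for t in range(T):
    a = -k * x - x**3 - d * v + u[t]   # nonlinear spring force + input drive
    v += dt * a                        # semi-implicit Euler step
    x += dt * v
    states[t] = x

# Static linear readout trained by least squares to emulate a nonlinear
# target mapping (here: the squared input).
target = u**2
W, *_ = np.linalg.lstsq(states, target, rcond=None)
pred = states @ W
print("training MSE:", float(np.mean((pred - target) ** 2)))
```

The nonlinearity lives in the body, not the readout, mirroring the paper's claim that complex physical dynamics can supply computational power.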
|
William McKinley, Matthew S Wood, Gyewan Moon, Low heed in organization theory, M@n@gement, Vol. 14 (3), 2011. (Journal Article)
This essay borrows the construct of “heedful interrelating” from Weick and Roberts’s (1993) study of aircraft carrier flight decks, and uses the construct to analyze the social processes that structure contemporary scholarship in organization theory. We argue that organization theory often operates as a low-heed discipline, in which scholars take minimal heed of the contributions of their fellows. This condition of low heed is revealed in several specific aspects of the discipline: lack of attention to testing previously published theories, lack of emphasis on replication of published empirical research, low standardization of construct definition and measurement, and a minimally developed division of labor between theorists and empirical researchers. We explore the causes of this low-heed state in contemporary organization theory, and we also enumerate some advantages of low heed in the discipline. We devote attention to the effects of low heed on the training of newcomers to the field, and we argue that doctoral education in organization theory is both an effect and a cause of low heed. Finally, we offer some suggestions for incorporating more scholarly heed into organization theory without destroying the major advantage of a low-heed discipline – freedom of inquiry. We also indicate how a cautious increase of heedful interrelating in organization theory might improve the perceived relevance of its research results for management practice. |
|
Katharina Dittrich, Stéphane Guérard, David Seidl, Meetings in the strategy process: toward an integrative framework, Academy of Management. Proceedings (1), 2011. (Journal Article)
During the last three decades, scholars from communication studies, political science, sociology, cultural anthropology and management science have studied the characteristics and dynamics of meetings from different perspectives. This has resulted in a large, though very fragmented, body of knowledge about meetings and their different functions in the organization. So far, however, this knowledge has not been systematically related to the strategy process. The purpose of this review is to organize the different literatures by identifying the meeting functions (coordination, cognitive, political, symbolic and social) as well as the meeting practices (initiation, conduct and termination practices) and by outlining the impact of meetings on the strategy process. This results in an integrative framework which synthesizes the literature and which serves as a guide for future research. |
|
R. Käppeli, S. C. Whitehouse, Simon Scheidegger, U.-L. Pen, M. Liebendörfer, FISH: a three-dimensional parallel magnetohydrodynamics code for astrophysical applications, The Astrophysical Journal Supplement Series, Vol. 195 (2), 2011. (Journal Article)
FISH is a fast and simple ideal magnetohydrodynamics code that scales to ~10,000 processes for a Cartesian computational domain of ~1000³ cells. The simplicity of FISH has been achieved by the rigorous application of the operator splitting technique, while second-order accuracy is maintained by the symmetric ordering of the operators. Between directional sweeps, the three-dimensional data are rotated in memory so that the sweep is always performed in a cache-efficient way along the direction of contiguous memory. Hence, the code only requires a one-dimensional description of the conservation equations to be solved. This approach also enables an elegant novel parallelization of the code that is based on persistent communications with MPI for cubic domain decomposition on machines with distributed memory. This scheme is then combined with an additional OpenMP parallelization of different sweeps that can take advantage of clusters of shared memory. We document the detailed implementation of a second-order total variation diminishing advection scheme based on flux reconstruction. The magnetic fields are evolved by a constrained transport scheme. We show that the subtraction of a simple estimate of the hydrostatic gradient from the total gradients can significantly reduce the dissipation of the advection scheme in simulations of gravitationally bound hydrostatic objects. Through its simplicity and efficiency, FISH is as well suited for hydrodynamics classes as for large-scale astrophysical simulations on high-performance computer clusters. In preparation for the release of a public version, we demonstrate the performance of FISH in a suite of astrophysically orientated test cases. |
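The sweep-and-rotate pattern described in the abstract (apply a 1D solver along the contiguous axis, then rotate the cube in memory) can be sketched as follows. This is an assumed toy, not FISH's actual scheme: the "solver" is a trivial explicit diffusion stencil with periodic boundaries, and the function names are illustrative.

```python
# Illustrative sketch: directional sweeps always run along the contiguous
# (last) axis; between sweeps the 3D data cube is rotated in memory.
import numpy as np

def sweep_1d(a: np.ndarray, c: float = 0.1) -> np.ndarray:
    """One explicit diffusion step along the last axis (periodic, toy stand-in
    for a 1D conservation-law solver)."""
    return a + c * (np.roll(a, 1, axis=-1) - 2 * a + np.roll(a, -1, axis=-1))

def step_3d(a: np.ndarray) -> np.ndarray:
    for _ in range(3):  # one sweep per spatial direction
        a = sweep_1d(a)
        # Rotate axes and copy so the next sweep again runs along
        # contiguous memory (the cache-efficiency trick from the abstract).
        a = np.ascontiguousarray(np.moveaxis(a, 0, -1))
    return a  # after three rotations the axes are back in original order

cube = np.random.default_rng(1).random((16, 16, 16))
out = step_3d(cube)
print(out.shape)  # (16, 16, 16)
```

Only a one-dimensional update rule is ever written; the rotation supplies the other two directions.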
|
Ekaterina Kuleshova, Provenance in temporal databases: Facharbeit, 2011. (Other Publication)
The purpose of this paper is to develop lineage and provenance tracing techniques for temporal databases. Using the snapshot reducibility property of temporal databases, we define pointwise lineage traceability for temporal databases. Merging time points with the same lineage in the result of temporal operators allows an interval-based model while still allowing lineage traceability. We illustrate the algebra and its lineage with examples. To trace lineage we need to materialize intermediate results. Moreover, lineage only tells us which tuples contribute to the query result, not how they contribute. We therefore further define relations annotated with provenance semirings. To be able to perform queries on such relations, we generalize the algebra to operate on them, so that query execution propagates provenance information. Finally, we define a positive relational algebra, which propagates how-provenance.
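The semiring annotation mentioned above can be sketched concretely. This is a minimal illustration, not the paper's implementation: each tuple carries a polynomial over base-tuple ids, encoded as a `Counter` mapping monomials (sorted tuples of ids) to coefficients; union adds polynomials, join multiplies them. All relation names and ids are made up.

```python
# Hypothetical sketch: how-provenance via semiring-annotated tuples.
# Polynomials over base-tuple ids are Counters {monomial: coefficient}.
from collections import Counter

def plus(p: Counter, q: Counter) -> Counter:
    """Polynomial addition: union combines alternative derivations."""
    return p + q

def times(p: Counter, q: Counter) -> Counter:
    """Polynomial multiplication: join uses both inputs jointly."""
    out: Counter = Counter()
    for m, cm in p.items():
        for n, cn in q.items():
            out[tuple(sorted(m + n))] += cm * cn
    return out

# Base relations, one annotation per tuple (ids r1, r2, s1 are illustrative).
R = {("a1",): Counter({("r1",): 1}), ("a2",): Counter({("r2",): 1})}
S = {("a1",): Counter({("s1",): 1})}

# Natural join: matching tuples, provenance multiplied.
J = {t: times(R[t], S[t]) for t in R.keys() & S.keys()}
# Union of two copies of J: provenance added; the coefficient 2 records
# that the tuple now has two derivations, which plain lineage cannot express.
U = {t: plus(J[t], J[t]) for t in J}
print(J[("a1",)], U[("a1",)])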
|
|
Stefan Gassmann, Die Ökonomie der Firma: Theorie und empirische Evidenz, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2011. (Bachelor's Thesis)
|
|
Philipp Schlegel, Automatic transfer function generation and extinction-based approaches in direct volume visualization, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2011. (Dissertation)
Direct volume visualization has become an important tool in many domains for visualizing and examining volumetric datasets. The tremendous increase in computing power of the hardware over the past years makes it possible to immediately visualize volumetric datasets obtained from scanning devices at fully interactive frame rates. However, despite this change of paradigm compared to the slow offline methods of the past, direct volume visualization suffers from disadvantages that constrain an immediate, reliable analysis of volumetric datasets. This thesis begins with an overview of different methods for direct volume visualization, followed by an in-depth review of the theoretical foundation including inherent challenges. Subsequently, selected state-of-the-art techniques used in this thesis are explained in detail. One challenge that all techniques have in common is the dependency on good transfer functions. Only good transfer functions allow for the right insight into the dataset, permitting a reliable analysis. These transfer functions are often constructed manually in a time-consuming and cumbersome trial-and-error process. We propose an automated general-purpose approach for generating a set of best transfer functions based on information theory. Our algorithm appraises the information content of the images generated by a particular transfer function when rotating the dataset, as is the case in interactive sessions. Quantifying the quality of a transfer function in this way enables a directed search for the set of best transfer functions in a feedback loop employing a combination of two different optimization algorithms. This set of best, distinct transfer functions helps the user to gain an immediate overview of each facet of a dataset. When visualizing volumetric datasets, it is of major importance that domain experts are able to recognize small features, to distinguish the relationship and connectivity between them and to get the right perception. For this, the applied illumination and shading model plays an important part. Sophisticated models including realistic-looking directional shadows, ambient occlusion and color bleeding effects can greatly enhance the perception. Unfortunately, common models exhibiting these effects are expensive to compute and not suitable for interactive applications. We present a method showing how these effects can be applied to GPU volume ray-casting while fully maintaining interactivity, based on the original, exponential extinction coefficient of the volume rendering integral. Exploiting the fact that the extinction coefficient is summable, our framework is built on top of a 3D summed area table that allows for quick lookups of extinction queries. Technically, volumetric datasets consist of discrete scalar or sometimes vector data. As the resolution of this data hardly ever fits the resolution of the output device, the data needs to be interpolated or reconstructed. Volume visualization methods based on 3D textures can profit from the fast built-in trilinear interpolation of the hardware. However, trilinear interpolation is not the first choice when it comes to image quality. Volume splatting, on the other hand, is a volume visualization technique that makes it easy to integrate arbitrary interpolation schemes. The performance of volume splatting is directly related to the applied interpolation scheme and the resulting interpolation kernel, respectively. In this thesis we introduce an algorithm for volume splatting that greatly enhances performance by reducing the required number of splatting operations derived from interpolation kernel slices. Further, we show how the image quality of volume visualization can be enhanced by using the original, exponential extinction coefficient of the volume rendering integral instead of common alpha-blending simplifications. |
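The 3D summed-area-table lookup mentioned in the abstract can be sketched as follows. This is an assumed CPU toy, not the thesis's GPU implementation: any axis-aligned box sum over extinction coefficients (and hence the transmittance exp(-sum)) is obtained from eight table reads by inclusion-exclusion; the volume and box bounds are illustrative.

```python
# Illustrative sketch: 3D summed area table for O(1) extinction box sums.
import numpy as np

vol = np.random.default_rng(2).random((32, 32, 32)) * 0.05  # extinction coeffs

# Zero-padded SAT so boundary lookups need no special cases.
sat = np.zeros((33, 33, 33))
sat[1:, 1:, 1:] = vol.cumsum(0).cumsum(1).cumsum(2)

def box_sum(x0, y0, z0, x1, y1, z1):
    """Sum of vol[x0:x1, y0:y1, z0:z1] via 8 SAT lookups (inclusion-exclusion)."""
    s = sat
    return (s[x1, y1, z1] - s[x0, y1, z1] - s[x1, y0, z1] - s[x1, y1, z0]
            + s[x0, y0, z1] + s[x0, y1, z0] + s[x1, y0, z0] - s[x0, y0, z0])

tau = box_sum(4, 4, 4, 12, 12, 12)            # optical depth of the box
transmittance = np.exp(-tau)                   # exponential extinction
assert np.isclose(tau, vol[4:12, 4:12, 4:12].sum())
```

Because the table is built once per volume, each extinction query during ray-casting costs a constant number of reads regardless of box size.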
|