Maxim Makhinya, S Eilemann, Renato Pajarola, Fast compositing for cluster-parallel rendering, In: Eurographics Symposium on Parallel Graphics and Visualization, 2010-05-02. (Conference or Workshop Paper published in Proceedings)
The image compositing stages in cluster-parallel rendering for gathering and combining partial rendering results into a final display frame are fundamentally limited by node-to-node image throughput. Therefore, efficient image coding, compression and transmission must be considered to minimize that bottleneck. This paper studies the different performance-limiting factors such as image representation, region-of-interest detection and fast image compression. Additionally, we show improved compositing performance using lossy YUV subsampling, and we propose a novel fast region-of-interest detection algorithm that can improve sort-last parallel rendering in particular. |
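As an aside, the kind of lossy YUV subsampling mentioned in the abstract can be illustrated with a minimal, pure-Python sketch. The conversion uses the standard BT.601 coefficients; the 2:1 horizontal chroma subsampling below is an illustrative assumption, not necessarily the exact scheme the paper evaluates:

```python
# Minimal illustration of lossy YUV chroma subsampling (BT.601 weights).
# The 2:1 horizontal subsampling is an illustrative choice, not
# necessarily the exact scheme evaluated in the paper.

def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel (components 0-255) to BT.601 Y'UV."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

def subsample_row(pixels):
    """Keep luma at full resolution; average chroma over pixel pairs."""
    yuv = [rgb_to_yuv(*p) for p in pixels]
    luma = [y for y, _, _ in yuv]
    chroma = [((yuv[i][1] + yuv[i + 1][1]) / 2,
               (yuv[i][2] + yuv[i + 1][2]) / 2)
              for i in range(0, len(yuv) - 1, 2)]
    return luma, chroma

row = [(255, 0, 0), (250, 5, 5), (0, 0, 255), (5, 5, 250)]
luma, chroma = subsample_row(row)  # 4 luma samples, 2 chroma pairs
```

Keeping luma at full resolution while halving the chroma samples shrinks two of the three channels' payload, which is why such encodings can raise node-to-node image throughput at modest visual cost.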
|
A Lamkanfi, S Demeyer, Emanuel Giger, B Goethals, Predicting the severity of a reported bug, In: 7th Working Conference on Mining Software Repositories, 2010-05-02. (Conference or Workshop Paper published in Proceedings)
The severity of a reported bug is a critical factor in deciding how soon it needs to be fixed. Unfortunately, while clear guidelines exist on how to assign the severity of a bug, it remains an inherently manual process left to the person reporting the bug. In this paper we investigate whether we can accurately predict the severity of a reported bug by analyzing its textual description using text mining algorithms. Based on three cases drawn from the open-source community (Mozilla, Eclipse and GNOME), we conclude that given a training set of sufficient size (approximately 500 reports per severity), it is possible to predict the severity with reasonable accuracy (both precision and recall vary between 0.65 and 0.75 for Mozilla and Eclipse, and between 0.70 and 0.85 in the case of GNOME). |
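For readers unfamiliar with this class of text-mining classifiers, the sketch below shows a toy multinomial naive Bayes over bug-report summaries, written with the standard library only. It is a generic illustration of the approach, not the authors' actual pipeline, data, or choice of algorithm:

```python
import math
from collections import Counter, defaultdict

# Toy multinomial naive Bayes over bug-report summaries.
# Generic text-mining sketch, not the paper's actual setup.

def train(reports):
    """reports: list of (severity_label, text) pairs."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    vocab = set()
    for label, text in reports:
        label_counts[label] += 1
        for w in text.lower().split():
            word_counts[label][w] += 1
            vocab.add(w)
    return word_counts, label_counts, vocab

def predict(model, text):
    word_counts, label_counts, vocab = model
    total = sum(label_counts.values())
    best, best_lp = None, float("-inf")
    for label in label_counts:
        lp = math.log(label_counts[label] / total)
        n = sum(word_counts[label].values())
        for w in text.lower().split():
            # Laplace smoothing over the shared vocabulary.
            lp += math.log((word_counts[label][w] + 1) / (n + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

model = train([
    ("critical", "crash on startup data loss"),
    ("critical", "segfault crash kernel panic"),
    ("minor", "typo in tooltip text"),
    ("minor", "button label misaligned"),
])
print(predict(model, "crash with data loss"))  # prints "critical"
```

The paper's point is that with roughly 500 reports per severity class, classifiers of this general kind reach usable precision and recall on real bug trackers.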
|
Prashant Goswami, Maxim Makhinya, Jonas Bösch, Renato Pajarola, Scalable parallel out-of-core terrain rendering, In: Eurographics Symposium on Parallel Graphics and Visualization, 2010-05-02. (Conference or Workshop Paper published in Proceedings)
In this paper, we introduce a novel out-of-core parallel and scalable technique for rendering massive terrain datasets. The parallel rendering task decomposition is implemented on top of an existing terrain renderer using an open source framework for cluster-parallel rendering. Our approach achieves parallel rendering by division of the rendering task either in sort-last (database) or sort-first (screen domain) manner and presents an optimal method for implicit load balancing in the former mode. The efficiency of our approach is validated using massive elevation models. |
|
T Reinhard, Martin Glinz, Automatic placement of link labels in diagrams, In: ICSE 2010 Workshop on Flexible Modeling Tools (FlexiTools 2010), 2010-05-02. (Conference or Workshop Paper published in Proceedings)
While diagrams play a central role in the software lifecycle, computer-based modeling tools are often not used in practice. At least to some extent, this is due to their lack of flexibility and support for recurring and tedious "model administration" tasks. Such tasks, like the rearrangement of existing model elements to provide the space required by a new element or the manual adjustment of lines after moving an element, distract the user from the actual modeling activities.
In this paper, we present an approach that relieves the modeler of the painful manual placement of the labels accompanying the links in a diagram by handing this task over to the modeling tool. Additionally, we give a short overview of other modeling activities that come along with the creation and manipulation of a diagram but should be handled by the tool and not the user. |
|
Michael Würsch, Giacomo Ghezzi, G Reif, H C Gall, Supporting developers with natural language queries, In: 32nd ACM/IEEE International Conference on Software Engineering, 2010-05-02. (Conference or Workshop Paper published in Proceedings)
The feature list of modern IDEs is growing steadily and mastering these tools becomes more and more demanding, especially for novice programmers. Despite their remarkable capabilities, IDEs often still cannot directly answer the questions that arise during program comprehension tasks. Instead, developers have to map their questions to multiple concrete queries that can be answered only by combining several tools and manually examining the output of each of them to distill an appropriate answer. Existing approaches have in common that they are either limited to a set of predefined, hardcoded questions, or that they require the user to learn a specific query language suitable only for that limited purpose. We present a framework to query for information about a software system using guided-input natural language resembling plain English. For that, we model data extracted by classical software analysis tools with an OWL ontology and use knowledge processing technologies from the Semantic Web to query it. We also present a case study that demonstrates how our framework can be used to answer queries about static source code information for program comprehension purposes. |
|
Alessandro Scopelliti, Competition And Economic Growth: A Critical Survey Of The Theoretical Literature, Journal of Applied Economic Sciences, Vol. 5 (11), 2010. (Journal Article)
The paper examines the relationship between competition and economic growth in the theoretical framework described by endogenous growth models, with a specific interest in the policy implications. In this perspective, the key issue in the debate can be presented as follows: do competition policies always create the best conditions for promoting innovation and growth? Or do they also create disincentives for R&D investment decisions, thereby limiting the development of more innovative industries? In order to answer these questions, the paper presents a survey of the theoretical literature on competition and growth and discusses the main models of endogenous growth, both those based on horizontal innovation and those based on vertical innovation. In particular, specific attention is paid to the most recent models of Schumpeterian growth, which show the existence of a non-linear relationship between competition and growth, by considering either the initial degree of competition or the distance from the technological frontier. Finally, the review of the previous models of endogenous growth allows drawing some conclusions about further possible developments of research on the relation between product market competition and economic growth. |
|
Lukas Kress, Perks and Headquarter Location, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Master's Thesis)
|
|
David Gilhawley, Value Creation of Bank Mergers and the Shareholders' View on Diversification, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
|
|
Michael Würsch, G Reif, S Demeyer, H C Gall, Fostering synergies - how semantic web technology could influence software repositories, In: 2nd International Workshop on Search-driven Development: Users, Infrastructure, Tools and Evaluation, 2010-05-01. (Conference or Workshop Paper published in Proceedings)
The state-of-the-art in mining software repositories mirrors software artifacts from various sources into monolithic relational databases. This puts a lot of querying power in the hands of the software miners; however, it comes at the cost of enclosing the data and hampering cross-application reuse. In this paper we discuss four problem scenarios to illustrate that Semantic Web technology is able to overcome these limitations. However, it requires that the software engineering research community agrees on two prerequisites: (a) a common vocabulary to talk about software repositories -- an ontology; (b) a strategy for generating unique and stable references to all software artifacts inside such a repository -- a Uniform Resource Identifier (URI). |
|
M V J Heikkinen, T Casey, Fabio Victora Hecht, Value analysis of centralized and distributed communications and video streaming, Info (Bingley) - The journal of policy, regulation and strategy for telecommunications, information and media, 2010. (Journal Article)
Purpose – When comparing novel centralized and distributed communications and video streaming services, the authors identified a need for a theoretical framework to position a multitude of ICT services and technologies according to their value proposition. The existing literature does not integrate value analysis concepts into a holistic theoretical framework. This paper aims to address this shortcoming by proposing a value analysis framework for ICT services capable of describing the value exchanges between different actors and their role constellations based on technological componentizations.
Design/methodology/approach – The paper evaluates a representative selection of communications and video streaming services; in addition, an extensive literature study of existing value analysis research was conducted to develop the framework and to verify it.
Findings – The paper demonstrates the applicability of the value analysis framework in communications and video streaming case studies, which are technically very different from each other but, at the abstraction level the framework provides, display very similar characteristics in value flows and role constellations.
Research limitations/implications – The value analysis framework could be extended and verified with other case studies and complemented with quantitative modeling and system dynamics.
Originality/value – The authors combine existing literature into a proposal of a holistic value analysis framework and apply it to novel centralized and distributed communications and video streaming services. Both academics and practitioners can use the framework to evaluate the value proposition of ICT services and technologies. |
|
Maik Dierkes, Carsten Erner, Stefan Zeisberger, Investment horizon and the attractiveness of investment strategies: A behavioral approach, Journal of Banking and Finance, Vol. 34 (5), 2010. (Journal Article)
We analyze the attractiveness of investment strategies over a variety of investment horizons from the viewpoint of an investor with preferences described by Cumulative Prospect Theory (CPT), currently the most prominent descriptive theory for decision making under uncertainty. A bootstrap technique is applied using historical return data of 1926–2008. To allow for variety in investors’ preferences, we conduct several sensitivity analyses and further provide robustness checks for the results. In addition, we analyze the attractiveness of the investment strategies based on a set of experimentally elicited preference parameters. Our study reveals that strategy attractiveness substantially depends on the investment horizon. While for almost every preference parameter combination a bond strategy is preferred for the short run, stocks show an outperformance for longer horizons. Portfolio insurance turns out to be attractive for almost every investment horizon. Interestingly, we find probability weighting to be a driving factor for insurance strategies’ attractiveness. |
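As background, the CPT machinery referred to above can be sketched for a simple gains-only lottery. The functional forms and parameter values (alpha = 0.88, gamma = 0.61) are the standard Tversky-Kahneman (1992) estimates, used here purely for illustration; the paper's bootstrap over 1926-2008 return data and its loss-side parameters are not reproduced:

```python
# Sketch of a CPT evaluation for a gains-only lottery, using the
# Tversky-Kahneman (1992) functional forms and parameter estimates.
# Illustration only; not the paper's bootstrap procedure.

ALPHA, GAMMA = 0.88, 0.61

def value(x):
    """Power value function for gains (x >= 0)."""
    return x ** ALPHA

def weight(p):
    """Inverse-S probability weighting function for gains."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

def cpt_value(lottery):
    """lottery: list of (outcome, probability) with outcomes >= 0."""
    # Rank outcomes from best to worst; each decision weight is a
    # difference of weighted decumulative probabilities.
    ranked = sorted(lottery, key=lambda op: op[0], reverse=True)
    total, cum = 0.0, 0.0
    for outcome, prob in ranked:
        pi = weight(cum + prob) - weight(cum)
        total += pi * value(outcome)
        cum += prob
    return total

# A 50/50 gamble between 100 and 0:
v = cpt_value([(100, 0.5), (0, 0.5)])
```

Because `weight(0.5)` lies below 0.5 for these parameters, the gamble is valued at less than half the certain outcome's utility, which hints at how probability weighting can drive the attractiveness of insurance-like strategies found in the study.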
|
Jean-Charles Rochet, Doh-Shin Jeon, The pricing of academic journals: a two-sided market perspective, American Economic Journal: Microeconomics, Vol. 2 (2), 2010. (Journal Article)
More and more academic journals are adopting an open access policy by which articles are accessible free of charge, while publication costs are recovered through author fees. We study the consequences of this open access policy on the quality standard of an electronic academic journal. If the journal's objective were to maximize social welfare, open access would be optimal. However, we show that if the journal has a different objective (such as maximizing readers' utility, the impact of the journal, or its profit), open access tends to induce it to choose a quality standard below the socially efficient level. |
|
Bruno Frey, Katja Rost, Do rankings reflect research quality?, Journal of Applied Economics, Vol. 13 (1), 2010. (Journal Article)
Publication and citation rankings have become major indicators of the scientific worth of universities and determine to a large extent the career of individual scholars. Such rankings do not effectively measure research quality, which should be the essence of any evaluation. These quantity rankings are not objective; two citation rankings, based on different samples, produce entirely different results. For that reason, an alternative ranking is developed as a quality indicator, based on membership on academic editorial boards of professional journals. It turns out that the ranking of individual scholars based on that measure is far from objective. Furthermore, the results differ markedly, depending on whether research quantity or quality is considered. Thus, career decisions based on rankings are dominated by chance and do not reflect research quality. We suggest that evaluations should rely on multiple criteria. Public management should return to approved methods such as engaging independent experts who in turn provide measurements of research quality for their research communities. |
|
Egon Franck, Private firm, public corporation or member's association – Governance structures in European football, International Journal of Sport Finance, Vol. 5 (2), 2010. (Journal Article)
Based on the analysis of the specific industry environment in which football clubs compete, this paper presents a comparative institutional analysis of three paradigmatic structures of football club governance: classical (privately owned) football firms, modern football corporations (stock corporations with dispersed ownership) and members’ associations with an own legal personality (Verein). The results of the analysis are applied to current developments in German and English football and to recent initiatives of the Football Governing Bodies. |
|
Klaas Enno Stephan, K J Friston, Analyzing effective connectivity with fMRI, Wiley Interdisciplinary Reviews: Cognitive Science, Vol. 1 (3), 2010. (Journal Article)
Functional neuroimaging techniques are used widely in cognitive neuroscience to investigate aspects of functional specialization and functional integration in the human brain. Functional integration can be characterized in two ways, functional connectivity and effective connectivity. While functional connectivity describes statistical dependencies between data, effective connectivity rests on a mechanistic model of the causal effects that generated the data. This review addresses the conceptual and methodological basis of established techniques for characterizing effective connectivity using functional magnetic resonance imaging (fMRI) data. In particular, we focus on dynamic causal modeling (DCM) of fMRI data and emphasize the importance of model selection procedures and nonlinear mechanisms for context-dependent changes in connection strengths. |
|
Màrton Takàcs, Analyse des Verhaltens von Microsoft SQL Server unter verschiedenen Workloads, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
The work presented here describes measurements of the transaction throughput of Microsoft SQL Server 2008, with a focus on concurrency control. Two types of self-defined workloads and a TPC benchmark were used for the measurements. The parameters isolation level, number of operations per transaction, size of the database used, and ratio between read and write operations were varied to determine their effect on the throughput, the overhead of multi-user synchronization, and the number of aborts. The number of simulated users was also increased. The results of the measurements are explained on the basis of theoretical principles. As the locking overhead and the data volume grow, the transaction throughput decreases and the multi-user synchronization overhead increases roughly proportionally up to a certain number of users, and both deteriorate sharply beyond this point. |
|
P Mahler, Energie, Hingabe und Enthusiasmus, In: Neue Zürcher Zeitung, 99, p. 97, 30 April 2010. (Newspaper Article)
|
|
Thorsten Hens, Abzocker-Initiative, wozu?, In: Finanz und Wirtschaft, 33, p. 1, 28 April 2010. (Newspaper Article)
|
|
Cosmin Basca, Abraham Bernstein, R H Warren, Canopener: recycling old and new data, In: 3rd Workshop on Mashups, Enterprise Mashups and Lightweight Composition on the Web (MEM 2010), 2010-04-26. (Conference or Workshop Paper published in Proceedings)
The advent of social markup languages and lightweight public data access methods has created an opportunity to share the social, documentary and system information locked in most servers as a mashup. Whereas solutions already exist for creating and managing mashups from network sources, we propose here a mashup framework whose primary information sources are the applications and user files of a server. This enables us to use server legacy data sources that are already maintained as part of basic administration to semantically link user documents and accounts using social web constructs. |
|
Bruno Staffelbach, Ökonomie ist mehr als nur Franken zählen, In: Neue Zürcher Zeitung, 94, p. 75, 24 April 2010. (Newspaper Article)
|
|