David Gilhawley, Value Creation of Bank Mergers and the Shareholders' View on Diversification, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
|
|
Michael Würsch, G Reif, S Demeyer, H C Gall, Fostering synergies - how semantic web technology could influence software repositories, In: 2nd International Workshop on Search-driven Development: Users, Infrastructure, Tools and Evaluation, 2010-05-01. (Conference or Workshop Paper published in Proceedings)
The state-of-the-art in mining software repositories mirrors software artifacts from various sources into monolithic relational databases. This puts a lot of querying power in the hands of the software miners; however, it comes at the cost of enclosing the data and hampering cross-application reuse. In this paper we discuss four problem scenarios to illustrate that Semantic Web technology is able to overcome these limitations. However, it requires that the software engineering research community agrees on two prerequisites: (a) a common vocabulary to talk about software repositories -- an ontology; (b) a strategy for generating unique and stable references to all software artifacts inside such a repository -- a Uniform Resource Identifier (URI). |
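The second prerequisite -- stable, unique references to artifacts -- can be made concrete with a small sketch. The scheme below is purely illustrative (the base URL, path layout, and helper name are assumptions, not anything the paper prescribes): it derives a URI from an artifact's logical identity rather than from volatile details like file paths or revision-local ids, so the reference survives repeated mirror runs.

```python
from urllib.parse import quote

def artifact_uri(base, repository, kind, qualified_name):
    """Mint a stable URI for a software artifact.

    The URI depends only on the artifact's logical identity
    (repository, kind, fully qualified name), so it stays valid
    across mirror runs of the repository.
    """
    # Percent-encode each segment so names containing '#', '(', '['
    # and similar characters still yield a syntactically valid URI.
    parts = (quote(repository, safe=""), quote(kind, safe=""),
             quote(qualified_name, safe=""))
    return base.rstrip("/") + "/" + "/".join(parts)

# Hypothetical example: a method artifact in an ArgoUML mirror.
uri = artifact_uri("http://se.example.org/seon", "argouml", "method",
                   "org.argouml.Main#main(String[])")
```

The same logical artifact always maps to the same URI, which is exactly the property cross-application reuse needs.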
|
M V J Heikkinen, T Casey, Fabio Victora Hecht, Value analysis of centralized and distributed communications and video streaming, Info (Bingley) - The journal of policy, regulation and strategy for telecommunications, information and media, 2010. (Journal Article)
Purpose – When comparing novel centralized and distributed communications and video streaming services, the authors identified a need for a theoretic framework to position a multitude of ICT services and technologies according to their value proposition. Literature does not integrate existing value analysis concepts into a holistic theoretical framework. This paper aims to address this shortcoming by proposing a value analysis framework for ICT services capable of describing the value exchanges between different actors and their role constellations based on technological componentizations.
Design/methodology/approach – To develop and verify the framework, the paper evaluates a representative selection of communications and video streaming services and draws on an extensive literature study of existing value analysis research.
Findings – The paper demonstrates the applicability of the value analysis framework in communications and video streaming case studies, which are technically very different from each other but, at the abstraction level the framework provides, display very similar characteristics in value flows and role constellations.
Research limitations/implications – The value analysis framework could be extended and verified with other case studies and complemented with quantitative modeling and system dynamics.
Originality/value – The authors combine existing literature into a proposal of a holistic value analysis framework and apply it to novel centralized and distributed communications and video streaming services. Both academics and practitioners can use the framework to evaluate the value proposition of ICT services and technologies. |
|
Maik Dierkes, Carsten Erner, Stefan Zeisberger, Investment horizon and the attractiveness of investment strategies: A behavioral approach, Journal of Banking and Finance, Vol. 34 (5), 2010. (Journal Article)
We analyze the attractiveness of investment strategies over a variety of investment horizons from the viewpoint of an investor with preferences described by Cumulative Prospect Theory (CPT), currently the most prominent descriptive theory for decision making under uncertainty. A bootstrap technique is applied using historical return data from 1926 to 2008. To allow for variety in investors’ preferences, we conduct several sensitivity analyses and further provide robustness checks for the results. In addition, we analyze the attractiveness of the investment strategies based on a set of experimentally elicited preference parameters. Our study reveals that strategy attractiveness substantially depends on the investment horizon. While for almost every preference parameter combination a bond strategy is preferred for the short run, stocks outperform for longer horizons. Portfolio insurance turns out to be attractive for almost every investment horizon. Interestingly, we find probability weighting to be a driving factor for the attractiveness of insurance strategies. |
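The evaluation pipeline the abstract describes -- bootstrap terminal returns over a horizon, then score the resulting distribution under CPT -- can be sketched as follows. The parameter values are the well-known Tversky and Kahneman (1992) estimates, one plausible choice among those the paper varies; the function names and the simple resampling scheme are illustrative assumptions, not the authors' implementation.

```python
import random

# Tversky-Kahneman (1992) parameter estimates: curvature, loss
# aversion, probability weighting (one plausible parameterization).
ALPHA, LAMBDA, GAMMA = 0.88, 2.25, 0.65

def value(x):
    """Piecewise power value function with loss aversion."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** ALPHA

def weight(p):
    """Inverse-S probability weighting function."""
    return p ** GAMMA / (p ** GAMMA + (1 - p) ** GAMMA) ** (1 / GAMMA)

def cpt_value(outcomes):
    """CPT value of a set of equally likely gains/losses, using
    rank-dependent (cumulative) decision weights."""
    n = len(outcomes)
    gains = sorted(x for x in outcomes if x >= 0)                   # ascending
    losses = sorted((x for x in outcomes if x < 0), reverse=True)   # toward worst
    total = 0.0
    for i, x in enumerate(gains):
        rank = len(gains) - i          # outcomes at least as good as x
        total += (weight(rank / n) - weight((rank - 1) / n)) * value(x)
    for i, x in enumerate(losses):
        rank = len(losses) - i         # outcomes at least as bad as x
        total += (weight(rank / n) - weight((rank - 1) / n)) * value(x)
    return total

def bootstrap_cpt(monthly_returns, horizon_months, n_samples=2000, seed=1):
    """Bootstrap terminal gains/losses over the horizon and score the
    resulting distribution under CPT."""
    rng = random.Random(seed)
    terminals = []
    for _ in range(n_samples):
        wealth = 1.0
        for _ in range(horizon_months):
            wealth *= 1.0 + rng.choice(monthly_returns)
        terminals.append(wealth - 1.0)  # change relative to initial wealth
    return cpt_value(terminals)
```

Comparing `bootstrap_cpt` across horizons for different return histories (stocks vs. bonds) reproduces the kind of horizon-dependent ranking the abstract reports; loss aversion alone already makes a symmetric 50/50 gamble unattractive under these parameters.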
|
Jean-Charles Rochet, Doh-Shin Jeon, The pricing of academic journals: a two-sided market perspective, American Economic Journal: Microeconomics, Vol. 2 (2), 2010. (Journal Article)
More and more academic journals are adopting an open access policy by which articles are accessible free of charge, while publication costs are recovered through author fees. We study the consequences of this open access policy on the quality standard of an electronic academic journal. If the journal's objective were to maximize social welfare, open access would be optimal. However, we show that if the journal has a different objective (such as maximizing readers' utility, the impact of the journal, or its profit), open access tends to induce it to choose a quality standard below the socially efficient level. |
|
Bruno Frey, Katja Rost, Do rankings reflect research quality?, Journal of Applied Economics, Vol. 13 (1), 2010. (Journal Article)
Publication and citation rankings have become major indicators of the scientific worth of universities and determine to a large extent the career of individual scholars. Such rankings do not effectively measure research quality, which should be the essence of any evaluation. These quantity rankings are not objective; two citation rankings, based on different samples, produce entirely different results. For that reason, an alternative ranking is developed as a quality indicator, based on membership on academic editorial boards of professional journals. It turns out that the ranking of individual scholars based on that measure is far from objective. Furthermore, the results differ markedly, depending on whether research quantity or quality is considered. Thus, career decisions based on rankings are dominated by chance and do not reflect research quality. We suggest that evaluations should rely on multiple criteria. Public management should return to approved methods such as engaging independent experts who in turn provide measurements of research quality for their research communities. |
|
Egon Franck, Private firm, public corporation or member's association – Governance structures in European football, International Journal of Sport Finance, Vol. 5 (2), 2010. (Journal Article)
Based on the analysis of the specific industry environment in which football clubs compete, this paper presents a comparative institutional analysis of three paradigmatic structures of football club governance: classical (privately owned) football firms, modern football corporations (stock corporations with dispersed ownership), and members’ associations with their own legal personality (Verein). The results of the analysis are applied to current developments in German and English football and to recent initiatives of the Football Governing Bodies. |
|
Klaas Enno Stephan, K J Friston, Analyzing effective connectivity with fMRI, Wiley Interdisciplinary Reviews: Cognitive Science, Vol. 1 (3), 2010. (Journal Article)
Functional neuroimaging techniques are used widely in cognitive neuroscience to investigate aspects of functional specialization and functional integration in the human brain. Functional integration can be characterized in two ways: functional connectivity and effective connectivity. While functional connectivity describes statistical dependencies between data, effective connectivity rests on a mechanistic model of the causal effects that generated the data. This review addresses the conceptual and methodological basis of established techniques for characterizing effective connectivity using functional magnetic resonance imaging (fMRI) data. In particular, we focus on dynamic causal modeling (DCM) of fMRI data and emphasize the importance of model selection procedures and nonlinear mechanisms for context-dependent changes in connection strengths. |
|
Màrton Takàcs, Analyse des Verhaltens von Microsoft SQL Server unter verschiedenen Workloads, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
The work presented here describes measurements of the transaction throughput of Microsoft SQL Server 2008, focusing on concurrency control. Two types of self-defined workloads and a TPC benchmark were used for the measurements. The isolation level, the number of operations per transaction, the size of the database, and the ratio between read and write operations were varied to determine their effect on throughput, on the overhead of multi-user synchronization, and on the number of aborted transactions, while the number of simulated users was increased stepwise. The results of the measurements are explained on the basis of theoretical principles: as locking overhead and database size grow, transaction throughput decreases and the multi-user synchronization overhead increases roughly proportionally up to a certain number of users, and disproportionately beyond that point. |
|
P Mahler, Energie, Hingabe und Enthusiasmus, In: Neue Zürcher Zeitung, 99, p. 97, 30 April 2010. (Newspaper Article)
|
|
Thorsten Hens, Abzocker-Initiative, wozu?, In: Finanz und Wirtschaft, 33, p. 1, 28 April 2010. (Newspaper Article)
|
|
Cosmin Basca, Abraham Bernstein, R H Warren, Canopener: recycling old and new data, In: 3rd Workshop on Mashups, Enterprise Mashups and Lightweight Composition on the Web (MEM 2010), 2010-04-26. (Conference or Workshop Paper published in Proceedings)
The advent of social markup languages and lightweight public data access methods has created an opportunity to share the social, documentary and system information locked in most servers as a mashup. Whereas solutions already exist for creating and managing mashups from network sources, we propose here a mashup framework whose primary information sources are the applications and user files of a server. This enables us to use server legacy data sources that are already maintained as part of basic administration to semantically link user documents and accounts using social web constructs. |
|
Bruno Staffelbach, Ökonomie ist mehr als nur Franken zählen, In: Neue Zürcher Zeitung, 94, p. 75, 24 April 2010. (Newspaper Article)
|
|
David Hausheer, Investigating the Economic Feasibility of Bandwidth-on-Demand Services for the European Research Networks, In: 3rd IFIP/IEEE International Workshop on Bandwidth on Demand and Federation Economics (BoD 2010). 2010. (Conference Presentation)
|
|
Cristian Morariu, SCRIPT: A Framework for Scalable Real-time IP Flow Record Analysis, In: 12th IEEE/IFIP Network Operations and Management Symposium (NOMS 2010). 2010. (Conference Presentation)
|
|
K Wang, L Li, D Hausheer, Z Liu, W Li, D Shi, G He, Burkhard Stiller, A trust-incentive-based combinatorial double auction algorithm, In: IEEE/IFIP Network Operations and Management Symposium (NOMS 2010), Institute of Electrical and Electronics Engineers, Osaka, Japan, 2010-04-19. (Conference or Workshop Paper published in Proceedings)
Resource allocation constitutes an important management task for operational Grids and networks, especially under the constraint of commercially offered resources. Therefore, the need for an optimal allocation arises, and this paper proposes a trust-incentive-based combinatorial double auction algorithm for resource allocation in Grids. The key and new contribution is the design of a trust-incentive mechanism, which is integrated into an existing combinatorial double auction algorithm (a) to improve the performance of Grid resource allocation and (b) to ensure that trust values of participating bidders (typically Grid users, termed peers) are considered. In the newly developed trust-incentive-based algorithm, each peer's trust value is adopted to adjust its bids in the process of the combinatorial double auction. After each transaction, peers participating in the transaction rate each other to set up and update the bilateral trust relationship. The simulation results obtained demonstrate that the proposed algorithm can greatly improve the efficiency of resource sharing by providing applicable incentives to trustworthy peers to contribute more resources. Moreover, the algorithm can identify and eliminate malicious peers in the system and thereby enhance the Grid security level. |
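The core idea -- scaling each peer's bid by its trust value and updating trust from bilateral ratings after each transaction -- can be sketched as follows. The linear scaling rule, the exponential-smoothing trust update, and the greedy (non-combinatorial) matching are illustrative simplifications, not the paper's exact mechanism.

```python
def effective_bid(bid_price, trust):
    """Scale a buyer's nominal bid by its trust value in [0, 1] so
    that trustworthy peers gain priority in the auction
    (illustrative rule, not the paper's exact formula)."""
    return bid_price * trust

def update_trust(old_trust, rating, learning_rate=0.2):
    """After a transaction, move a peer's trust toward the partner's
    rating in [0, 1] via simple exponential smoothing."""
    return (1 - learning_rate) * old_trust + learning_rate * rating

def match(buy_bids, sell_asks):
    """Greedy double-auction matching: sort buyers by effective bid
    (descending) and sellers by ask (ascending); trade as long as the
    effective bid covers the ask."""
    buyers = sorted(buy_bids,
                    key=lambda b: -effective_bid(b["price"], b["trust"]))
    sellers = sorted(sell_asks, key=lambda s: s["price"])
    trades = []
    for buyer, seller in zip(buyers, sellers):
        if effective_bid(buyer["price"], buyer["trust"]) >= seller["price"]:
            # clearing price: midpoint of nominal bid and ask
            trades.append((buyer["id"], seller["id"],
                           (buyer["price"] + seller["price"]) / 2))
        else:
            break
    return trades
```

In this sketch a high nominal bid from a low-trust peer can lose to a lower bid from a trusted peer, which is precisely the incentive the trust mechanism is meant to create.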
|
Peter Racz, Daniel Dönni, Burkhard Stiller, An architecture and implementation for IP network and service quality measurements, In: 2010 IEEE Network Operations and Management Symposium - NOMS 2010, Institute of Electrical and Electronics Engineers, Osaka, Japan, 2010-04-19. (Conference or Workshop Paper published in Proceedings)
Network and service performance measurements are essential in IP (Internet Protocol) networks, e.g., for network management, network monitoring, and service quality assurance. In order to measure the service quality received by individual users, a service-specific measurement system is required. Therefore, this paper develops and implements the Network and Service Quality Measurement (NSQM) architecture, which integrates network- and service-specific measurements and can configure all measurement components on demand and according to service signaling in order to set up network-wide measurements in an automated manner. NSQM supports the correlation of measurement data from multiple locations, enabling the determination of link-specific and end-to-end performance characteristics. NSQM allows for measurements at different aggregation levels, including service-class and per-flow measurements. It integrates both active and passive measurements and supports a fine-grained selection of the traffic to be measured, which reduces the amount of measurement data to be collected and processed. |
|
Cristian Morariu, Peter Racz, Burkhard Stiller, SCRIPT: A framework for scalable real-time IP flow record analysis, In: 2010 IEEE Network Operations and Management Symposium - NOMS 2010, Institute of Electrical and Electronics Engineers, Osaka, Japan, 2010-04-19. (Conference or Workshop Paper published in Proceedings)
Analysis of IP traffic is highly important, since it determines the starting point of many network management operations, such as intrusion detection, network planning, network monitoring, or accounting and billing. One of the most utilized metering data formats in analysis applications is the IP (Internet Protocol) flow record. With the increase of IP traffic, such traffic analysis applications need to cope with a constantly increasing number of flow records. Typically, centralized approaches to IP traffic analysis have scalability problems, which are addressed by replacing existing hardware with more powerful CPUs and faster memory. In contrast, this paper develops and implements SCRIPT (Scalable Real-time IP Flow Record Analysis), a scalable analysis framework that can be used to distribute flow records to multiple nodes performing traffic analysis in order to balance the overall workload among those nodes. Due to its generic design, the framework can be extended to distribute other metering data as well, such as packet headers, payloads, or accounting records. |
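One common way to realize such a distribution layer -- assumed here purely for illustration, since the abstract does not fix SCRIPT's actual policy -- is to hash each record's flow key onto one of the analysis nodes, so that all records of one flow reach the same node and per-flow state stays consistent:

```python
import hashlib

def flow_key(record):
    """5-tuple identifying a flow; records of the same flow must land
    on the same analyzer so per-flow state stays consistent."""
    return (record["src_ip"], record["dst_ip"],
            record["src_port"], record["dst_port"], record["proto"])

def pick_analyzer(record, analyzers):
    """Hash the flow key onto one of the analyzer nodes
    (illustrative distribution rule, not SCRIPT's actual policy)."""
    digest = hashlib.sha1(repr(flow_key(record)).encode()).hexdigest()
    return analyzers[int(digest, 16) % len(analyzers)]
```

Because the mapping is deterministic and stateless, any number of distributor instances can apply it in parallel, which is what makes this style of distribution scale with the flow-record rate.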
|
Dan Moser, Enrique Cano, Peter Racz, Secure large file transfer over multiple network paths, In: 2010 IEEE Network Operations and Management Symposium - NOMS 2010, Institute of Electrical and Electronics Engineers, Osaka, Japan, 2010-04-19. (Conference or Workshop Paper published in Proceedings)
The transfer of very large files often faces performance degradation due to bottlenecks or congestion in a network. Large file transfer is typical in Grid networks [11], where multiple nodes cooperate and run a common application. To improve the performance of large file transfer and to reduce the transfer time, this paper proposes a new file transfer application, called Secure Large File Transfer (SLFT), that supports file transfer over multiple independent network paths. For this purpose, SLFT uses Grid nodes as relays in order to route traffic to the destination. The described problem may also be solved on a lower layer of the OSI model; however, given the high heterogeneity of Grid environments, implementing the required mechanisms on layer 2 or 3 might not be feasible due to the inherent differences between the involved ISP domains. Taking the approach to the application layer makes it possible to create a generic mechanism able to operate over the different possible underlying communication infrastructures [12]. The SLFT application has been implemented as part of the GINTONIC toolbox, a set of Grid-specific network enhancements developed under the EC-GIN project [5]. Since in Grid networks nodes are geographically distributed, several administrative domains may be involved, and communication takes place in a potentially hostile environment, the transfer of files involves several security threats, which are addressed by the security architecture design of the SLFT application. This paper presents the design and implementation details of the SLFT application, focusing on the necessary security features. The performance of the SLFT application has been assessed based on practical experiences and an evaluation in a test-bed. Evaluation results show that SLFT can improve performance thanks to parallel transfer over multiple network paths. |
|
A Hanemann, D Hausheer, P Reichl, Burkhard Stiller, P van Daalen, Investigating the economic feasibility of bandwidth-on-demand services for the European research networks, In: 2010 IEEE Network Operations and Management Symposium - NOMS 2010, Institute of Electrical and Electronics Engineers, Osaka, Japan, 2010-04-19. (Conference or Workshop Paper published in Proceedings)
In recent years, several technical solutions have been developed that allow Bandwidth-on-Demand (BoD) services to be offered by service providers. In the context of the European academic networks, the technical feasibility of such services has been demonstrated on several occasions. The move from a research activity to a production service presents further requirements in addition to overcoming technical challenges. Most notably, apart from the organization of day-to-day operations, financial issues have to be solved. These relate to sharing the costs of a service that is delivered jointly by independent organizations. The main cost factors are operations staff as well as network resources that are purchased for on-demand use. Moreover, the financial issues also include pricing models for the use of the service as well as the distribution of the revenues among the participating organizations. The present paper considers this fundamental BoD scenario together with key challenges in detail, before discussing the general approach for specifying a cost/pricing model. |
|