V Siris, I Ganchev, M O'Droma, Burkhard Stiller, Services, Optimization, and Economic Aspects, In: Traffic and QoS Management in Wireless Multimedia Networks, Springer, New York, USA, p. 267 - 303, 2009-05-01. (Book Chapter)
Future broadband wireless services will involve a range of different technologies, all with varying characteristics. These differences can influence significantly how a service is defined and deployed as well as how it is commercially offered to customers. Key technologies and services that will influence and shape the future broadband telecommunication market include WiFi services based on IEEE 802.11 standards, enhanced Universal Mobile Telecommunication System (UMTS) services, mobile grid services, incoming call handling services, and finally alternative approaches to broadband access such as high aerial platforms. Each of these technologies and services has unique requirements in terms of deployment and management. |
|
Towards the Future Internet - A European Research Perspective, Edited by: Georgios Tselentis, John Domingue, Alex Galis, Anastasius Gavras, David Hausheer, Srdjan Krco, Volkmar Lotz, Theodore Zahariadis, I O S Press, Amsterdam, 2009-05-01. (Edited Scientific Work)
The Internet is a remarkable catalyst for creativity, collaboration and innovation, providing us today with amazing possibilities that just two decades ago would have been impossible to imagine. Our challenge today is to prepare a trip into the future: what will the Internet be in ten or twenty years from now, and what more amazing things will it offer to people? In order to see what the future will bring, we first need to consider some important challenges that the Internet faces today. European scientists have been at the forefront of Internet research since the invention of the web. But the challenges are huge and complex and cannot be dealt with in isolation. The European Future Internet Assembly is the vehicle for a fruitful scientific dialogue, bringing together the different scientific disciplines that contribute to the development of the Future Internet. So far, scientists from more than 90 research projects have been funded with around 300 million euros under the 7th Framework Programme, and another 400 million euros will be made available in the near future. These amounts, coupled with private investments, bring the total investment to more than a billion euros, showing Europe’s commitment to address the challenges of the future Internet. This book is a peer-reviewed collection of scientific papers addressing some of the challenges ahead that will shape the Internet of the Future. The selected papers are representative of the research carried out by EU-funded projects in the field. European scientists are working hard to make the journey to the Future Internet as exciting and as fruitful as the trip that brought us the amazing achievements of today. We invite you to read their visions and join them in their effort so that Europe can fully benefit from the exciting opportunities in front of us. |
|
S Eilemann, Maxim Makhinya, Renato Pajarola, Equalizer: A scalable parallel rendering framework, IEEE Transactions on Visualization and Computer Graphics, Vol. 15 (3), 2009. (Journal Article)
Continuing improvements in CPU and GPU performance, as well as increasing multi-core processor and cluster-based parallelism, demand flexible and scalable parallel rendering solutions that can exploit multipipe hardware-accelerated graphics. In fact, to achieve interactive visualization, scalable rendering systems are essential to cope with the rapid growth of data sets. However, parallel rendering systems are non-trivial to develop, and often only application-specific implementations have been proposed. The task of developing a scalable parallel rendering framework is even more difficult if it should be generic enough to support various types of data and visualization applications and, at the same time, work efficiently on a cluster with distributed graphics cards. In this paper we introduce a novel system called Equalizer, a toolkit for scalable parallel rendering based on OpenGL, which provides an application programming interface (API) to develop scalable graphics applications for a wide range of systems, ranging from large distributed visualization clusters and multi-processor multipipe graphics systems to single-processor single-pipe desktop machines. We describe the system architecture and the basic API, discuss its advantages over previous approaches, and present example configurations, usage scenarios and scalability results. |
|
P Diaz-Gutierrez, Jonas Bösch, Renato Pajarola, M Gopi, Streaming surface sampling using Gaussian ε-nets, The Visual Computer, Vol. 25 (5-7), 2009. (Journal Article)
We propose a robust, feature preserving and user-steerable mesh sampling algorithm, based on the one-to-many mapping of a regular sampling of the Gaussian sphere onto a given manifold surface. Most of the operations are local, and no global information is maintained. For this reason, our algorithm is amenable to a parallel or streaming implementation and is most suitable in situations when it is not possible to hold all the input data in memory at the same time. Using ε-nets, we analyze the sampling method and propose solutions to avoid shortcomings inherent to all localized sampling methods. Further, as a byproduct of our sampling algorithm, a shape approximation is produced. Finally, we demonstrate a streaming implementation that handles large meshes with a small memory footprint. |
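The core idea of the one-to-many mapping onto the Gaussian sphere can be illustrated with a minimal sketch: quantize each surface normal into a regular cell of the sphere and keep one representative sample per occupied cell. All function names, the cell resolution, and the first-hit representative choice below are illustrative assumptions, not the paper's algorithm:

```python
import math

def normal_bin(nx, ny, nz, n_theta=8, n_phi=16):
    """Quantize a unit normal into a cell of a regular
    longitude/latitude sampling of the Gaussian sphere."""
    theta = math.acos(max(-1.0, min(1.0, nz)))   # polar angle in [0, pi]
    phi = math.atan2(ny, nx) % (2 * math.pi)     # azimuth in [0, 2*pi)
    ti = min(int(theta / math.pi * n_theta), n_theta - 1)
    pi_ = min(int(phi / (2 * math.pi) * n_phi), n_phi - 1)
    return ti, pi_

def sample_by_normals(points_with_normals):
    """Keep one representative point per Gaussian-sphere cell.
    The operation is local and single-pass, which is what makes
    this style of sampling amenable to streaming."""
    reps = {}
    for p, n in points_with_normals:
        reps.setdefault(normal_bin(*n), p)  # first hit becomes the representative
    return list(reps.values())
```

Because each point is touched once and only the per-cell representatives are retained, the memory footprint is bounded by the sphere resolution rather than the input size.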
|
J Ekanayake, Jonas Tappolet, H C Gall, Abraham Bernstein, Tracking concept drift of software projects using defect prediction quality, In: 6th IEEE Working Conference on Mining Software Repositories, 2009-05. (Conference or Workshop Paper published in Proceedings)
Defect prediction is an important task in the mining of software repositories, but the quality of predictions varies strongly within and across software projects. In this paper we investigate the reasons for these fluctuations, which stem from the changing nature of the bug (or defect) fixing process. To do so, we adopt the notion of concept drift, which denotes that a defect prediction model has become unsuitable because the set of influencing features has changed, usually due to a change in the underlying bug generation process (i.e., the concept). We explore four open source projects (Eclipse, OpenOffice, Netbeans and Mozilla) and construct file-level and project-level features for each of them from their respective CVS and Bugzilla repositories. We then use this data to build defect prediction models and visualize the prediction quality along the time axis. These visualizations allow us to identify concept drifts and, as a consequence, phases of stability and instability expressed in the level of defect prediction quality. Further, we identify the project features that influence the defect prediction quality, using both a tree-induction algorithm and a linear regression model. Our experiments uncover that software systems are subject to considerable concept drifts in their evolution history. Specifically, we observe that the change in the number of authors editing a file and the number of defects fixed by them contribute to a project’s concept drift and therefore influence the defect prediction quality. Our findings suggest that project managers using defect prediction models for decision making should be aware of the current phase of stability or instability due to a potential concept drift. |
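The idea of visualizing prediction quality along the time axis can be sketched on synthetic data: score a fixed model per time window and watch for a sharp drop. The stream, the toy "authors" feature, and the threshold model below are illustrative stand-ins, not the paper's features or learners:

```python
def windowed_quality(samples, predict, window=70):
    """Evaluate prediction quality per time window; a sharp drop in
    the score signals a possible concept drift."""
    scores = []
    for start in range(0, len(samples) - window + 1, window):
        win = samples[start:start + window]
        correct = sum(predict(x) == y for x, y in win)
        scores.append(correct / window)
    return scores

# Synthetic stream whose concept flips halfway: "more than three
# authors" stops implying a defect, so a model fitted to the first
# regime degrades sharply after the flip.
stream = ([(a, a > 3) for a in range(1, 8)] * 20 +
          [(a, a <= 3) for a in range(1, 8)] * 20)
model = lambda authors: authors > 3        # fitted to the first regime
quality = windowed_quality(stream, model)  # [1.0, 1.0, 0.0, 0.0]
```

The step from perfect to useless accuracy between windows marks the drift point; in practice the drop is gradual, but the windowed view makes phases of stability and instability visible in the same way.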
|
S Janssen, C Pfeifer, Betriebsinterne Arbeitsmärkte, Hierarchie und Neueinstellungen: Eine empirische Untersuchung mit Personaldaten, Zeitschrift für Betriebswirtschaft, Vol. 79 (5), 2009. (Journal Article)
|
|
D Brunner, Die Wahrnehmung der Lohndisparität im Unternehmen und deren Wirkung auf die Kündigungsabsicht, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Dissertation)
|
|
H E M den Ouden, K J Friston, N D Daw, A R McIntosh, Klaas Enno Stephan, A dual role for prediction error in associative learning, Cerebral Cortex, Vol. 19 (5), 2009. (Journal Article)
Confronted with a rich sensory environment, the brain must learn statistical regularities across sensory domains to construct causal models of the world. Here, we used functional magnetic resonance imaging and dynamic causal modelling (DCM) to furnish neurophysiological evidence that statistical associations are learnt, even when task-irrelevant. Subjects performed an audio-visual target detection task while being exposed to distractor stimuli. Unknown to them, auditory distractors predicted the presence or absence of subsequent visual distractors. We modelled incidental learning of these associations using a Rescorla-Wagner (RW) model. Activity in primary visual cortex and putamen reflected learning-dependent surprise: these areas responded progressively more to unpredicted, and progressively less to predicted, visual stimuli. Critically, this prediction-error response was observed even when the absence of a visual stimulus was surprising. We investigated the underlying mechanism by embedding the RW model into a DCM to show that auditory-to-visual connectivity changed significantly over time as a function of prediction error. Thus, consistent with predictive coding models of perception, associative learning is mediated by prediction-error dependent changes in connectivity. These results posit a dual role for prediction-error in encoding surprise and driving associative plasticity. |
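The Rescorla-Wagner update underlying this kind of analysis is a simple delta rule, V ← V + α(λ − V), where λ is the observed outcome and (λ − V) is the prediction error. A generic sketch follows; the learning rate and variable names are illustrative, not the values fitted in the study:

```python
def rescorla_wagner(outcomes, alpha=0.2, v0=0.0):
    """Trial-by-trial RW update. `outcomes` is a sequence of lambda
    values (1 = visual stimulus present, 0 = absent). Returns the
    associative strength V and the prediction error after each trial."""
    v, values, errors = v0, [], []
    for lam in outcomes:
        pe = lam - v          # prediction error, defined even for omissions
        v += alpha * pe
        values.append(v)
        errors.append(pe)
    return values, errors
```

Note that an unexpected omission (λ = 0 after V has grown) yields a negative prediction error, which is exactly the "surprising absence" signal the abstract describes.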
|
Björn Bartling, Ernst Fehr, Michel Maréchal, Daniel Schunk, Egalitarianism and competitiveness, American Economic Review, Vol. 99 (2), 2009. (Journal Article)
The article discusses and analyzes data from several economic experiments in a household survey with mothers of preschool children. The researchers measured competitiveness by giving the subjects the choice between competing in a tournament or receiving a piece rate for a real effort task. The subjects also participated in lottery choices, which enabled the researchers to assess their risk preferences. The relationship between social preferences and competitiveness in the sample of mothers of preschool children was analyzed. The hypothesis that egalitarian subjects aren't as likely to self-select into competitive environments, which can produce winners and losers, was tested. A negative relationship between egalitarian choices and self-selection into competition was found. |
|
Fikret Crnisanin, Konzeptualisierung, Implementierung und Evaluation einer Tagging-Metapher zur Webseiten-Navigation, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
Tag clouds are visual representations of metadata composed of a set of words or “tags”. For retrieval purposes, users assign tags to objects. Because tag clouds let users glance over an information repository at once, they are well suited for navigating it. To give evidence for this assumption, this study conducts an initial user survey, which gives rise to a new concept for tag cloud navigation. A prototype is built following this conceptual work. The result is a novel tag cloud that shows tags as word lists, separated by navigation categories. An evaluation conducted with 16 users shows that information is retrieved significantly faster and with fewer clicks. Application areas where quick information retrieval is crucial could benefit from these results. |
|
Stefan Huber, Dynamische Rollen- und Rechteverwaltung für die PM-Plattform, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Bachelor's Thesis)
In a previous diploma thesis, a PM-platform was developed to support the project mioBook of the research area Man | Informatics | Organization (MIO) at the Department of Informatics at the University of Zurich. The PM-platform is based on the extreme programming paradigm. This bachelor thesis extends the existing platform and reengineers the dynamic administration of roles and user privileges according to the principles of Plone and Zope. The main focus was the collaborative writing module, which had severe problems with the administration of roles and user privileges. In addition, the PM-platform was ported to Mac OS X and provided to the Department of Informatics at the University of Zurich as a test environment. |
|
Stefan Amstein, Evaluation und Evolution von Pattern-Matching-Algorithmen zur Betrugserkennung, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
Fraud detection often involves the analysis of large data sets originating from private companies or governmental agencies by means of artificial intelligence (such as data mining), but there are also pattern-matching approaches. ChainFinder, an algorithm for graph-based pattern matching, is capable of detecting transaction chains within financial data that could indicate fraudulent behavior. In this work, relevant measurements of correctness and performance are acquired in order to evaluate and evolve the given implementation of the ChainFinder. A series of tests on both synthetic and more realistic data sets is conducted and the results are discussed. Along the way, a number of derivative ChainFinder implementations emerged and are compared to each other. Throughout this process, an evaluation framework application was developed to assist the evaluation of similar algorithms by providing certain automatisms.
|
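The general idea of transaction-chain detection can be illustrated with a toy graph search: follow hops in which the receiver of one transaction sends funds onward within a bounded time gap. The actual ChainFinder algorithm and data model are not described here; `find_chains`, its parameters, and the `(time, sender, receiver)` tuple layout are hypothetical:

```python
from collections import defaultdict

def find_chains(transactions, max_gap=3, min_len=3):
    """Depth-first search for maximal transaction chains: each hop
    forwards funds from the previous hop's receiver within `max_gap`
    time units. A crude stand-in for graph-based pattern matching."""
    out = defaultdict(list)
    for t in transactions:                 # t = (time, sender, receiver)
        out[t[1]].append(t)
    chains = []

    def extend(chain):
        time, _, receiver = chain[-1]
        grew = False
        for nxt in out[receiver]:
            if time < nxt[0] <= time + max_gap:
                extend(chain + [nxt])
                grew = True
        if not grew and len(chain) >= min_len:
            chains.append(chain)           # chain cannot grow further

    for t in transactions:
        extend([t])
    return chains
```

On real financial data the search space explodes quickly, which is why measuring and improving the performance of such an algorithm, as the thesis does, matters as much as its correctness.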
|
Monir Mahdavi, Implementierung eines automatischen Testcenters mit Canoo Webtest, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
Web applications developed according to the XP methodology demand more comprehensive and increasingly complex testing. This complexity raises test costs and makes testing error prone. Users of web applications demand high quality, such as zero defects and high availability. Integrating a test automation mechanism for the software under development helps reduce costly test cycles and rework, while an integrated monitoring system helps ensure the continuous availability of the web application. This thesis concentrates on the development of a test center that uses Canoo WebTest to automate tests of the graphical user interface of the 360° Feedback web application. It describes the test strategies and techniques applied to reach the required test exit criteria, and explores the methodologies used in the test center to synchronize the application and the test software. A monitoring system built on the Nagios engine and its WebInject plugin provides a complete solution for the high availability of the production instance of the 360° Feedback application.
|
|
David Autor, David Dorn, This job is “Getting old”: Measuring changes in job opportunities using occupational age structure, American Economic Review, Vol. 99 (2), 2009. (Journal Article)
|
|
Andrej Taliun, Michael Hanspeter Böhlen, Arturas Mazeika, CORE: Nonparametric Clustering of Large Numeric Databases, In: SDM 2009: Proceedings of the SIAM International Conference on Data Mining, SIAM (Society for Industrial and Applied Mathematics), 2009-04-30. (Conference or Workshop Paper published in Proceedings)
Current clustering techniques are able to identify arbitrarily shaped clusters in the presence of noise, but depend on carefully chosen model parameters. The choice of model parameters is difficult: it depends on the data and the clustering technique at hand, and finding good model parameters often requires time consuming human interaction. In this paper we propose CORE, a new nonparametric clustering technique that explicitly computes the local maxima of the density and represents them with cores. CORE proposes an adaptive grid and gradients to define and compute the cores of clusters. The incrementally constructed adaptive grid and the gradients make the identification of cores robust, scalable, and independent of small density fluctuations. Our experimental studies show that CORE without any carefully chosen model parameters produces better quality clustering than related techniques and is efficient for large datasets. |
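The density-core idea can be conveyed with a deliberately simplified one-dimensional sketch: bin points on a grid, then hill-climb each occupied cell along the density gradient to a local maximum, so that cells reaching the same maximum form one cluster. This omits CORE's adaptive grid and is for intuition only; all names and parameters below are illustrative:

```python
def grid_cluster(points, cell=1.0):
    """Toy 1-D density-peak clustering: histogram the points, then
    follow ascending neighbor counts to a local density maximum
    (a 'core'); the reached core serves as the cluster label."""
    counts = {}
    for x in points:
        c = int(x // cell)
        counts[c] = counts.get(c, 0) + 1

    def climb(c):
        while True:
            best = max((counts.get(c - 1, 0), c - 1),
                       (counts[c], c),
                       (counts.get(c + 1, 0), c + 1))
            if best[1] == c:
                return c                  # local maximum reached
            c = best[1]

    return {x: climb(int(x // cell)) for x in points}
```

Even this toy version shows the appeal of the approach: the number of clusters falls out of the density maxima rather than being a model parameter, which is the property CORE pursues at scale.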
|
Cristian Morariu, DiCAP - An Architecture for Distributed Packet Capturing, In: COST Econ@Tel WG4 Meeting: Risk and Security Management. 2009. (Conference Presentation)
|
|
Burkhard Stiller, Network Management: Role of Risk and Security Management, In: COST Econ@Tel WG4 Meeting: Risk and Security Management. 2009. (Conference Presentation)
|
|
Fabio Victora Hecht, Improving the Performance of P2P Streaming through an Overlay-operator Interface like ALTO/SIS, In: 2nd EMANICS Workshop on Peer-to-Peer Management. 2009. (Conference Presentation)
|
|
Andrei Aurel Vancea, Answering Queries Using Cooperative Semantic Caching, In: 2nd EMANICS Workshop on Peer-to-Peer Management. 2009. (Conference Presentation)
|
|
P Mahler, Feilen am Image als Arbeitgeber, In: Neue Zürcher Zeitung, 95, p. 73, 25 April 2009. (Newspaper Article)
|
|