Manfred Klenner, Stefanos Petrakis, Angela Fahrni, A Tool for Polarity Classification of Human Affect from Panel Group Texts, In: International Conference on Affective Computing & Intelligent Interaction, Amsterdam, The Netherlands, Sep 2009. (Conference or Workshop Paper)
|
|
Bettina Bauer-Messmer, Lukas Wotruba, Kalin Müller, Sandro Bischof, Rolf Grütter, Thomas Scharrenbach, Rolf Meile, Martin Hägeli, Jürg Schenker, The Data Centre Nature and Landscape (DNL): Service Oriented Architecture, Metadata Standards and Semantic Technologies in an Environmental Information System, In: EnviroInfo 2009: Environmental Informatics and Industrial Environmental Protection: Concepts, Methods and Tools, Shaker Verlag, Aachen, 2009-09-01. (Conference or Workshop Paper published in Proceedings)
|
|
Linard Moll, Anti Money Laundering under real world conditions - Finding relevant patterns, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
This Master's thesis deals with the search for new patterns to enhance the discovery of fraudulent activities within the jurisdiction of a financial institution. To this end, transactional data from a database is analyzed, scored and processed for later use by an internal anti-money laundering specialist. The findings are again stored in a database and processed by TV, the Transaction Visualizer, an existing and commercially used tool. As a result of this thesis, the software module TMatch and the graphical user interface TMatchViz were developed. The interaction of these two tools was tested and evaluated using synthetically created datasets. Furthermore, the approximations made and their impact on the specification of the algorithms are addressed in this report. |
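The thesis does not publish TMatch's actual patterns. As a loose illustration of transaction scoring, the sketch below implements one classic AML pattern, "structuring" (repeated deposits just under a reporting threshold); all names, amounts and thresholds are hypothetical, not taken from the thesis:

```python
from collections import defaultdict

THRESHOLD = 10_000   # hypothetical reporting threshold
MARGIN = 0.10        # "just under" band: within 10% below the threshold

def score_structuring(transactions):
    """Toy AML pattern score: count near-threshold deposits per account.

    `transactions` is a list of (account_id, amount) tuples; the score is
    simply how many deposits fall just below the reporting threshold.
    """
    scores = defaultdict(int)
    for account, amount in transactions:
        if THRESHOLD * (1 - MARGIN) <= amount < THRESHOLD:
            scores[account] += 1
    return dict(scores)

txns = [("A", 9_500), ("A", 9_900), ("A", 9_800), ("B", 2_000), ("B", 9_950)]
flagged = score_structuring(txns)   # → {'A': 3, 'B': 1}
```

A real system would combine many such pattern scores before handing the ranked results to a specialist, as the thesis describes.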
|
Kevin Leopold, Implementation of PSH Reputation Mechanism in LiveShift Application, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
Rising demand for online video streaming poses new challenges for content providers in allocating the necessary bandwidth. P2P networks are an adequate solution for this, capable of distributing the load from central content providers among the actual consumers. LiveShift is a P2P video streaming application developed by the CSG group at the University of Zurich that supports both live streaming and time shifting in a distributed and scalable manner. It relies on peers donating their uplink capacity in order to function properly. Therefore, incentives must be in place to discourage selfish behavior among the peers. In this thesis, the integration of incentives into the LiveShift application is designed, implemented and evaluated. A new prototype of LiveShift is proposed that gives the option to use either no incentive mechanism, a tit-for-tat (TFT) based mechanism or a private and shared history-based (PSH) mechanism. It offers a fair quality of experience (QoE) to its users, punishing selfish behavior with lower perceived quality. This gives participating peers an incentive to act generously and thereby increases the overall performance of the application. |
|
Anthony Lymer, Ein Empfehlungsdienst für kulturelle Präferenzen in adaptiven Benutzerschnittstellen, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
This thesis addresses the refinement of adaptation rules in a web-based to-do management system named MOCCA. MOCCA is an adaptive system that adapts its user interface using the cultural background information of each user. To achieve the goal of this thesis, a recommender system was developed that clusters similar users into groups. In order to create new adaptation rules for similar users, the system calculates recommendations, which are assigned to the groups. The recommender system uses techniques such as collaborative filtering, k-Means and the statistical χ² goodness-of-fit test. The system was designed in a modular fashion and divided into two parts: one part of the recommender system gathers similar users and groups them accordingly; the other part uses the generated groups and calculates recommendations. For each part, two concrete components were created. These components are interchangeable, so that the recommender system can be composed as desired. All possible compositions were evaluated with a set of test users. It could be shown that the developed recommender system generates a more accurate user interface than the initially given adaptation rules. |
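The grouping step can be sketched roughly as follows: a minimal k-Means over user feature vectors in plain Python. The feature dimensions and sample values are invented for illustration and are not MOCCA's actual cultural profile model:

```python
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-Means: cluster user feature vectors into k groups."""
    rnd = random.Random(seed)
    centroids = rnd.sample(points, k)
    assign = [0] * len(points)
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid.
        assign = [min(range(k), key=lambda c: math.dist(p, centroids[c]))
                  for p in points]
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [p for p, a in zip(points, assign) if a == c]
            if members:
                centroids[c] = tuple(sum(x) / len(members)
                                     for x in zip(*members))
    return assign, centroids

# Hypothetical user profiles: (power-distance, individualism) scores.
users = [(0.1, 0.9), (0.2, 0.8), (0.9, 0.1), (0.8, 0.2)]
groups, _ = kmeans(users, k=2)
```

Once users are grouped, per-group recommendations can be derived from the preferences of each cluster's members, which mirrors the two-part split described in the abstract.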
|
Marco Kessler, Analysis of the Dynamics of a GRN-based Evo-Devo System, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
A multitude of evo-devo systems have been designed to investigate the behavior of embodied agents in the form of software simulations. In this thesis, a model based on genetic regulatory networks (GRNs) and inspired by the concepts of developmental biology is proposed. The model was designed to offer the virtual organism unprecedented flexibility with regard to developmental processes and cell features. The development of virtual organisms, embodied in physically aware environments, was investigated with the aid of several tasks that the agents had to perform. An advanced evolutionary algorithm was employed to discover virtual organisms that developed ingenious solutions to tasks such as reaching a predefined location in the virtual world or traveling through it. The underlying internal dynamics of successful individual simulations were studied with the goal of better understanding the emergent behavior. On occasion, the genetic regulatory network of hand-picked simulations was subjected to the silencing of specific genes in order to investigate the effect that the artificial lesioning of selected cellular features or processes had on behavior and performance. The thesis closes with a discussion of complexity matching, i.e., the need for a balance between the increased flexibility of the cell model and that of the environment. |
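The gene-silencing probe mentioned above can be illustrated with a toy GRN update step. This is a generic recurrent sigmoid network, not the thesis's actual model; silenced genes are simply clamped to zero:

```python
import math

def grn_step(state, weights, silenced=frozenset()):
    """One update of a toy genetic regulatory network.

    Each gene's new expression level is a squashed weighted sum of all
    current expression levels; genes in `silenced` are clamped to zero,
    mimicking the 'artificial lesioning' probe.
    """
    new_state = []
    for i, row in enumerate(weights):
        if i in silenced:
            new_state.append(0.0)
            continue
        activation = sum(w * s for w, s in zip(row, state))
        new_state.append(1.0 / (1.0 + math.exp(-activation)))  # sigmoid
    return new_state

expression = grn_step([1.0, 0.0], [[0.5, -0.3], [0.2, 0.4]])
lesioned = grn_step([1.0, 0.0], [[0.5, -0.3], [0.2, 0.4]], silenced={0})
```

Comparing the lesioned trajectory against the intact one is the basic experiment behind studying which cellular features a given gene contributes to.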
|
Dustin Wüest, Implementation of EvoSpaces 2 in Java, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
In this thesis we implement a 3D visualization for software systems. We use a 3D city metaphor and display the source code entities as city objects, which allows users to intuitively understand what they see. The tool can be used for the analysis of complex software systems. Our prototype helps in understanding, maintaining and reverse engineering large systems. One of our main goals is to outperform currently available software visualizations in performance and usability. We use the powerful jMonkeyEngine for rendering the 3D view, and our tool is written in Java. The current architecture was developed with extensibility in mind. Test persons evaluated our tool, and the results show that the testers are pleased with the 3D visualization. The strengths of our tool lie in giving an overview of a software system and in making code smells easy to find. The use of a game engine results in a smooth 3D view. |
|
Stefan Christiani, Creation: A Framework for Artificial Evolution Using Complexity Matching, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
One of the many issues in the domain of artificial evolution revolves around the existence of designer bias. To address this particular issue, the author presents a framework in which the evolutionary component is stripped of most problem-related assumptions so that they cannot interfere. Such a goal, however, implies that for optimal efficiency a match between the unknown problem and any solutions proposed by evolutionary algorithms must be found. Three approaches dealing with this complexity matching issue, all based on a novel genetic regulatory network model, are presented and thoroughly tested; they form the centrepiece of the framework. Additional parameter adaptation approaches further tune other evolutionary parameters, such as the mutation rate or the genome size, through simple heuristics and observations of the fitness values. Built on these techniques, the framework is able to cope with problems ranging from very simple to fairly complex ones. |
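As an illustration of tuning an evolutionary parameter from fitness observations, here is a sketch in the spirit of Rechenberg's one-fifth success rule; the thesis's actual heuristics are not specified in the abstract, so this is an assumed stand-in:

```python
def adapt_mutation_rate(rate, improved, trials, target=0.2,
                        factor=1.5, lo=1e-4, hi=0.5):
    """Heuristic mutation-rate adaptation (one-fifth-rule style).

    `improved` of `trials` offspring beat their parents' fitness: if more
    than the target fraction improve, mutate more boldly; if fewer do,
    search more cautiously. The rate is clamped to [lo, hi].
    """
    ratio = improved / trials
    if ratio > target:
        rate *= factor      # many improving offspring: explore harder
    elif ratio < target:
        rate /= factor      # few improvements: reduce disruption
    return max(lo, min(rate, hi))

rate = 0.1
rate = adapt_mutation_rate(rate, improved=1, trials=10)  # rate shrinks
```

The same observe-and-adjust pattern extends to other parameters such as genome size, which the abstract also mentions.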
|
Philipp Mathys, Towards the physics based modelling of a compliantly engineered humanoid robot and its interactions and a feasibility study, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
The ECCEROBOT project is an effort to build and control an anthropomimetic robot containing a muscle-tendon-like actuator system embedded in a skeleton. This diploma thesis was written in the context of the ECCEROBOT project, and its goal was to simulate the workings of a muscle as accurately as possible. To build the simulation, the first step was to select a suitable simulation tool. A survey of the available physics-based simulation tools was undertaken to select the most suitable open-source tool with respect to accuracy, stability and speed of calculation for simulations of the robot. Finally, JBullet, the Java version of the Bullet physics engine, was selected in combination with OpenGL to simulate an arm of the robot. The simulation of the arm muscle was based on the Hill model. Various combinations of parameters were investigated for the control of the muscle, as well as which actions the muscle could perform most effectively in simulation. The report contains the findings related to the accuracy of the simulated arm and its performance in interaction with other objects.
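A Hill-type muscle force is commonly modeled as the product of activation, a force-length curve and a force-velocity curve. The constants and curve shapes below are one common textbook form, not necessarily the exact variant implemented in the thesis:

```python
import math

def hill_force(activation, length, velocity, f_max=1000.0,
               l_opt=1.0, v_max=10.0):
    """Hill-type active muscle force (simplified textbook form).

    activation : neural activation in [0, 1]
    length     : normalized fiber length (1.0 = optimal)
    velocity   : shortening velocity (positive = shortening)
    """
    # Active force-length relation: Gaussian around the optimal length.
    f_l = math.exp(-((length - l_opt) / 0.45) ** 2)
    # Force-velocity relation (Hill 1938): force drops while shortening.
    if velocity >= 0:
        f_v = max(0.0, (v_max - velocity) / (v_max + 3.0 * velocity))
    else:
        f_v = 1.3   # simplified constant plateau while lengthening
    return f_max * activation * f_l * f_v
```

At optimal length and zero velocity the model yields the full `f_max`, and the force falls off as the fiber shortens faster, which is the behavior such a muscle controller has to work with.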
|
|
Daniel Wechsler, Altruism as a stepping stone to the evolution of cooperation, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Bachelor's Thesis)
In the last decades, great progress was made in explaining the evolution of cooperation. Although important strides were made in explaining why natural selection can favor cooperators over defectors, many questions remain unanswered. This work is about the evolution of cooperation and is based on an agent simulation proposed by Mikhail Burtsev and Peter Turchin to investigate the evolution of cooperative strategies among simple agents. The model comprises simple agents that live on cells arranged in a 2-dimensional grid. The agents can perform certain actions such as moving, eating, dividing and fighting. They are controlled by a simple neural network that connects the inputs, which provide the agent with information about the outside world and its internal state (resource detection, agent detection, internal energy and so on), to the actions. The neural network is subject to mutation whenever an agent divides into offspring. I extended this original model, among other things by introducing a give action that allows agents to transfer energy to each other, and by adding a reputation value that indicates whether an agent is cooperative (giving) or defective (attacking). The results of the simulations of this extended model show that the system can indeed converge to a state where the agents use strategies that frequently make use of the give action. Analysis of the interaction network reveals that the agents organize in clusters of cooperation, where they mutually exchange energy. |
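The give action and the reputation bookkeeping can be sketched minimally as follows; the field names and the +1 reputation increment are hypothetical, not the simulation's actual data structures:

```python
def give(giver, receiver, amount, reputations):
    """Toy version of the extended model's give action.

    Transfers up to `amount` energy from giver to receiver (capped at
    the giver's current energy) and raises the giver's reputation;
    an attack action would symmetrically lower it.
    """
    amount = min(amount, giver["energy"])
    giver["energy"] -= amount
    receiver["energy"] += amount
    reputations[giver["id"]] = reputations.get(giver["id"], 0) + 1
    return amount

a = {"id": "a", "energy": 5}
b = {"id": "b", "energy": 0}
reputations = {}
give(a, b, 3, reputations)
```

Tracking such a reputation value per agent is what makes it possible to ask whether cooperative (giving) agents end up clustered together in the interaction network.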
|
Monika Seps, Jose Gonzalez, Alejandro Hernandez Arieta, Konstantinos Dermitzakis, Rolf Pfeifer, Mastering the Man-Machine Communication: Sensory Feedback for the Perceptual Embodiment of a Neuroprosthesis, September 2009. (Other Publication)
|
|
Natalia Estevez, Ambra Sposito, Valerie Bugmann, Marie-Claude Hepp-Reymond, Roger Gassert, Peter Brugger, Sabina Hotz, Angelo Maravita, Spiros Kollias, Alejandro Hernandez Arieta, Rolf Pfeifer, Can a robot-hand take the place of a missing hand? A project with upper limb amputees, September 2009. (Other Publication)
|
|
Cédric Jeanneret, Finding the right level of abstraction, In: Doctoral Symposium of the 17th IEEE International Requirements Engineering Conference (RE '09), 2009-08-31. (Conference or Workshop Paper published in Proceedings)
Today, modelers must rely on their instinct and experience to decide how much detail of a system is worth being modeled. This ad-hoc modeling may result in models that are either too abstract or too detailed for their intended use. We propose to investigate objective measurement of a model’s abstractness and systematic guidance to attain the right level of abstraction. Such a contribution has the potential to improve the quality of models and reduce the amount of time needed for modeling activities. |
|
Thorsten Hens, Bankensystem ohne Banken, In: Finanz und Wirtschaft, p. 1, 26 August 2009. (Newspaper Article)
|
|
Thomas Zimmermann, Nachiappan Nagappan, Harald Gall, Emanuel Giger, Brendan Murphy, Cross-project defect prediction: a large scale experiment on data vs. domain vs. process, In: 7th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering, ACM, New York, NY, USA, 2009-08-24. (Conference or Workshop Paper published in Proceedings)
Prediction of software defects works well within projects as long as there is a sufficient amount of data available to train any models. However, this is rarely the case for new software projects and for many companies. So far, only a few studies have focused on transferring prediction models from one project to another. In this paper, we study cross-project defect prediction models on a large scale. For 12 real-world applications, we ran 622 cross-project predictions. Our results indicate that cross-project prediction is a serious challenge, i.e., simply using models from projects in the same domain or with the same process does not lead to accurate predictions. To help software engineers choose models wisely, we identified factors that do influence the success of cross-project predictions. We also derived decision trees that can provide early estimates for precision, recall, and accuracy before a prediction is attempted. |
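The quality measures the paper's decision trees estimate can be computed per prediction run from a confusion matrix; this is a generic sketch, not the paper's tooling:

```python
def prediction_quality(actual, predicted):
    """Precision, recall and accuracy for one cross-project run.

    `actual` and `predicted` are parallel lists of booleans
    (True = module is defective).
    """
    pairs = list(zip(actual, predicted))
    tp = sum(1 for a, p in pairs if a and p)
    fp = sum(1 for a, p in pairs if not a and p)
    fn = sum(1 for a, p in pairs if a and not p)
    tn = sum(1 for a, p in pairs if not a and not p)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    accuracy = (tp + tn) / len(pairs)
    return precision, recall, accuracy
```

Running this for every cross-project pairing, with a model trained on one project and defect labels from another, yields the kind of per-run outcomes the study aggregates over its 622 predictions.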
|
Patrick Knab, Martin Pinzger, Harald Gall, Smart views for analyzing problem reports: tool demo, In: 7th joint meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering, ACM, 2009-08-24. (Conference or Workshop Paper)
Issue tracking repositories contain a wealth of information for reasoning about various aspects of software development processes. In this paper, we focus on bug triaging and provide visual means to explore the effort estimation quality and the bug life-cycle of reported problems.
Our approach uses a combination of graphical views to investigate details of individual problem reports while maintaining the context provided by the surrounding data population. This enables the detection and detailed analysis of hidden patterns and facilitates the analysis of problem report outliers. |
|
J Novak, Susanne Schmidt-Rauch, When joy matters: the importance of hedonic stimulation in collocated collaboration with large-displays, In: INTERACT 2009, Springer, Berlin, 2009-08-24. (Conference or Workshop Paper published in Proceedings)
Hedonic aspects are increasingly considered an important factor in user acceptance of information systems, especially for activities with high self-fulfilling value for the users. In this paper we report on the results of an experiment investigating the hedonic qualities of an interactive large-display workspace for collocated collaboration in sales-oriented travel advisory. The results show a higher hedonic stimulation quality of a touch-based large-display travel advisory workspace than that of a traditional workspace with catalogues. Together with the feedback of both customers and travel agents, this suggests the adequacy of using touch-based large displays with visual workspaces for supporting the hedonic stimulation of user experience in collocated collaboration settings. The relation of a high perception of hedonic quality to positive emotional attitudes towards the use of a large-display workspace indicates that, even in utilitarian activities (e.g. reaching sales goals for travel agents), hedonic aspects can play an important role. This calls for reconsidering the traditional divide between hedonic and utilitarian systems in the current literature in favor of a more balanced view of systems that provide both utilitarian and hedonic sources of value to the user. |
|
E Volk, M Mueller, A Jacob, P Racz, M Waldburger, Increasing Capacity Exploitation in Food Supply Chains Using Grid Concepts, In: The 6th International Workshop on Grid Economics and Business Models (Gecon2009), Springer, Delft, The Netherlands, 2009-08-24. (Conference or Workshop Paper published in Proceedings)
Food supply chains today are characterized by fixed trade relations with long-term contracts established between heterogeneous supply chain companies. The production and logistics capacities of these companies are often utilized in an economically inefficient manner. In addition, increased consumer awareness of food safety issues renders supply chain management even more challenging, since integrated tracking and tracing along the whole food supply chain is needed. Facing these issues of supply chain management complexity and completely documented product quality, this paper proposes a full lifecycle solution for dynamic capacity markets based on concepts used in the field of Grid computing [1], such as the management of Virtual Organizations (VO) combined with Service Level Agreements (SLA). The solution enables the cost-efficient utilization of real-world capacities (e.g., production capacities or logistics facilities) through a simple, browser-based portal. Users are able to enter into product-specific negotiations with buyers and suppliers of a food supply chain, and to obtain real-time access to product information including SLA evaluation reports. Thus, business opportunities in wider market access, process innovation, and trustworthy food products are offered to participating supply chain companies. |
|
Ramazan Gençay, The Role of Signal Precision and Transaction Costs in Stock, Option and Volatility Trading, In: Annual Congress of the European Economic Association, 2009. (Conference Presentation)
In this study, we examine the rationale that informed traders use in choosing various financial instruments in order to speculate on the volatility of the underlying asset, here a common stock. Using a continuous-time trading model, we demonstrate that the quality of the private information regarding the volatility parameter together with the relative transaction costs observed in the various segments of the cash and derivatives markets will determine informed agents' trading habitats. We further show that in the presence of imprecise volatility signals, only the "most sophisticated" traders (those with highly precise volatility signals) will engage in pure volatility bets. Traders with less precise signals will choose a naked option strategy, while traders at the low spectrum of the precision scale will invest in the underlying stock. Thus, the low volume of pure volatility trades observed by Lakonishok et al. (2007) does not necessarily imply that only fringe traders have chosen to speculate on volatility. Rather, it may suggest that the majority of informed traders do not have precise volatility signals. |
|
P Mahler, Miliz-Engagement lohnt sich, In: Neue Zürcher Zeitung, 193, p. 63, 22 August 2009. (Newspaper Article)
|
|