Amelie Brune, An empirical study on the impact of microfinance institutions on development, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Bachelor's Thesis)
This paper examines the impact of microfinance institutions (MFIs) on development in an empirical setting, thereby aiming to fill a gap in econometric assessments of microfinance institutions. Using data on MFIs operating in selected African and Asian countries and choosing average savings and loan balances per client as proxies for development, there is empirical evidence for a significant positive impact of microfinance institutions on development. Microcredit has been the most robust mechanism for enhancing development in recent years. While an MFI's size is mostly irrelevant, its experience was found to be especially beneficial for the amount of credit granted to the poor. Savings is found to be the best estimator for development in recent years, yet a structural break between 2003 and 2006 is possible. While African development generally lags behind Asia's, there is no statistical evidence for differences in the marginal impact of microfinance institutions depending on geographical position, which allows the conclusion that the positive impact of microfinance institutions on development in low-income countries is independent of the environment.
Florian Eugster, Does Value Reporting Pay Off?, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
Marc Chesney, Haben die Finanzmärkte den Kapitalismus verraten?, In: Der Schweizer Treuhänder, p. 506 - 510, 1 August 2009. (Newspaper Article)
D Hausheer, Burkhard Stiller, Scalable and Economic Management of the Future Internet, e & i Elektrotechnik und Informationstechnik, Vol. 126 (7-8), 2009. (Journal Article)
The Internet has evolved from an infrastructure supporting mainly e-mail and web applications to a generic platform for a broad range of services, including Internet telephony (VoIP), Internet television (IPTV), Peer-to-Peer (P2P) file sharing, and many more. The proliferation of wireless and optical fiber technology has led to an increase in both network coverage and capacity, which is likely to continue in the near future. As a consequence, the Internet is facing tremendous growth in the number of users and services, and in the amount of traffic they generate. However, the Internet architecture is still based on its original design, which faces a number of shortcomings, including but not limited to a lack of scalability, mobility support, and security. In order to address these problems, possible solutions proposed in the scope of global and EU initiatives on the "Future Internet" range from evolutionary to revolutionary approaches. However, the future Internet will only be successful if the economic aspects and requirements of users, service providers, and network operators are taken into account. To this end, appropriate incentives are needed for an economically efficient supply and use of network resources for different Internet-based services. Furthermore, the relation between technology and these economic aspects, termed techno-economics, is of critical importance for a successful new Internet.
This paper provides an overview of these different aspects, describes a basic model for the future Internet, and discusses the challenges of scalable and economic management of the future Internet. In addition, it presents and discusses selected areas of importance, including economic management of future Internet traffic and services as well as traffic analysis in the future Internet.
C Bird, A Bachmann, E Aune, J Duffy, Abraham Bernstein, V Filkov, P Devanbu, Fair and balanced? Bias in bug-fix datasets, In: ESEC/FSE '09: Proceedings of the 7th joint meeting of the European software engineering conference and the ACM SIGSOFT symposium on The foundations of software engineering, 2009-08. (Conference or Workshop Paper published in Proceedings)
Software engineering researchers have long been interested in where and why bugs occur in code, and in predicting where they might turn up next. Historical bug-occurrence data has been key to this research. Bug tracking systems, and code version histories, record when, how and by whom bugs were fixed; from these sources, datasets that relate file changes to bug fixes can be extracted. These historical datasets can be used to test hypotheses concerning processes of bug introduction, and also to build statistical bug prediction models. Unfortunately, processes and humans are imperfect, and only a fraction of bug fixes are actually labelled in source code version histories, and thus become available for study in the extracted datasets. The question naturally arises: are the bug fixes recorded in these historical datasets a fair representation of the full population of bug fixes? In this paper, we investigate historical data from several software projects, and find strong evidence of systematic bias. We then investigate the potential effects of "unfair, imbalanced" datasets on the performance of prediction techniques. We draw the lesson that bias is a critical problem that threatens both the effectiveness of processes that rely on biased datasets to build prediction models and the generalizability of hypotheses tested on biased data.
Matej Hoffmann, Rolf Pfeifer, Let animats live!, Adaptive Behavior, Vol. 17 (4), 2009. (Journal Article)
Barbara Webb raises a crucial methodological point: in order to learn something about biology, it is more effective to build models of existing animals than to build ad hoc artificial creatures (animats). The critical problem with animats is how to validate them, since a direct comparison with animals is not straightforward. In our commentary, we raise three main points: 1. While we agree that validation of an animat is indispensable, we propose alternative validation criteria to direct comparison with an animal. 2. We advocate the animats' right to existence by listing advantages of the animat approach. 3. We speculate that methodological purity versus effectiveness of practical research can sometimes be seen as a trade-off.
D Frohberg, C Göth, Gerhard Schwabe, Mobile Learning Projects - a critical analysis of the state of the art, Journal of Computer Assisted Learning, Vol. 25 (4), 2009. (Journal Article)
This paper provides a critical analysis of Mobile Learning projects published before the end of 2007. The review uses a Mobile Learning framework to evaluate and categorize 102 Mobile Learning projects, and to briefly introduce exemplary projects for each category. All projects were analysed with the criteria: context, tools, control, communication, subject and objective. Although a significant number of projects have ventured to incorporate the physical context into the learning experience, few projects include a socializing context. Tool support ranges from pure content delivery to content construction by the learners. Although few projects explicitly discuss the Mobile Learning control issues, one can find all approaches from pure teacher control to learner control. Despite the fact that mobile phones initially started as communication devices, communication and collaboration play a surprisingly small role in Mobile Learning projects. Most Mobile Learning projects support novices, although one might argue that the largest potential lies in supporting advanced learners. The results show the design space and reveal gaps in Mobile Learning research.
A Bachmann, Abraham Bernstein, Software process data quality and characteristics - a historical view on open and closed source projects, In: IWPSE-Evol'09: Proceedings of the joint international and annual ERCIM workshops on Principles of software evolution (IWPSE) and software evolution (Evol) workshops, 2009-08. (Conference or Workshop Paper published in Proceedings)
Software process data gathered from bug tracking databases and version control system log files are a very valuable source for analyzing the evolution and history of a project or predicting its future. These data are used, for instance, to predict defects, to gather insight into a project's life-cycle, and for additional tasks. In this paper we survey five open source projects and one closed source project in order to provide a deeper insight into the quality and characteristics of these often-used process data. Specifically, we first define quality and characteristics measures, which allow us to compare the quality and characteristics of the data gathered for different projects. We then compute the measures and discuss the issues arising from these observations. We show that there are vast differences between the projects, particularly with respect to the quality of the link rate between bugs and commits.
R M Füchslin, T Maeke, J S McCaskill, Spatially resolved simulations of membrane reactions and dynamics: Multipolar reaction DPD, European Physical Journal E - Soft Matter and Biological Physics, Vol. 29 (4), 2009. (Journal Article)
Biophysical chemistry of mesoscale systems and quantitative modeling in systems biology now require a simulation methodology unifying chemical reaction kinetics with essential collective physics. This will enable the study of the collective dynamics of complex chemical and structural systems in a spatially resolved manner with a combinatorially complex variety of different system constituents. In order to allow a direct link-up with experimental data (e.g. high-throughput fluorescence images), the simulations must be constructed locally, i.e. mesoscale phenomena have to emerge from local composition and interactions that can be extracted from experimental data. Under suitable conditions, the simulation of such local interactions must lead to processes such as vesicle budding, transport of membrane-bounded compartments and protein sorting, all of which result from a sophisticated interplay between chemical and mechanical processes and require the link-up of different length scales. In this work, we show that introducing multipolar interactions between particles in dissipative particle dynamics (DPD) leads to extended membrane structures emerging in a self-organized manner and exhibiting the necessary mechanical stability for transport, correct scaling behavior, and membrane fluidity, so as to provide a two-dimensional self-organizing dynamic reaction environment for kinetic studies in the context of cell biology.
Marc Oliver Rieger, Optionen, Derivate und strukturierte Produkte - Ein Praxisbuch, Verlag Neue Zürcher Zeitung, Zürich, 2009-08-01. (Book/Research Monograph)
T Singer, H D Critchley, K Preuschoff, A common role of insula in feelings, empathy and uncertainty, Trends in Cognitive Sciences, Vol. 13 (8), 2009. (Journal Article)
Although accumulating evidence highlights a crucial role of the insular cortex in feelings, empathy and processing uncertainty in the context of decision making, neuroscientific models of affective learning and decision making have mostly focused on structures such as the amygdala and the striatum. Here, we propose a unifying model in which the insular cortex supports different levels of representation of current and predictive states, allowing for error-based learning of both feeling states and uncertainty. This information is then integrated into a general subjective feeling state which is modulated by individual preferences such as risk aversion and contextual appraisal. Such mechanisms could facilitate affective learning and regulation of body homeostasis, and could also guide decision making in complex and uncertain environments.
D Houser, D Schunk, Social environments with competitive pressure: Gender effects in the decisions of German schoolchildren, Journal of Economic Psychology, Vol. 30 (4), 2009. (Journal Article)
Systematic differences in decision making between genders have been discovered in both competitive and pro-social environments. These contexts, however, have been previously studied in isolation, while in naturally occurring settings pro-social and competitive pressures often overlap in economically meaningful ways. Here we report data from an experiment involving German schoolchildren where dictators are in one town and receivers in another. Our experiment informs decision making in social environments that include differing levels of competitive pressure. We find that competitive pressure significantly mitigates pro-sociality in boys, while it does not affect girls' propensities to make fair decisions. This finding is robust to controlling for social and cognitive factors, and it may shed additional light on the evolutionary roots of human social preferences.
Martin Barbie, A Kaul, The Zilcha criteria for dynamic inefficiency reconsidered, Economic Theory, Vol. 40 (2), 2009. (Journal Article)
D Schunk, J Winter, The relationship between risk attitudes and heuristics in search tasks: A laboratory experiment, Journal of Economic Behavior & Organization, Vol. 71 (2), 2009. (Journal Article)
Experimental studies of search behavior suggest that individuals stop searching earlier than the optimal, risk-neutral stopping rule predicts. Two different classes of decision rules could generate this behavior: rules that are optimal conditional on utility functions departing from risk neutrality, or heuristics derived from limited cognitive processing capacities and satisficing. To discriminate between these possibilities, we conduct an experiment that consists of a search task as well as a lottery task designed to elicit utility functions. We find that search heuristics are not related to measures of risk aversion, but to measures of loss aversion.
Rainer Winkelmann, Unemployment, social capital, and subjective well-being, Journal of Happiness Studies, Vol. 10 (4), 2009. (Journal Article)
It has been shown in past research that unemployment has a large negative impact on subjective well-being of individuals. In this paper, I explore whether and to what extent people with more social capital are sheltered from the harmful effects of unemployment. Using data from the German Socio-Economic Panel 1984–2004, I find that social capital is an important predictor of well-being levels, but there is no evidence that it moderates the effect of unemployment on well-being. Possible reasons for these findings are discussed, and suggestions for future research given.
Tobias Bannwart, OMORE - Private, Personal Movie Recommendations implemented in a Mozilla Firefox Add-on, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
Online stores and Web portals bring information about a myriad of items such as books, CDs, restaurants or movies to the user's fingertips. Although the Web reduces the barrier to this information, the user is overwhelmed by the number of available items. Therefore, recommender systems aim to guide the user to relevant items. Current recommender systems store user ratings on the server side, so the scope of the recommendations is limited to that server only. In addition, the user entrusts the operator of the server with valuable information about his preferences.
In this thesis, we introduce our recommender system OMORE, a private, personal movie recommender, which learns the user model based on the user's movie ratings. To preserve privacy, OMORE is implemented as a Mozilla Firefox add-on, which stores the user's ratings and the learned user model locally on the client side. Although OMORE makes use of the movie features provided by the movie pages on the IMDb Web site, it is not restricted to IMDb only. The current implementation covers movie pages from Amazon.com, Blockbuster, Netflix and Rotten Tomatoes.
Jef Van Loon, Refactorizer: Detecting Refactorings with Evolizer, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Master's Thesis)
Software systems have to evolve continually to meet their changing requirements. Refactorings are systematic, behavior-preserving changes to software systems intended to improve their internal structure. Today, refactoring is considered a state-of-the-art technique in software development.
Refactorings have a positive impact on the understandability and maintainability of a software system. Another important property is that their catalogued names capture the intent of the underlying changes and provide an accurate vocabulary for communicating changes at a much higher level of abstraction than the insertion or removal of individual lines of code.
A similar property has been attributed to the design patterns of Gamma et al.; giving meaningful names not only reveals the intent and structure of a pattern but also allows for efficient communication of design concepts. It is not only crucial to understand what changes have happened but also what changes need to be made in order to maintain the understandability and extensibility of a software system. Therefore, another goal of refactoring research is to develop approaches for anticipating and suggesting possible refactorings to developers, which requires understanding past changes and refactorings.
Based on the EVOLIZER framework, which offers advanced modeling and data extraction facilities, and FAMIX, a meta model for modeling the structure of object-oriented software systems, this thesis presents the approach, implementation and evaluation of REFACTORIZER, a prototype tool for detecting refactorings in the development history of a software system.
The main goals of the approach are to be intuitive and light-weight. This means that the approach builds on the vocabulary and understanding of a software developer about how programs are composed and uses a simple representation of the structure of a software system to formulate intuitive FAMIX-based heuristics.
Samuel Mezger, Untersuchung der Skalierbarkeit verschiedener Datenbankmanagementsysteme unter hohen Nutzerzahlen, August 2009. (Other Publication)
The work presented here describes measurements of transaction throughput for different database management systems, with a focus on concurrency control. The measurements were taken for IBM DB2 9.5, PostgreSQL 8.3 and Microsoft SQL Server 2008. During the measurements, the following parameters were varied to determine their effect on throughput: the isolation level, the amount of memory available for the database's buffer pool, the database's cardinality and the number of operations per transaction. When relating the measurement results to expectations based on theoretical principles, it is found that while some effects show up as expected, many phenomena have to be attributed to the specific implementations of the different database management systems. Expected results such as lock thrashing and throughput limitations due to I/O performance or the CPU's processing speed are apparent. Unexpectedly, I/O performance is a limiting factor not only when small database buffer pools are used, and lock thrashing affects each database management system in a different way. Furthermore, it is found that all of the examined systems can reach higher throughput at higher isolation levels. DB2 is noted to break connections when the database's buffer pool is chosen too large, while SQL Server does the same when the buffer pool is chosen too small. For PostgreSQL, transaction throughput is reduced when the level of concurrency is increased. This happens due to the multi-versioning protocol used by PostgreSQL, which leads to an increase in memory consumption under these conditions.
Saran Thierwächter, Untersuchungen zu altersbedingten Unterschiede bezüglich Flowerlebnis beim Lernen am Computer, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Bachelor's Thesis)
This paper examines the correlation between age and flow experience in computer-based learning activities. The study includes the definition of the age groups, the analysis of their flow experience and the identification of group-specific behavior patterns. In the experiment, the test persons had to play a Flash game; based on this activity, the flow and pulse values were recorded. The result was that all age groups achieved similar flow values. Nevertheless, differences between the age groups were found with respect to behavior patterns and pulse values. Additionally, a second experiment with a smaller sample was performed, in which the brain waves and the flow values were recorded and compared.
Jonas Minke, Developing a generic OpenGL Qt Viewer, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2009. (Bachelor's Thesis)
The aim of this work is the development of glooQtviewer, a generic 3D viewer based on OpenGL, a cross-platform application programming interface (API). This work also provides an overview of a few alternative frameworks that are presently available for such a project, and explains the criteria applied in selecting those used for this project. Furthermore, this work explores two special effects that can be achieved with OpenGL: shadowing and environment mapping. Finally, the glooQtviewer is discussed in detail and a few examples of rendering are presented.