Jacek Ratzinger, Michael Fischer, Harald Gall, EvoLens: Lens-View Visualizations of Evolution Data, In: Proceedings of the 8th International Workshop on Principles of Software Evolution, 2005. (Conference or Workshop Paper)
Observing the evolution of very large software systems is difficult because of the sheer amount of information that needs to be analyzed and because the changes performed in the system are at a very low granularity level. Recent approaches have used software metrics to compute condensed graphical visualizations of these data. However, most techniques concentrate on visualizing data of one particular release, providing insufficient support for visualizing data across several selected releases. In this paper we present the RelVis visualization approach, which provides integrated, condensed graphical views on source code and release history data of up to n releases of a software system. Metric measurements of n releases are composed into views that enable viewers to spot trends in the metrics of source code entities and relationships. Critical trends are highlighted, allowing the user to direct perfective maintenance activities to the source code entities involved. The paper provides the necessary background information and an evaluation of the approach with a large open source software project. |
|
13th International Workshop on Program Comprehension, Edited by: Jonathan I. Maletic, James R. Cordy, Harald Gall, St. Louis, Missouri, USA, 2005. (Proceedings)
|
|
Yong Xia, A Language Definition Method for Visual Specification Languages, Universität Zürich, 2005. (Dissertation)
Language definition has always been an important topic in the field of computer science. For textual specification and programming languages, many mature definition methods already exist. However, for visual specification languages, especially the so-called wide spectrum graphical modeling languages, the existing solutions are far from satisfactory.
ADORA is a modeling technique for requirements and software architecture that is being developed in the Requirements Engineering Research Group, University of Zurich. The acronym stands for Analysis and Description of Requirements and Architecture. ADORA and UML are typical examples of wide spectrum graphical modeling languages.
In this dissertation, we propose a new method of defining visual specification languages, which shall overcome the drawbacks of the existing definition methods. The ADORA Language, which is used to model system requirements and software architecture, is selected as a vehicle for demonstrating our method.
The following four aspects of language definition are particularly addressed:
· Syntax and static semantics definition
A text-based technique is given for the syntax and static semantics definition. We exploit the fact that in a visual specification language, most syntactic features are independent of the layout of the graph. We therefore map the graphical elements to textual ones and define the context-free syntax of this textual language in EBNF. Through this mapping, the grammar also defines the syntax of the graphical language. Simple spatial and context-sensitive constraints are then added by attributing the context-free grammar. Finally, for handling complex structural and dynamic information in the syntax, most of which is called static semantics in the literature, we give a set of operational rules that work on the attributed EBNF.
We also extend this set of operational rules to specify an advanced feature of ADORA, which supports the description of partial and evolutionary specifications.
· Dynamic semantics definition
We propose a strategy for the dynamic semantics definition of ADORA: instead of defining a formal semantics for the whole language in one step, which can be too complex to be understood and used, we first define a semantics for each simple sublanguage. Then an integration semantics, which specifies the interrelations and constraints among the sublanguages, is defined to keep the language self-consistent. Examples of the integration semantics are given to show how our strategy works.
Our solution is also valid for other wide spectrum graphical modeling languages.
· Refinement calculus
We define a formal refinement calculus for the structural view, behavioral view and user view of the ADORA language. This ensures that evolutionary specifications are written in a controlled and systematic way, while the consistency and integrity checking during the system refinement can be mechanically carried out by a tool based on our formal definition.
· The reciprocal influences between the language definition and language design
During the process of formal definition of ADORA, we also continue to design the language (e.g. the extension of ADORA for supporting partial and evolutionary specification). We conclude that language design and language definition are not two independent tasks. The study of the reciprocal influences between these two tasks improves the quality of the language definition. A set of principles derived from our development of ADORA is presented.
Our methods can also be used to define UML or other wide-spectrum modeling languages. Compared with existing work on graphical modeling language definition, our language definition is more understandable, easier to apply, and practically more useful. The results achieved on the refinement calculus and the syntax and semantics definitions are also valuable for similar research.
At the end of this dissertation, limitations, open questions and future work are discussed. |
|
Philippe Tobler, Christopher D Fiorillo, Wolfram Schultz, Adaptive coding of reward value by dopamine neurons, Science, Vol. 307 (5715), 2005. (Journal Article)
It is important for animals to estimate the value of rewards as accurately as possible. Because the number of potential reward values is very large, it is necessary that the brain's limited resources be allocated so as to discriminate better among more likely reward outcomes at the expense of less likely outcomes. We found that midbrain dopamine neurons rapidly adapted to the information provided by reward-predicting stimuli. Responses shifted relative to the expected reward value, and the gain adjusted to the variance of reward value. In this way, dopamine neurons maintained their reward sensitivity over a large range of reward values. |
|
David Dorn, Alfonso Sousa-Poza, The determinants of early retirement in Switzerland, Swiss Journal of Economics and Statistics = Schweizerische Zeitschrift für Volkswirtschaft und Statistik, Vol. 141 (2), 2005. (Journal Article)
In the past decade, Switzerland has experienced a large increase in the number of individuals going into early retirement. This paper examines the determinants of such early retirement using data from the newly implemented social-security module of the 2002 Swiss Labor Force Survey. In the sixteen-month period from January 2001 to April 2002, more than 36,000 older individuals, representing 8% of all workers within nine years of legal retirement age, became early retirees. One of the most important determinants of early retirement is the wage rate, yet its effect is not linear: both high and low wages reduce the probability. Other factors that play an important role include partner's employment status, education, industry, occupation, and coverage in the three social-security pillars. A major finding of this study is that about 30% of all early retirees continue working after retirement, mostly for the same employer. |
|
Yves Schneider, Peter Zweifel, How much internalization of nuclear risk through liability insurance?, Journal of Risk and Uncertainty, Vol. 29 (3), 2004. (Journal Article)
An important source of conflict surrounding nuclear energy is that with a very small probability, a large-scale nuclear accident may occur. One way to internalize the associated financial risks is through mandating nuclear operators to have liability insurance. This paper presents estimates of consumers' willingness to pay for increased financial security provided by an extension of coverage, based on the "stated choice" approach. A Swiss citizen with median characteristics may be willing to pay 0.14 US cents per kWh to increase coverage beyond the current CHF 0.7 billion (bn.) (USD 0.47 bn.). Marginal willingness to pay declines with higher coverage but exceeds marginal cost at least up to CHF 4 bn. (USD 2.7 bn.). An extension of nuclear liability insurance coverage may therefore be efficiency-enhancing. |
|
Hans Degryse, Steven Ongena, The Impact of Technology and Regulation on the Geographical Scope of Banking, Oxford Review of Economic Policy, Vol. 20 (4), 2004. (Journal Article)
We review how technological advances and changes in regulation may shape the (future) geographical scope of banking. We first review how both physical distance and the presence of borders currently affect bank lending conditions (loan pricing and credit availability) and market presence (branching and servicing). Next we discuss how technology and regulation have altered this impact and analyse the current state of the European banking sector. We discuss both theoretical contributions and empirical work and highlight open questions along the way. We draw three main lessons from the current theoretical and empirical literature: (i) bank lending to small businesses in Europe may be characterized both by (local) spatial pricing and resilient (regional and/or national) market segmentation; (ii) because of informational asymmetries in the retail market, bank mergers and acquisitions seem the optimal route of entering another market, long before cross-border servicing or direct entry are economically feasible; and (iii) current technological and regulatory developments may, to a large extent, remain impotent in further dismantling the various residual but mutually reinforcing frictions in the retail banking markets in Europe. We conclude the paper by offering pertinent policy recommendations based on these three lessons. |
|
Abraham Bernstein, Esther Kaufmann, Christoph Bürki, Mark Klein, Object Similarity in Ontologies: A Foundation for Business Intelligence Systems and High-performance Retrieval, In: Twenty-Fifth International Conference on Information Systems, December 2004. (Conference or Workshop Paper)
Finding good algorithms for assessing the similarity of complex objects in ontologies is central to the functioning of techniques such as retrieval, matchmaking, clustering, data mining, semantic sense disambiguation, ontology translation, and simple object comparison. These techniques provide the basis for supporting a wide variety of business intelligence computing tasks, such as finding a process in a best practice repository, finding a suitable service provider/outsourcing partner for agile process enactment, dynamic customer segmentation, semantic web applications, and systems integration. To our knowledge, however, there exists no study that systematically compares the prediction quality of ontology-based similarity measures. This paper assembles a catalogue of ontology-based similarity measures that are (partially) adapted from related domains. These measures are compared to each other within a large, real-world best practice ontology as well as evaluated in a realistic business process retrieval scenario. We find that different similarity algorithms reflect different notions of similarity. We also show how a combination of similarity measures can be used to improve both precision and recall of an ontology-based, query-by-example style, object retrieval approach. Combining the study's findings with the literature, we argue for extended studies to assemble a more complete catalogue of object similarity measures that can be evaluated in many applications and ontologies. |
|
Abraham Bernstein, Esther Kaufmann, Norbert E. Fuchs, June von Bonin, Talking to the Semantic Web -- A Controlled English Query Interface for Ontologies, In: 14th Workshop on Information Technology and Systems, December 2004. (Conference or Workshop Paper)
The semantic web presents the vision of a distributed, dynamically growing knowledge base founded on formal logic. Common users, however, seem to have problems even with the simplest Boolean expression. As queries from web search engines show, the great majority of users simply do not use Boolean expressions. So how can we help users to query a web of logic that they do not seem to understand?
We address this problem by presenting a natural language front-end to semantic web querying. The front-end allows users to formulate queries in Attempto Controlled English (ACE), a subset of natural English. Each ACE query is translated into a discourse representation structure, a variant of the language of first-order logic, that is then translated into the semantic web querying language PQL. As examples show, our approach offers great potential for bridging the gap between the semantic web and its real-world users, since it allows users to query the semantic web without having to learn an unfamiliar formal language. |
|
Trea Laske-Aldershof, Erik Schut, Konstantin Beck, Stefan Gress, Amir Shmueli, Carine Van de Voorde, Consumer mobility in social health insurance markets: a five-country comparison, Applied Health Economics and Health Policy, Vol. 3 (4), 2004. (Journal Article)
During the 1990s, the social health insurance schemes of Germany, the Netherlands, Switzerland, Belgium and Israel were significantly reformed by the introduction of freedom of choice (open enrolment) of health insurer. This was introduced alongside a system of risk adjustment to compensate health insurers for enrolees with predictable high medical expenses. Despite the similarity in the health insurance reforms in these countries, we find that both the rationale behind these reforms and their impact on consumer choice vary widely.
In this article we seek to explain the observed variation in switching rates by cross-country comparison of the potential determinants of health insurer choice. We conclude that differences in choice setting, and in the net benefits of switching, offer a plausible explanation for the large differences in consumer mobility.
Finally, we discuss the policy implications of our cross-country comparison. We argue that the optimal switching rate crucially depends on the goals of the reforms and the quality of the risk-adjustment system. In view of this, we conclude that switching rates are currently too low in the Netherlands, and an active government policy to encourage consumer mobility seems warranted. In Germany and Switzerland, high switching rates call for an improvement of the rather poor risk-adjustment systems. Given low switching rates in Israel and Belgium, improving risk adjustment is less urgent, but still required in the long run. |
|
Daniel Fasnacht, Der neue Schub kommt oft unerwartet, In: Neue Zürcher Zeitung , 274, p. 11, 23 November 2004. (Newspaper Article)
|
|
Thomas Gschwind, Martin Pinzger, Harald Gall, TUAnalyzer--Analyzing Templates in C++ Code, In: Proceedings of the 11th Working Conference on Reverse Engineering (WCRE 2004), IEEE Computer Society, November 2004. (Conference or Workshop Paper published in Proceedings)
In this paper, we present TUAnalyzer, a novel tool that extracts the template structure of C++ programs on the basis of the GNU C/C++ Compiler's internal representation of a C/C++ translation unit. In comparison to other such tools, our tool can extract function invocations that depend on a particular instantiation of C++ templates and relate them to that template instantiation. TUAnalyzer produces output in RSF format that can easily be fed into existing visualization and analysis tools such as Rigi or Graphviz. We motivate why this kind of template analysis information is essential to understand real-world legacy C++ applications. We present how our tool extracts this information to allow others to build on our results and further use the template information. The applicability of our tool has been validated on real code as proof of concept. The results obtained with TUAnalyzer enable us and others to perform detailed studies of large (open source) C/C++ projects. |
|
Peter Zweifel, Improved risk information, the demand for cigarettes, and anti-tobacco policy, Journal of Risk and Uncertainty, Vol. 23 (3), 2004. (Journal Article)
This paper sets out to develop a simple microeconomic model designed to shed light on behavioral change induced by improved information about smoking risks. It predicts the conditions in which improved information indeed increases the demand for cigarettes. After recalling the economic rationale of an anti-tobacco policy, the article points out a few startling implications of improved information about the risks of smoking. |
|
Mathias Hoffmann, International capital mobility in the long run and the short run: can we still learn from saving–investment data?, Journal of International Money and Finance, Vol. 23 (1), 2004. (Journal Article)
The idea of learning about international capital mobility from saving and investment data remains appealing. Our approach is based on VAR methods and overcomes some of the problems associated with saving–investment regressions when the data are non-stationary. We propose a new measure of long-run capital mobility that can be easily calculated as a by-product of the estimation procedure of a cointegrated VAR. In an application to historical US and British data, we find long-run capital mobility to have been remarkably stable over the century, whereas variations in the mobility of capital primarily seem to have affected short-run capital flows. |
|
Andrea Schenker-Wicki, Qualität messen – Qualität managen: Leistungsparameter im Studium, In: Jahreskonferenz des Projekts Qualitätssicherung, 2004-10-26. (Conference or Workshop Paper)
|
|
Reto Föllmi, Urs Meister, Konsum hängt nicht von Bahn und Bus ab. Kritik an Studie zum Nutzen des öffentlichen Verkehrs, In: Neue Zürcher Zeitung, 248, p. 17, 23 October 2004. (Newspaper Article)
|
|
Günter Müller, Torsten Eymann, Norbert Nopper, Sven Seuken, EMIKA System: Architecture and Prototypic Realization, In: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics (SMC), The Hague, The Netherlands, 2004. (Conference or Workshop Paper published in Proceedings)
Life-critical applications in hospital environments have special requirements concerning IT support: real-time cooperation is necessary to ensure a continuous workflow. This article describes the prototypic realization of the EMIKA project, a real-time controlled mobile information system for clinical applications. The EMIKA architecture comprises three layers: first, the Communication Layer provides wireless device interaction. Second, the Middleware Layer establishes a common service platform for automated service provision and access. Finally, the Application Layer implements a multi-agent system for the real-time coordination needed for a self-organizing patient logistics environment. |
|
Andrea Schenker-Wicki, Marco Demont, MBA Rankings unter der Lupe: Rankings hinterfragen, In: Alpha - Der Kadermarkt der Schweiz, p. ?, 9 October 2004. (Newspaper Article)
|
|
Thorsten Hens, Beate Pilgrim, Sunspot equilibria and the transfer paradox, Economic Theory, Vol. 24 (3), 2004. (Journal Article)
We show that for international economies with two countries, in which agents have additively separable utility functions, the existence of sunspot equilibria is equivalent to the occurrence of the transfer paradox. This equivalence enables us to provide some new insights on the relation of the existence of sunspot equilibria and the multiplicity of spot market equilibria. |
|
Helmut Schauer, Interactive Learning, In: Netties 2004 Conference, Budapest, Hungary, October 2004. (Conference or Workshop Paper)
|
|