Josef Falkinger, Volker Grossmann, Institutions and development: the interaction between trade regime and political system, Journal of Economic Growth, Vol. 10 (3), 2005. (Journal Article)
This paper argues that an unequal distribution of political power, biased toward landed elites and owners of natural resources, in combination with openness to trade is a major obstacle to the development of natural resource- or land-abundant economies. We develop a two-sector general equilibrium model and show that in an oligarchic society public investments conducive to industrialization - schooling, for example - are typically lower in an open than in a closed economy. Moreover, we find that, under openness to trade, development is faster in a democratic system. We also endogenize the trade regime and demonstrate that in a land-abundant economy the landed elite has an interest in supporting openness to trade. We present historical evidence for Southern economies in the Americas that is consistent with our theoretical results: resistance of landed elites to mass education, comparative advantages in primary goods production in the 19th-century globalization wave, and low primary school enrollment and literacy rates.

H Egger, P Egger, Labor market effects of outsourcing under industrial interdependence, International Review of Economics and Finance, Vol. 14 (3), 2005. (Journal Article)
The consequences of international outsourcing in traditional models of trade are already well understood. With regard to empirical research, however, some important shortcomings remain. Empirical studies on the labor market effects of outsourcing are mainly based on the same techniques that have been used for years. In terms of the adopted econometric specifications, one assumption is typical and, as we will show, critical in this regard. Practically all studies we are aware of assume independence between industries and neglect any spillover and feedback effects across industries. This is at odds with multi-sector general equilibrium models of trade. The focus of this paper is to relax this restrictive assumption and to suggest the use of different econometric methods. We consider national input-output linkages and cross-industry flows of workers as two important channels of inter-industry spillovers in labor market effects. We examine these transmission channels in an Austrian panel data set of 21 two-digit industries in the 1990s and find that industrial interdependencies induce a multiplier effect for changes in industry-specific variables such as international outsourcing. Disregarding spillover effects therefore leads to a substantial underestimation of the labor market implications of international outsourcing.

M Breuer, Multiple losses, ex ante moral hazard, and the implications for umbrella policies, Journal of Risk and Insurance, Vol. 72 (4), 2005. (Journal Article)
Under certain cost conditions, the optimal insurance policy offers full coverage above a deductible, as Arrow and others have shown. However, many insurance policies currently provide coverage against several losses, even though the insured's possibilities to affect the loss probabilities through several prevention activities (multiple moral hazard) differ substantially. This article shows that optimal contracts under multiple moral hazard generally call for complex reimbursement schedules. It also examines the conditions under which different types of risks can optimally be covered by a single insurance policy and argues that the case for umbrella policies under multiple moral hazard is limited in practice.

Beat Hotz-Hart, Carsten Küchler, Neue Dynamik im schweizerischen Technologieportfolio, Die Volkswirtschaft, Vol. 78 (1/2), 2005. (Journal Article)
The Federal Office for Professional Education and Technology (BBT), in cooperation with the Fraunhofer Institute for Systems and Innovation Research (FhG-ISI), periodically surveys the patent applications filed from Switzerland as well as those filed by Swiss companies worldwide. From these data a technology portfolio can be derived, which maps the entirety of the technical capabilities and achievements available to a country and its companies. The most recent survey shows an overall high level of Swiss innovation activity, with SMEs participating strongly in this development.

H Egger, V Grossmann, Non-routine tasks, restructuring of firms, and wage inequality within and between skill-groups, Journal of Economics, Vol. 86 (3), 2005. (Journal Article)
This paper argues that endogenous restructuring processes within firms towards analytical and interactive non-routine tasks (like problem-solving and organizational activities, respectively), triggered by advances in information and communication technologies (ICT) and the rising supply of educated workers, are associated with an increase in wage inequality within education groups. We show that this may be accompanied by a decline or stagnation of between-group wage dispersion. The mechanisms proposed in this research are consistent not only with the evolution of the distribution of wages in advanced countries, but also with the evolution of task composition in firms and a frequently confirmed complementarity between skill-upgrading, new technologies and knowledge-based work organization.

R Foellmi, Urs Meister, Product-Market Competition in the Water Industry: Voluntary Non-discriminatory Pricing, Journal of Industry, Competition and Trade, Vol. 5 (2), 2005. (Journal Article)
Since franchise bidding in the piped water industry is problematic due to extensive investment requirements, product-market competition or common carriage is a valuable alternative for the introduction of competition. This paper analyses product-market competition by considering a simple model of interconnection in which competition is introduced between vertically integrated neighbouring water suppliers. The model contains water market specificities such as local and decentralised networks and the related difficulties of regulating access charges. Even without any regulation, we show that: (i) an inefficient incumbent will give up its monopoly position and lower the access price far enough that the low-cost competitor can enter its home market; (ii) efficiency of production will rise due to liberalisation; and (iii) contrary to prejudicial claims, investment incentives are not destroyed by the introduction of competition in the market. Investments of low-cost firms may even increase.

Rainer Winkelmann, Subjective well-being and the family: results from an ordered probit model with multiple random effects, Empirical Economics, Vol. 30 (3), 2005. (Journal Article)
The previous literature on the determinants of individual well-being has failed to fully account for the interdependencies in well-being at the family level. This paper develops an ordered probit model with multiple random effects that makes it possible to identify the intra-family correlation in well-being. The parameters of the model can be estimated with panel data using Maximum Marginal Likelihood. The approach is illustrated in an application using data for the period 1984-1997 from the German Socio-Economic Panel, in which both inter-generational and intra-marriage correlations in well-being are estimated.

Barbara Good, Technologie zwischen Markt und Staat: Die Kommission für Technologie und Innovation und die Wirksamkeit ihrer Förderung, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2005. (Dissertation)
This book focuses on the Swiss Commission for Technology and Innovation (KTI), which supports collaborative R&D projects between a research institution and a firm. The questions of interest are what effects KTI funding has and how these effects come about. To answer them, the study first presents theoretical considerations on the effects and impact mechanisms of research, technology and innovation funding. It then undertakes a meta-evaluation of fourteen earlier evaluation studies of KTI funding, which serves as quality assurance for the subsequent evaluation synthesis, in which the effects identified in the existing evaluation studies are compiled. In a final empirical step, an original impact analysis is carried out, based on the theoretical and empirical findings gathered up to that point.

Sandra Hopkins, Peter Zweifel, The Australian health policy changes of 1999 and 2000: an evaluation, Applied Health Economics and Health Policy, Vol. 4 (4), 2005. (Journal Article)
This article evaluates three measures introduced by the Australian Federal Government in 1999 and 2000 that were designed to encourage private health insurance and relieve financial pressure on the public healthcare sector. These policy changes were (i) a 30% premium rebate, (ii) health insurers offering lifetime enrolment on existing terms and the future relaxation of premium regulation by permitting premiums to increase with age, and (iii) a mandate for insurers to offer complementary coverage for bridging the gap between actual hospital billings and benefits paid.
These measures were first evaluated in terms of expected benefits and costs at the individual level. In terms of this first set of criteria, the policy changes as a whole may have been efficiency-increasing. The Australian Government mandate to launch gap policies may well have created a spillover moral hazard effect to the extent that full insurance coverage encouraged policy holders to also use more public hospital services, thus undermining the government's stated objective of relieving public hospitals from demand pressure. Without this spillover moral hazard effect, there might have been a reduction in waiting times in the public sector. Second, the measures were evaluated in terms of additional benchmarks: the cost to the public purse, access and equity, and dynamic efficiency. Although the policy changes were found to be largely justifiable on the first set of criteria, they do not appear to be justifiable based on the second set. Uncertainties and doubts remain about the effect of the policy changes in terms of overall cost, access and equity, and dynamic efficiency. This is a common experience in countries that have considered shifting their healthcare systems between the private and public sectors.

H Egger, P Egger, The determinants of EU processing trade, World Economy, Vol. 28 (2), 2005. (Journal Article)
This paper assesses the determinants of European outward and inward processing trade, distinguishing between size, relative factor endowment, (other) cost factors and infrastructure variables. Using a large panel of bilateral processing trade flows of the EU12 countries at the aggregate level over the period 1988-1999, we find that infrastructure variables, relative factor endowments and other cost variables are important determinants of the EU's outward processing trade. Costs also play a key role in the EU's inward processing trade.

H Egger, V Grossmann, The double role of skilled labor, new technologies and wage inequality, Metroeconomica, Vol. 56 (1), 2005. (Journal Article)
We examine the relationship between the supply of skilled labor, technological change and relative wages. Accounting for the role of skilled labor in both production activities and productivity-enhancing "support" activities, we derive the following results. First, an increase in the supply of skilled labor raises the employment share of non-production labor within firms without lowering relative wages. Second, new technologies raise wage inequality only insofar as they give firms incentives to reallocate skilled labor towards non-production activities. In contrast, skill-biased technological change of the sort usually considered in the literature does not affect wage inequality.

Hannes Egli, The environmental Kuznets Curve: theory and evidence, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2005. (Dissertation)
|
|
S Buehler, The promise and pitfalls of restructuring network industries, German Economic Review, Vol. 6 (2), 2005. (Journal Article)
This paper examines the competitive effects of reorganizing a network industry's vertical structure. In this industry, an upstream monopolist operates a network used as an input to produce horizontally differentiated final products that are imperfect substitutes. Three potential pitfalls of restructuring integrated network industries are analyzed: (i) double marginalization, (ii) underinvestment and (iii) vertical foreclosure. The paper studies the net effect of restructuring on retail prices and cost-reducing investment and discusses policy implications.
|
Christian Ewerhart, The pure theory of multilateral obligations, Journal of Institutional and Theoretical Economics JITE, Vol. 161 (2), 2005. (Journal Article)
|
|
Beat Fluri, Harald Gall, Martin Pinzger, Fine-Grained Analysis of Change Couplings, In: Proceedings of the 5th International Workshop on Source Code Analysis and Manipulation, IEEE Computer Society, January 2005. (Conference or Workshop Paper)
In software evolution analysis, many approaches analyze release history data available through versioning systems. Recent investigations of CVS data have shown that files which are frequently committed together reveal change couplings. However, CVS stores modifications on a textual basis and does not track structural changes, such as the insertion, removal, or modification of methods or classes. Current approaches therefore do not perform a detailed analysis of whether change couplings are caused by source code couplings or by other textual modifications, such as updates in license terms.
The focus of this paper is on adding structural change information to existing release history data. We present an approach that uses the structure compare services shipped with the Eclipse IDE to obtain the corresponding fine-grained changes between two subsequent versions of any Java class. This information supports filtering those change couplings that result from structural changes, allowing us to distill the causes of change couplings along releases and to single out those that are structurally relevant. A first validation of our approach with a medium-sized open source software system showed that a considerable number of change couplings are not caused by source code changes.
|
Gerald Reif, Harald Gall, Mehdi Jazayeri, WEESA - Web Engineering for Semantic Web Applications, In: Proceedings of the 14th International World Wide Web Conference, Chiba, Japan, January 2005. (Conference or Workshop Paper)
The success of the Semantic Web crucially depends on the existence of Web pages that provide machine-understandable meta-data. This meta-data is typically added in the semantic annotation process, which is currently not part of the Web engineering process. Web engineering proposes methodologies to design, implement and maintain Web applications, but these methodologies lack the generation of meta-data. In this paper we introduce a technique to extend existing Web engineering methodologies to develop semantically annotated Web pages. The novelty of this approach is the definition of a mapping from XML Schema to ontologies, called WEESA, that can be used to automatically generate RDF meta-data from XML content documents. We further show how we integrated the WEESA mapping into an Apache Cocoon transformer to easily extend XML-based Web applications to semantically annotated Web applications.
|
Gerald Reif, WEESA - Web Engineering for Semantic Web Applications, TU Vienna, 2005. (Dissertation)
In the last decade, the increasing popularity of the World Wide Web has led to an exponential growth in the number of pages available on the Web. This huge number of Web pages makes it increasingly difficult for users to find the information they need. In searching the Web for specific information, one gets lost in the vast number of irrelevant search results and may miss relevant material. Current Web applications provide Web pages in HTML format, representing the content in natural language only, so the semantics of the content is not accessible to machines. To enable machines to support the user in solving information problems, the Semantic Web proposes an extension to the existing Web that makes the semantics of Web pages machine-processable. The semantics of the information on a Web page is formalized using RDF meta-data describing the meaning of the content. The existence of semantically annotated Web pages is therefore crucial in bringing the Semantic Web into existence.

Semantic annotation addresses this problem and aims to turn human-understandable content into a machine-processable form by adding semantic markup. Many tools have been developed that support the user during the annotation process. The annotation process, however, is a separate task and is not integrated into the Web engineering process. Web engineering proposes methodologies to design, implement and maintain Web applications, but these methodologies lack the generation of meta-data.

In this thesis we introduce a technique to extend existing XML-based Web engineering methodologies to develop semantically annotated Web pages. The novelty of this approach is the definition of a mapping from XML Schema to ontologies, called WEESA, that can be used to automatically generate RDF meta-data from XML content documents. We further demonstrate the integration of the WEESA meta-data generator into the Apache Cocoon Web development framework to easily extend XML-based Web applications to semantically annotated Web applications.

Looking at the meta-data of a single Web page gives only a limited view of the information available in a Web application. For querying and reasoning purposes it is better to have the full meta-data model of the whole Web application at hand as a knowledge base. In this thesis we introduce the WEESA knowledge base, which is generated on the server side by accumulating the meta-data from individual Web pages. The WEESA knowledge base is then offered to software agents for download and querying.

Finally, the Vienna International Festival industry case study illustrates the use of WEESA within an Apache Cocoon Web application in real life. We discuss the lessons learned while implementing the case study and give guidelines for developing Semantic Web applications using WEESA.
|
Michele Lanza, Stephane Ducasse, Harald Gall, Martin Pinzger, CodeCrawler: An Information Visualization Tool for Program Comprehension, In: Proceedings of the 27th International Conference on Software Engineering, ACM, St. Louis, MO, USA, 2005. (Conference or Workshop Paper)
CODECRAWLER is a language-independent, interactive software visualization tool. It is mainly targeted at visualizing object-oriented software and, in its newest implementation, has become a general-purpose information visualization tool. It has been successfully validated in several industrial case studies over the past few years. CODECRAWLER strongly adheres to lightweight principles: it implements and visualizes polymetric views, visualizations of software enriched with information such as software metrics and other source code semantics. CODECRAWLER is built on top of Moose, an extensible, language-independent reengineering environment that implements the FAMIX metamodel.
|
Martin Pinzger, Harald Gall, Michael Fischer, Michele Lanza, Visualizing multiple evolution metrics, In: Proceedings of the ACM Symposium on Software Visualization (SoftVis'2005), ACM, St. Louis, Missouri, USA, 2005. (Conference or Workshop Paper)
Observing the evolution of very large software systems requires the analysis of large, complex data models and the visualization of condensed views of the system. Software metrics have been used to compute such condensed views for visualization. However, current techniques concentrate on visualizing data of one particular release, providing only insufficient support for visualizing data of several releases. In this paper we present the RelVis visualization approach, which provides integrated, condensed graphical views of source code and release history data of up to n releases. Metric values of source code entities and relationships are composed in Kiviat diagrams as annual rings. The diagrams highlight the good and bad times of an entity and facilitate the identification of entities and relationships with critical trends; these represent potential refactoring candidates that should be addressed before further evolving the system. The paper provides the necessary background information and an evaluation of the approach on a large open source software project.
|
Jens Knodel, Isabel John, Dharmalingam Ganesan, Martin Pinzger, Fernando Usero, Jose L. Arciniegas, Claudio Riva, Asset Recovery and Incorporation into Product Lines, In: Proceedings of the 12th IEEE Working Conference on Reverse Engineering, IEEE Computer Society, Pittsburgh, Pennsylvania, USA, January 2005. (Conference or Workshop Paper)
Software product lines aim at having a common platform from which several similar products can be derived. The elements of the platform are called assets, and they are managed in an asset base that is part of the product line infrastructure. The products are then built on top of the assets. Assets can include in-house developments, open source or third-party software modules, as well as design and project documents. In the context of the Europe-wide project FAMILIES, we concentrated on techniques used to build the platform, with a focus on recovering these assets from existing systems. We present an approach for incorporating existing assets into the product line infrastructure, explicitly distinguishing the asset origins and the different information sources available. The incorporation is a quality-driven process that is backed up by a set of reverse engineering techniques to evaluate an asset's internal quality. The quality assessment of an asset is the critical measurement for industrial development organizations when incorporating assets into their product line infrastructure.
|