Stefan Schurgast, Markov logic inference on signal/collect, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Master's Thesis)
Over the last several years, the vision of a Semantic Web has gained support from a vast range of different
fields of application. Meanwhile, a large number of datasets is available, increasingly interlinked
with each other and ready to be analyzed. But RDFS/OWL and SWRL, the standard
languages for representing ontological knowledge and rules in RDF, suffer from their limited
expressiveness. Markov logic provides a good solution to this problem by putting weights on
formulas, generalizing first-order logic with a probabilistic approach that also allows contradictory
rules.
This thesis implements and evaluates loopy belief propagation on Markov networks using the
Signal/Collect framework, providing an elegant and yet highly efficient solution
for use in further applications. |
|
Alessandra Macri, Market activities constituting jurisdiction or applicable law in electronic service contracts with international connection, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
Connecting factors constituting jurisdiction and applicable law in international contracts in
Europe, the U.S. and China can be sub-divided into two categories: 1) Those related to a
party's presence in a state, and 2) Actions a party performs in a state. Of special importance
for online contracts are actions taking place before contract formation and
constituting an invitation to treat, i.e. provision of an online shop and advertising. Online
service providers can avoid litigation abroad by demonstrating a clear targeting intent
towards a state. The thesis discusses how businesses can show a clear targeting intent,
whether courts will recognize it, and how rulings could be adapted to better comply
with the characteristics of online business. |
|
Stefan Badertscher, Nutzungsbarrieren von IT in der Anlageberatung bei Schweizer Banken, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
Today's information technology has widely pervaded banks in Switzerland, and Swiss banks
are highly dependent on its use. Financial advisory, however, is an exception
where IT is hardly used today: customer meetings are held without IT support.
So far, no detailed reasons have been identified that could explain this circumstance.
The pervasive use of IT in financial advisory can be seen as the adoption of an innovation.
Hence, this thesis discusses the adoption of innovations theoretically, using
common technology acceptance models and theories that explain the diffusion and
adoption of innovations. Based on a literature review, barriers that prevent the use
of technology in companies are identified. Building on this review, potential barriers
that may prevent the use of IT in the financial advisory of Swiss banks are classified. |
|
Serge Hänni, Interactive feature detector for biomedical structural analysis, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
Histological dental growth analyses are conducted to obtain various biological, anthropological and forensic conclusions,
ranging from phylogenetic insights to temporal estimates such as the individual's age at death. These analyses rely
on manual counting and measurement of certain growth structures found in dental hard tissues. As the analyses
are tedious to perform and prone to observer errors, computational tools and algorithms are needed to facilitate
their execution and increase their reliability. To cope with this problem, concepts that support a high degree
of user interactivity with a software framework are developed. The implementation allows dental structures
on digital images to be annotated manually in several ways and includes a first approach to the automatic detection of
these structures. It is shown that the semi-automatic detection needs further improvement and that
additional tools are needed to simplify studies and their reproducibility among researchers. |
|
Alessandro Vagliardo, MidPerm - Realisierung einer Provenance SQL-Erweiterung als Middleware Lösung, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Master's Thesis)
Suppose I have bought a bottle of milk and now want to know from which farm and cows the milk
came. Such questions can be answered by provenance applications, but there are only a few
general solutions that every database can use.
MidPerm is a first attempt to provide such a solution. MidPerm extracts the Perm module, which
is integrated into the PostgreSQL database and calculates the provenance data for a SQL query,
into a middleware. A JDBC interface allows communication with MidPerm; through this interface,
MidPerm enables all databases with a JDBC interface to use the provenance calculation. |
|
Kevin Massie, Design und Implementierung eines Prototyps zur teilautomatisierten Bestimmung des Gerichtsstands und des anwendbaren Rechts, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Master's Thesis)
Even though international contracts for commercially provided electronic services on the Internet usually contain provisions on jurisdiction and applicable law, these provisions are often determined in a legally non-compliant way.
The legal frame to comply with when determining jurisdiction and applicable law for an international contract in civil and commercial matters is laid out by the respective provisions of private international law (PIL), which are not easy to understand, as the procedure to determine jurisdiction and applicable law is difficult to handle.
Hence, this thesis develops a prototype for the semi-automated determination of jurisdiction and applicable law to support the contract parties.
To implement the prototype, the respective PIL is modeled first. Finally, the prototype is evaluated by validating its functionality. |
|
Urs Zoller, Ein Vergleich dreier verschiedener Konzepte zur Anforderungsmodellierung von Software Produktlinien: eine Fallstudien-gestuetzte Untersuchung, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
Software product line engineering is a field of increasing importance in the area of software engineering.
The goal is to exploit commonalities between software products and thereby increase development
efficiency and quality. Requirements engineering is central to systematically creating the product line as a whole
and to generating concrete product specifications.
In research and industry, different concepts exist for such requirements modeling, but so far there is
no scientific evidence on the advantages and disadvantages of these concepts.
This Bachelor thesis contributes to reducing this gap. An existing case study of a software product line was
systematically modeled with different concepts in three modern tools. The models were then analyzed in detail,
many quantitative and qualitative measurements were taken, and these were used to derive concrete
strengths and weaknesses of the different concepts. The results will be useful for industry to guide decision
processes for selecting a concrete modeling tool, and also for tool developers to understand the potential and
opportunities for improvement of the concepts and tools. |
|
Sacha Gilgen, Mobile healthcare on Android devices, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
Recent technological advances in wireless sensors and mobile communication enable new
types of healthcare systems. Especially the availability of small, lightweight and ultra-low
power sensors opens new possibilities for continuously monitoring human biomedical
data and therefore allows early detection of potential illness.
This bachelor thesis presents a prototype mobile healthcare system intended to monitor
a patient's ECG data. The system consists of three parts: a wearable sensor that
records ECG data on a patient, a mobile phone that acts as a gateway by forwarding
the data received from the sensor, and finally a server containing a visualization module
to graphically display the recorded ECG data. Based on the displayed ECG, doctors are
able to make basic medical diagnoses for their patients. |
|
Roger Peyer, Design and prototypical implementation of a secure distributed electronic program guide for LiveShift, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
P2P networks are gaining importance in applications that need a large amount
of resources. Since on-demand TV applications are very bandwidth-intensive, LiveShift
has been developed. LiveShift is a fully distributed P2P video streaming application that
allows each peer to publish a channel. The published channels can be watched in a time-shifted
manner by any peer within the network, since the P2P network serves as the storage
component. To locate video streams on the network, a basic electronic program guide
(EPG) has been implemented. However, the EPG lacks a way to administer certain
EPG data privately, such as the schedule of a channel; the modification of a channel
schedule should be restricted to the channel publisher only. In addition, peers have no way
to create EPG data collaboratively. This paper discusses the design
and implementation of a secure distributed EPG. The Design chapter explains the security
concepts needed to allow a peer to administer EPG data privately and the mechanisms to
support peer collaboration. The Implementation chapter describes how these concepts are
implemented in the distributed secure EPG. The results are evaluated and conclusions
are drawn. |
|
Frank Neugebauer, Klimavisualisierung für das Reisebüro der Zukunft, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Bachelor's Thesis)
The current development of travel advisory is beginning to use computer systems in
collaborative environments. One of these projects is SmartTravel, which lacks high
usability and basic information when exploring the travel climate. The extension presented
here, called Klimavisualisierung, was created using User Centered Design and solves these
problems by enabling users to visualize different climatic key data. A SUS evaluation shows
that this add-on increases response effectiveness and efficiency for climate-related questions,
thus laying the foundation for more fine-grained climate advisory in the future. |
|
Peter Höltschi, Impact of collaborative information technology: an explorative analysis in projects of Swiss financial institutions, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2010. (Master's Thesis)
Collaborative information technology (CIT) such as instant messaging and video conferencing
supports collaborative work. A recent international study has shown that the utilization of CIT in
organizations is still low. This thesis examines the impacts of CIT to better understand these
circumstances. For this purpose, a research framework was constructed to gather information
about CIT use and its impacts in an explorative field study. Interviews were conducted with
users and providers of CIT in 10 Swiss financial institutions. The results show that CIT
leads to impacts such as higher effectiveness, quality and time savings, but also higher transparency
and team cohesiveness. Significant differences in the weighting of impacts between users and
providers were found. Implications for further research are drawn. |
|
Roman Liesch, Tracking a robot, 2010. (Other Publication)
This paper gives some insight into a program that was developed for the purpose of
tracking a moving object. A picture showing the object to be tracked is the point
of departure. The SIFT algorithm uses this reference image to find the object in
images that were extracted from a movie. The result of this step are the coordinates of
potential matches, i.e. the locations in the picture where the tracked object may be. This data is then
used to calculate the mean coordinates of the matches. These values are used to
calculate the object's velocity and the distance that the robot has moved. This paper
puts special emphasis on the problems that had to be tackled in order to produce a
working program, and on how these problems were solved. |
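The step from match coordinates to velocity described above can be sketched as follows; this is a minimal illustration, not the thesis implementation, and the function names, sample coordinates and frame rate are hypothetical:

```python
# Estimate position and velocity of a tracked object from SIFT match
# coordinates: average the matches per frame, then differentiate over time.

def mean_position(matches):
    """Average the (x, y) coordinates of all SIFT matches in one frame."""
    xs = [x for x, _ in matches]
    ys = [y for _, y in matches]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def velocity(pos_prev, pos_curr, dt):
    """Pixel displacement per second between two consecutive frames."""
    dx = pos_curr[0] - pos_prev[0]
    dy = pos_curr[1] - pos_prev[1]
    return ((dx ** 2 + dy ** 2) ** 0.5) / dt

# Matches found in two consecutive frames (frame interval 0.04 s, i.e. 25 fps).
frame1 = [(10.0, 20.0), (12.0, 22.0), (14.0, 18.0)]
frame2 = [(16.0, 20.0), (18.0, 22.0), (20.0, 18.0)]

p1 = mean_position(frame1)  # (12.0, 20.0)
p2 = mean_position(frame2)  # (18.0, 20.0)
v = velocity(p1, p2, 0.04)  # 6 px in 0.04 s -> 150.0 px/s
```

Averaging the match coordinates makes the estimate robust against a few spurious SIFT matches, at the cost of being skewed by strong outliers.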
|
Maya Schneebeli, Sensory motor exploration for prosthetic control, 2010. (Other Publication)
Myoelectric control of robotic prostheses is a widely used approach in medical research. In
this work, different algorithms for analyzing the coherence between movement and EMG
signals are used and compared. The goal was to find a method which is capable of
finding coherences not only between a movement and its directly associated EMG, but also
its indirectly associated EMG. This aspect is important for later use in upper limb prosthesis
control.
An important finding of this work is that EMG signals and movements have a linear,
negative or constant coherence, depending on muscle function. This method provides a fast
way to compare signals offline. It turned out that many factors have to be considered
in order to find an appropriate method. A challenge for future work is to find a suitable
method for analyzing data online. If we knew how different movement signals are connected
to each other and to EMG signals, it would be a big step forward towards recognizing and
predicting movements by means of a few EMG sensors. |
|
Denis Sudakov, Elliptische Kurven: von der Theorie zur Anwendung in der Kommunikation, 2010. (Other Publication)
|
|
Claudio Steffen, Quality recovery: an evaluation of static code analysis tools, 2010. (Other Publication)
The quality of an evolving system typically deteriorates as time passes, for example because of
new and unforeseen requirements. RAPS is a tool used by SwissLife AG for complex calculations
of product data. RAPS is written in C, is composed of about 1.5 million lines of code and has
evolved for over ten years. We assume that the quality of RAPS can be improved by identifying
and fixing quality issues. This thesis evaluates three tools, namely Bauhaus Suite, Imagix 4D
and the combination of Sotograph and Sotoarc, that might facilitate the process of recovering the
quality of RAPS. The goal is to evaluate what can be achieved by using the tools and to estimate
which of the tools is best suited to analyze RAPS or similar systems. Special attention is paid to
the role that visualization techniques play in the process of identifying and fixing quality issues. |
|
Karin Bühler, Performance analysis of provenance computation using query rewrite techniques on a commercial database system, 2010. (Other Publication)
The aim of the paper is to compare the performance of provenance computation in Perm
(Provenance extension of the relational Model), a relational provenance management system
operating on the open source DBMS PostgreSQL, with the performance of provenance
computation using the commercial database system DB2. To be able to use DB2 for
provenance computation, the Perm query rewrite techniques are used to generate SQL queries
that compute provenance information. The performance of the systems is evaluated using
TPC-H, a decision support benchmark. The results of the experimental evaluation indicate
that Perm outperforms DB2 on this workload. |
|
Samuel Mezger, Spatial data structures benchmarking framework, 2010. (Other Publication)
This paper documents a `Facharbeit' at the Visualization
and Multimedia Lab at the University of Zürich. The task was to
implement a framework in C++ to benchmark the performance of different spatial
data structures for three-dimensional point data. A focus was kept on a direct
representation of the theory, with clean object orientation, a clear
separation of concerns, and good maintainability and extensibility of the resulting
code.
Data is read from vertices stored in a PLY file and loaded into a
structure as configured by the user. Various manipulations of this data are performed, such as
accessing and moving points or finding neighbours. For benchmarking, the time
these operations take is measured.
This paper begins with a short introduction to the theoretical background of
the selected data structures; then the implementation is described by explaining
the general approach as well as specific problems and their solutions. Finally, some
initial benchmarking results are shown that lead to the conclusion that there
is no single best data structure among those tested (grid, bucket point region
kd-tree and bucket sliding midpoint kd-tree), but the sliding midpoint kd-tree
performs more predictably than the point region version.
For actual use, the data structures would possibly have to be re-implemented,
as their implementations for the benchmark framework are intended for relative
performance comparison only, not for absolute efficiency. |
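The grid among the structures tested above can be illustrated by a minimal sketch; it is written in Python rather than the thesis's C++, and the class and method names are hypothetical, not taken from the framework:

```python
# Minimal uniform bucket grid for 3D points: each point is hashed into an
# integer cell index, and a radius query only scans cells near the query point.
from collections import defaultdict

class BucketGrid:
    def __init__(self, cell_size):
        self.cell = cell_size
        self.buckets = defaultdict(list)  # cell index -> list of points

    def _key(self, p):
        # Map a point to the integer index of its grid cell.
        return tuple(int(c // self.cell) for c in p)

    def insert(self, p):
        self.buckets[self._key(p)].append(p)

    def neighbours(self, p, radius):
        """Return all stored points within `radius` of p, scanning only
        the grid cells that overlap the query sphere."""
        r_cells = int(radius // self.cell) + 1
        kx, ky, kz = self._key(p)
        result = []
        for dx in range(-r_cells, r_cells + 1):
            for dy in range(-r_cells, r_cells + 1):
                for dz in range(-r_cells, r_cells + 1):
                    for q in self.buckets.get((kx + dx, ky + dy, kz + dz), []):
                        if sum((a - b) ** 2 for a, b in zip(p, q)) <= radius ** 2:
                            result.append(q)
        return result

grid = BucketGrid(cell_size=1.0)
for pt in [(0.2, 0.2, 0.2), (0.8, 0.1, 0.3), (5.0, 5.0, 5.0)]:
    grid.insert(pt)
near = grid.neighbours((0.0, 0.0, 0.0), radius=1.0)  # finds the two close points
```

Unlike the kd-tree variants, the grid's query cost depends directly on how evenly the points fill the cells, which is one reason no single structure won across all benchmarks.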
|
Daniel Bisig, Tatsuo Unemi, Cycles - Blending Natural and Artificial Properties in a Generative Artwork, In: Proceedings of the Generative Art Conference, 2010. (Conference or Workshop Paper)
Cycles is an interactive installation that establishes an intimate relationship between the visitor's physical body and simulated organisms. It explores notions of transience and identity that draw inspiration from Buddhist philosophy. Cycles creates a situation that causes the visitor to experience his or her own body in a state of mutability and transience. Cycles merges the appearance of the visitor's hand with a visual representation of a swarm simulation. By bridging the gap between the virtual and physical, a hybrid entity comes into existence whose rapidly changing body blends artificial and natural properties. This hybrid entity progresses through a life cycle that reenacts the four Buddhist sufferings. |
|
Daniel Bisig, Milieux Sonores - Klangliche Milieus. Klang, Raum und Virtualität, 2010. (Book Chapter)
|
|
Lorenz Hilty, Lorenz Erdmann, Scenario Analysis: Exploring the Macroeconomic Impacts of Information and Communication Technologies on Greenhouse Gas Emissions, Journal of Industrial Ecology, Vol. 14 (5), 2010. (Journal Article)
During the past decade, several macroeconomic studies on
the potentials of information and communication technology
(ICT) to reduce greenhouse gas (GHG) emissions have been
published. The mitigation potentials identified in them vary to
a high degree, mainly because they are not consistently defined
and diverse methodologies are applied. The characteristics of
ICT—exceptional dynamics of innovation and diffusion, social
embedment and cross-sector application, diverse and complex
impact patterns—are a challenge for macroeconomic
studies that quantify ICT impacts on GHG emissions.
This article first reviews principal macroeconomic studies
on ICT and GHG emissions. In the second part, we reconsider
our own study on this topic and present an in-depth
scenario analysis of the future impacts of ICT applications on
GHG emissions. We conclude that forthcoming macroeconomic
studies could strengthen the state of the art in environmental
ICT impact modeling (1) by accounting for the
dynamics of new ICT applications and their first-, second-,
and third-order effects on a global scale, (2) by reflecting the
error margins resulting from data uncertainty in the final results,
and (3) by using scenario techniques to explore future
uncertainty and its impacts on the results. |
|