Stefan Berner, Stefan Joos, Martin Glinz, Entwicklungsrichtlinien für die Programmiersprache JAVA, Informatik: Zeitschrift der schweizerischen Informatikorganisationen, Vol. 4 (3), 1997. (Journal Article)
Development guidelines are a means of passing on experience gained during software development. They help developers understand existing program code and avoid errors in code yet to be written. Applied consistently and judiciously, they improve programming style and the readability of program code, and thus also contribute to better maintainability of software.
This article discusses what development guidelines for the Java programming language can usefully look like, what they should contain, how they are structured, and how they are applied and kept up to date.
Julian Richardson, Norbert E. Fuchs, Development of Correct Transformation Schemata for Prolog Programs, In: Proceedings of the Seventh International Workshop on Logic Program Synthesis and Transformation LOPSTR '97, Springer, 1997. (Conference or Workshop Paper published in Proceedings)
Daniel Christoph Meier, Generalization and Constraints in Learning Machines, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 1997. (Dissertation)
Computers work in a very precise, tireless, and fast manner. Nevertheless, mechanisms by which computers could learn by themselves and perform very simple everyday tasks are virtually unknown. Such mechanisms, which allow machines to learn from individual experiences and relate them adequately to other similar situations, involve the key issue of how to generalize from past experiences to deal with future situations. Hence, the goal of this thesis is to investigate the laws governing reliability in learning.
This document describes a constructive approach to building learning machines for real-world applications, which tries to solve many well-studied issues and problems originating from mathematics and real-world computing. The concept presented is that there are learning machines which have generally valid structural constraints incorporated in their architecture in order to achieve satisfactory performance on challenging real-world tasks. Such structural constraints are given and theoretically evaluated by the constructive and consistent generalization theory developed by Vapnik and Chervonenkis (Vapnik 1995b). Based on this theory, the factors governing generalization are further utilized to classify and compare learning machines that exploit invariants. Among the various learning machines discussed, these include learning from hints (Abu-Mostafa 1994), estimating the tangent of invariances by "Tangent Prop" (Simard and LeCun 1994), the group-invariance theorem (Minsky 1961), and the hyperplane point invariance theorem (Nilsson 1965). This comparison is conducted from the perspective of including generally valid geometric restrictions, i.e., geometrical structures learned by the learning machine are adjusted such that they comply with the sensed data.
Then two new neural network paradigms are described:
1. Structurally Constrained Resource-Allocating Neural Networks (SCRAN), and
2. Topological Growing Structures (TGS).
Both methods incrementally allocate new neurons and regulate their structure to comply with the sensed or measured data sets. Both rely on geometric constructs founded on sample vectors. The first kind of network belongs to the class of supervised architectures, whereas the second belongs to the class of unsupervised ones.
The first algorithm is based on the assumptions of continuity and monotony, which apply to samples of one category belonging to the same partition. For such a partition, the monotony-continuity constraint is defined such that all points between the center of gravity of a partition and its samples belong to the same partition. This geometric construct is generally concave. It is the base construct of the learning algorithm, which is adjusted by an empirical risk minimization scheme. Furthermore, the complexity of this algorithm is studied with the above-mentioned generalization theory. This analysis and other benchmark problems corroborate the superior generalization power of SCRANs.
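The monotony-continuity constraint described above can be sketched as a simple membership test: a point is assigned to a partition if it lies near some segment between the partition's center of gravity and one of its samples. The following Python sketch is illustrative only; the function name, the tolerance parameter, and the toy data are assumptions, not part of the dissertation.

```python
import math

def star_region_membership(point, samples, tol=0.3):
    # Monotony-continuity region (illustrative): a point belongs to the
    # partition if it lies within `tol` of some segment between the
    # partition's center of gravity and one of its samples. The union of
    # these segments forms a star-shaped, generally non-convex region.
    dim = len(point)
    c = [sum(s[i] for s in samples) / len(samples) for i in range(dim)]  # center of gravity
    for s in samples:
        d = [s[i] - c[i] for i in range(dim)]
        denom = sum(di * di for di in d)
        if denom == 0:
            continue
        t = sum((point[i] - c[i]) * d[i] for i in range(dim)) / denom
        t = max(0.0, min(1.0, t))                        # clamp projection to the segment
        closest = [c[i] + t * d[i] for i in range(dim)]
        if math.dist(point, closest) <= tol:
            return True
    return False

pts = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
print(star_region_membership((0.3, 0.3), pts))  # on the centroid-origin segment -> True
print(star_region_membership((5.0, 5.0), pts))  # far outside the region -> False
```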
The second algorithm, which implements learning by constraints, constructs a topological projection of the data onto its inherent minimized subspace by a self-organizing unsupervised learning scheme. The projection, motivated by Kohonen's self-organizing feature maps, is constructed by adapting the number of neurons, their weights spanning the subspace, as well as their interconnection topology. Thus, the tessellation is constructed such that the neuronal weights are placed only in parts of the state space that are actually visited by the problem studied. Another important property of the mapping is that it is distribution preserving.
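In the same spirit, incremental neuron allocation can be illustrated with a minimal growing vector quantizer: a new neuron is spawned wherever no existing neuron lies within a radius of a sample, so codebook vectors end up only in regions the data actually visits. This is a simplified stand-in, not the dissertation's TGS algorithm; all names, parameters, and data are assumptions.

```python
import math
import random

def grow_codebook(data, spawn_radius=0.5, lr=0.2, epochs=3):
    # Unsupervised incremental allocation (illustrative): if the nearest
    # neuron is farther than spawn_radius from a sample, a new neuron is
    # allocated at the sample; otherwise the winner moves toward it.
    neurons = [list(data[0])]
    for _ in range(epochs):
        for x in data:
            winner = min(neurons, key=lambda w: math.dist(w, x))
            if math.dist(winner, x) > spawn_radius:
                neurons.append(list(x))            # allocate a neuron at the sample
            else:
                for i in range(len(x)):            # move winner toward the sample
                    winner[i] += lr * (x[i] - winner[i])
    return neurons

random.seed(0)
cluster_a = [(random.gauss(0, 0.1), random.gauss(0, 0.1)) for _ in range(30)]
cluster_b = [(random.gauss(3, 0.1), random.gauss(3, 0.1)) for _ in range(30)]
codebook = grow_codebook(cluster_a + cluster_b)
print(len(codebook))  # neurons appear only where the data lives
```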
These methods are applied to tasks such as credit worthiness analysis, differentiation between mines and rocks, empirical effective VC-dimension estimation, and phoneme analysis.
To conclude, our two network paradigms, namely SCRANs and TGSs, are compared to the others mentioned above by applying the presented generalization scheme and the principle of topological learning by structural constraints. Then the various issues and problems are reevaluated from the perspective of the learning machine, thereby adapting its structural constraints and architecture to the contemplated problem domain. This important principle is called Topological Induction, and it is the key principle for learning machines in the real world.
Mathias Richter, VH-Graphs - a new approach to hierarchical graphs and their application to object-oriented programming, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 1997. (Dissertation)
Martin Glinz, The Teacher: "Concepts!" The Student: "Tools!" - On the Number and Importance of Concepts, Methods, and Tools to be Taught in Software Engineering Education, In: Proceedings Third International Workshop on Software Engineering Education, 1996. (Conference or Workshop Paper published in Proceedings)
The paper discusses the number and importance of concepts, methods, and tools that should be taught in software engineering education. It is shown that the emphasis is quite different from a teacher's and from a student's viewpoint. As a synthesis of both viewpoints, I propose an approach that separates concepts and methods from tools in the curriculum and treats both with proper emphasis. The software engineering curriculum of the Department of Computer Science of the University of Zurich, which follows these ideas, is sketched.
Abraham Bernstein, Christian P. Schucan, Document and Process Transformation During the Product Life-Cycle, In: International Working Conference on Information and Process Integration in Enterprises - Rethinking Documents (IPIC), 1996. (Conference or Workshop Paper)
Based on our experiences in the corporate banking department of the Union Bank of Switzerland, we are convinced that business, IT, and organizational aspects have to be considered in an integrated way when developing IT strategies. IT strategies are crucial for effective (business) development because they identify the constant and the changing parts of an IT infrastructure during the product life cycle. In order to achieve this, we state three design invariants: the deep structure of the process, the dependencies within the process, and the information handled. We believe that identifying these invariants will lead to a deeper understanding of product life cycles.
Gerhard Schwabe, Helmut Krcmar, Parallele Redezeit, Business Computing, Vol. 8, 1996. (Journal Article)
As group work in companies intensifies, the number of meetings grows as well. Computer support for them is still rare.
Gerhard Schwabe, Die Rolle neuer Informations- und Kommunikationstechnologie für die Bürgerinformation, IM: die Fachzeitschrift für Information Management & Consulting, Vol. 11 (2), 1996. (Journal Article)
This article gives an overview of the use and potential uses of information and communication technology for citizen information. First, the current state of use is presented. Then several innovative applications for citizen information are introduced. From these examples and further studies, influencing factors and a future trend for citizen information systems in Germany are derived. A few thoughts on opportunities for the information industry conclude the article.
Gerhard Schwabe, Helmut Krcmar, CSCW Werkzeuge, Wirtschaftsinformatik, Vol. 38 (2), 1996. (Journal Article)
This article gives an overview of CSCW tools and their design ideas. The basic functions of CSCW tools, namely "shared material", "bridging space", "bridging time", and "new forms of work", are introduced.
Then text editors, drawing tools, software design tools, voting tools, idea landscapes, group support systems, shared-screen and shared-window tools, tools for automatically logging interactions, tools for telepresence, and CSCW infrastructures are described. Before the outlook, several tools for asynchronous collaboration are discussed.
Helmut Krcmar, Henrik Lewe, Gerhard Schwabe, Herausforderung Telekooperation, Springer, Berlin / Heidelberg, Deutschland, 1996. (Book/Research Monograph)
Norbert E. Fuchs, Rolf Schwitter, Attempto Controlled English (ACE), In: CLAW 96, First International Workshop on Controlled Language Applications, 1996. (Conference or Workshop Paper)
Denis N. Antonioli, Stefan Berner, Clemens H. Cap, Markus Pilz, Lutz H. Richter, Mathias Richter, Takashi Suezawa, Ein Glossar wichtiger Begriffe zur Java-Technologie, No. IFI-2011.0012, Version: 1, 1996. (Technical Report)
Thomas Fromherz, Shape from Multiple Cues for 3D-Enhanced Face Recognition : A Contribution to Error Reduction by Employing Inexpensive 3D Reconstruction Methods, University of Zurich, Faculty of Business, Economics and Informatics, 1996. (Dissertation)
Krystyna W. Ohnesorge, Modelle für die verlustfreie Bilddatenkomprimierung, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 1996. (Dissertation)
Thomas Billeter, IT-OUTSOURCING: Marktwirtschaftliche Ansätze zur Bereitstellung der IT-Infrastruktur in Unternehmungen, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 1995. (Dissertation)
Brigitte Baldi, Werner Brettreich-Teichmann, Karin Gräslund, Rainer Hofmann, Dirk Hoyer, Peter Konrad, Helmut Krcmar, Joachim Niemeier, Gerhard Schwabe, Dietrich Seibt, Das Projekt BTÖV, In: Deutscher Multimedia Kongress '95, Springer, Auffahrt zum Informations Highway, Springer 1995, 1995-06-11. (Conference or Workshop Paper)
Norbert E. Fuchs, Rolf Schwitter, Specifying Logic Programs in Controlled Natural Language, In: CLNLP 95, Workshop on Computational Logic for Natural Language Processing, 1995. (Conference or Workshop Paper published in Proceedings)
Martin Glinz, Anforderungen: Wie sag ich's und wie schreib ich's auf, unizürich, Magazin der Universität Zürich (1), 1995. (Journal Article)
Martin Volk, Einsatz einer Testsatzsammlung im Grammar Engineering, Universität Koblenz, 1995. (Dissertation)
Natural language systems (from grammar checkers to machine translation) comprise complex formal grammars. Because of this complexity, building them requires an engineering approach, known as "grammar engineering". A fundamental resource in the grammar engineering process is a test sentence collection: a systematic collection of sentences of the language in which each sentence exemplifies a distinct grammatical problem. Such a collection can support the development of formal grammars in many ways. It is shown how incremental grammar testing can be organized with the help of a test sentence collection. The presentation of a grammar test environment implemented in Prolog demonstrates practical feasibility.
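The incremental testing workflow can be sketched in a few lines: each test sentence is paired with its expected grammaticality judgement, and the whole collection is re-run after every grammar change to catch regressions. The sketch below uses Python with a toy stand-in parser; the environment described in the thesis is implemented in Prolog, and all names here are illustrative assumptions.

```python
def run_test_suite(parse, suite):
    # Incremental grammar testing (illustrative): re-run every test
    # sentence and collect those whose parse result differs from the
    # expected grammaticality judgement. `parse` is a hypothetical
    # callable returning True when the grammar accepts the sentence.
    return [(sentence, expected)
            for sentence, expected in suite
            if parse(sentence) != expected]

# Toy stand-in grammar: accepts only sentences that end in a period.
toy_parse = lambda s: s.endswith(".")
suite = [
    ("The cat sleeps.", True),    # grammatical, should parse
    ("Cat the sleeps", False),    # ungrammatical, should be rejected
]
print(run_test_suite(toy_parse, suite))  # [] -> no regressions
```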
Abraham Bernstein, Chrysanthos Dellarocas, Thomas W. Malone, John Quimby, Software Tools for a Process Handbook, IEEE-Data Engineering, Vol. 18 (1), 1995. (Journal Article)
This paper provides a progress report on the development of software tools in the Process Handbook project currently underway at the MIT Center for Coordination Science. We begin with a brief overview of the project as a whole. Then we focus on software tools, emphasizing aspects that relate to workflow control. Finally, we conclude with a brief description of future avenues of research. The Process Handbook tools help (a) redesign existing organizational processes, (b) invent new organizational processes that take advantage of information technology, and finally (c) automatically generate software to support organizational processes. An important related goal is the ability to (d) import and export process descriptions from and to other process modeling architectures. The approach combines in a novel way the ideas of process decomposition, process specialization, and the coordination of dependencies between activities. The paper presents an overview of findings from multiple implementations of this approach.
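The combination of decomposition and specialization mentioned above can be illustrated with a minimal data structure: each process holds its subactivities (decomposition) and its more specific variants (specialization). This is an illustrative sketch under assumed names, not the Handbook's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Process:
    # Illustrative process entry: `subactivities` models decomposition
    # (what steps the process consists of), `specializations` models
    # specialization (more specific variants of a generic process).
    name: str
    subactivities: list = field(default_factory=list)
    specializations: list = field(default_factory=list)

sell = Process("Sell product")
sell.subactivities = [Process("Identify customer"), Process("Deliver product")]
sell.specializations = [Process("Sell by mail order"), Process("Sell in store")]
print(len(sell.subactivities), len(sell.specializations))  # 2 2
```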