Joachim Kreutzberg, Qualitätsmanagement auf dem Prüfstand : Analyse des Qualitätsmanagements von Informationssystemen, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2000. (Dissertation)
|
|
André Meyer, A rapid application development framework for distributed mobile multi-media: a mobile multi-media architecture for the virtual workplace, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2000. (Dissertation)
The trends towards increased mobility and global business connections force workers and managers alike to communicate faster using multiple new types of media. At the same time, the means of communication must become easier to use and more secure, because the statically localized workplace of today will dissolve in the near future into a distributed and highly mobile multi-media communication platform. As the need for - or the freedom of - mobility as part of changing lifestyles becomes more and more commonplace, new tools are required that support these new forms of life and work, also called Virtual Workplaces. A virtual workplace is a distributed wireless multi-media system that is aimed at mobile individuals and supports the organization of people working together in groups on a number of projects. The workplace is a virtual one because it provides for mobility in two senses: the mobile individuals and project members may be working at any place using a mobile tool, and the individual project members may be distributed all over the world. Hence, there is no necessity for a permanent physical workplace that is owned or shared by any group of individuals. Furthermore, each individual member may be part of a large number of projects, cooperating with sets of different people. The current trends in the working field - namely, the specialization and globalization of competence - reinforce this new kind of work paradigm. The focus on the workplace is chosen here as an example for the use of the new means of mobility in general. The resulting techniques may be adapted for a wide range of application domains where communication between mobile people and mobile information access play the central roles. 
The virtual workplace is designed to provide mobile members of distributed work groups with a multi-media platform that supports them in communicating with each other and in retrieving and editing documents and information collaboratively wherever and whenever they need it. The individual project members are supported by a set of user interface, communication, and information retrieval agents. These mobile agents act for the user in the background in order to keep the user undistracted from his work. The result of this thesis is the conceptualization and implementation of a Distributed Rapid Application Development Framework for the creation of Mobile Multi-Media applications that work on numerous current and future hardware devices. The virtual workplace is an example of such an application. The usability of mobile communication and information devices is facilitated by novel paradigms for computer-human interaction, such as pen-based user interfaces that mimic the behavior and ease of use of natural paper, and speech recognition. In the combined employment of sophisticated mobile and agent technologies, virtual workplace scenarios leap far beyond the current state of research in the so-called field of Computer Supported Cooperative Work. The enormous potential of the framework architecture of virtual workplaces stems from new paradigms that are being developed in contrast to the mere extension and retro-fitting of old ones. |
|
Markus Kradolfer, A workflow metamodel supporting dynamic, reuse-based model evolution, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2000. (Dissertation)
Workflow management has received great attention in recent years since it is a key technology for the implementation and automation of business processes. The basic idea in workflow management is to capture formal descriptions of business processes and to support the automatic enactment of the processes based on these formal descriptions.
The focus of this thesis is on the development of a workflow metamodel that supports dynamic workflow model evolution and the reuse of workflow types. The metamodel comprises concepts to capture functional/structural, informational, behavioral, and organizational aspects of workflows. Furthermore, the metamodel includes explicit correctness criteria for the workflow model (i.e., the workflow types defined at a certain point in time) as well as for workflow instances.
Based on the workflow metamodel the problem of model evolution is investigated. A workflow model cannot be assumed to be unchanged during long periods of time. Rather, the workflow model of a workflow management system, similar to the schema of a database system, has to be adapted to its changing environment, reflecting, e.g., new customer requirements and re-engineered business processes. Therefore, workflow model evolution, i.e., the modification of the workflow model over time, should be supported. Furthermore, since workflows may be of long duration and should not have to be aborted in case of model evolution, it should be possible to modify the workflow model in the presence of workflow instances. Thus, dynamic model evolution should be supported. In the approach proposed in this thesis, in contrast to most existing approaches, workflow types are not updated in place, but they are versioned. Whenever a workflow type has to be modified, a new version of the type is derived. Workflow type versioning has the advantage that workflow instances that are in accordance with the new version can be migrated to the new version, whereas the other workflow instances remain associated with the existing version. To efficiently determine whether a workflow can actually be migrated to a target version, an approach is proposed that takes into account the operations by which the versions have been derived from each other. To modify the workflow model, a set of modification operations is provided. The set includes operations to add and delete workflow types and versions, operations to change the state of versions, and operations to change the interface as well as the body of versions.
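The versioning idea in this paragraph can be sketched in a few lines of Python. This is a minimal illustration under invented names (`WorkflowTypeVersion`, `derive_version`, `migrate`), not the thesis's actual metamodel: a modification derives a new type version, and a running instance migrates only if it conforms to the target version; otherwise it stays associated with its existing version.

```python
from dataclasses import dataclass, field

@dataclass
class WorkflowTypeVersion:
    type_name: str
    version: int
    activities: frozenset  # the version's body, reduced here to a set of activity names

@dataclass
class WorkflowInstance:
    instance_id: str
    version: WorkflowTypeVersion
    completed: set = field(default_factory=set)  # activities already executed

def derive_version(old, add=(), delete=()):
    """Derive a new version instead of updating the old one in place."""
    activities = (old.activities - frozenset(delete)) | frozenset(add)
    return WorkflowTypeVersion(old.type_name, old.version + 1, activities)

def can_migrate(inst, target):
    """Crude conformance test: every activity the instance has already
    executed must still exist in the target version."""
    return inst.completed <= target.activities

def migrate(inst, target):
    """Migrate a conforming instance; leave the others with their old version."""
    if can_migrate(inst, target):
        inst.version = target
        return True
    return False

v1 = WorkflowTypeVersion("order", 1, frozenset({"receive", "check", "ship"}))
v2 = derive_version(v1, add={"invoice"}, delete={"check"})
inst = WorkflowInstance("i-17", v1, completed={"check"})
print(migrate(inst, v2))  # False: the instance already executed a deleted activity
```

A real system would compare version bodies structurally; here the derivation operations themselves (`add`, `delete`) determine what a conformance check must inspect, mirroring the operation-based migration test proposed in the thesis.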
Besides workflow model evolution, the issue of workflow type reuse is addressed. The different phases of the workflow type reuse process are discussed and a workflow type development process is proposed, which, in contrast to existing approaches, poses a special emphasis on the reuse of existing workflow types. In order to better support the finding of workflow types, a faceted classification scheme is used. Furthermore, the information contained in the model and in the workflow execution history is considered, since adequate information about workflow types should be available to the workflow modeler during the reuse process in order to understand and evaluate workflow types.
Finally, a prototype has been implemented as a proof of concept. |
|
Christian Brauchle, Qualitätscontrolling für Informationssysteme : Ein prozessorientierter Ansatz zur Verbesserung des Informatikcontrolling am Beispiel einer Universalbank, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2000. (Dissertation)
Information systems are used not only within companies but also from outside the company, e.g. for electronic banking or electronic commerce. In addition to the quantitative requirements, these systems also have to satisfy the qualitative needs of internal users and the company's external customers. Thus, the quality of a company's information systems is an increasingly important success factor. Quality management is necessary for ensuring the success of corporate information systems. In order to permit company-wide coordination of quality management, a control is needed which provides the relevant feedback - a quality control. Quality control for information systems concerns all quality-relevant activities of a Management Information Systems (MIS) department and co-ordinates these activities with the aim of securing the required quality of information systems effectively and efficiently.
Starting with a detailed analysis of the current state of MIS, quality management, and control of information systems, this thesis systematically develops a framework for the quality control of information systems. This framework provides better co-ordination of the quality management for information systems throughout the whole company, as well as an increase in performance for the control of information systems. Quality control provides the specific instruments needed for the planning, implementation and operational processing of information systems, as well as information about the effectiveness and efficiency of the quality management activities, e.g. the price cost performance ratio or process reengineering. Few of these instruments are currently in use; their future application should operate with computer-based systems, as shown in this thesis by a prototype system.
The data that are currently available for the control of information systems are not sufficient for effective quality control. In order to analyse the functions, data, and organisational implementation of quality control, a specific architecture for information systems is used. With this architecture, processes can be clearly structured and modelled. The modelling of the target processes of quality control for information systems was carried out at a bank, as the banking sector works almost exclusively with computer data and therefore their information systems have to satisfy a high level of requirements. Functionality and the quality of information systems are highly valued in this sector, and thus have stringent requirements for quality control of information systems.
The results of a case study at a large bank show that quality control of information systems is already covered by some activities of the MIS department. Furthermore, recommendations are given for the organisation and the processing of quality control at a bank. This quality control has to provide the MIS department with cross-processing and cross-hierarchical information on the information systems' quality. The preconditions for processing this information are:
* Communication of the requirements of external customers and internal users of information systems to all MIS departments throughout the company.
* Implementation of a unique, process-wide rating system.
* Implementation and periodical revision of standardised documentation.
* The use of continuous control mechanisms.
The requirement that quality control of information systems also provide data about the effectiveness and efficiency of quality management can be fulfilled by quality cost accounting based on activity-based costing. If predefined processes are given, the cost of quality and the value of the quality achieved can be calculated easily and automatically. Key figures on quality that are sometimes used have to be further evaluated; a homogeneous system of key figures for information systems quality has to be developed, e.g. to enable cross-functional performance comparison. The use of a system based on key financial figures is suggested. Also, the use of a balance sheet for quality is especially recommended if the cost and value of quality are to be completely calculated and aggregated. This kind of balance sheet provides a quantitative result for the quality management of information systems. Emphasis must be placed on a simple and understandable implementation of such cost accounting, key figure and balance sheet systems.
For companies, regardless of the sector they are in, the result of this thesis is that they all have to implement the various elements of quality control for information systems. Differences are seen in the size of the companies: for large ones it is important to implement nearly all the processes shown in this thesis; smaller companies also have to rely on quality control for information systems, but for economic reasons they are advised to implement a selection of these processes, or perhaps individual functions.
Further research based on this thesis can be seen in defining an implementation framework for quality control in companies, developing an integrated computer-based information system for quality control, or analysing common aspects between quality control and risk management. Any further research has to focus on the influence that economic, technical, social, and political changes have on information systems, as well as on a greater fulfillment of requirements for information systems. |
|
Mario Crameri, Effiziente Verrechnung von Kleinsttransaktionen im Internet Commerce, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2000. (Dissertation)
|
|
Stefan Joos, Adora-L - Eine Modellierungssprache zur Spezifikation von Software-Anforderungen, Universität Zürich, Institut für Informatik, Wirtschaftswissenschaftliche Fakultät, 2000. (Dissertation)
The scope of this work is the development of a specification language (Adora-L) intended to describe software requirements and architecture in a single object-oriented framework. This work is motivated in two ways: first, by the severe weaknesses of existing methods in terms of system decomposition; second, by general ideas about specifications such as object-orientation and the usage of hierarchical models. The general goal is to get a comprehensive specification which describes requirements and architecture in an understandable, clear and structured way - even for large-scale specifications. As already mentioned, the basic idea of the specification language Adora-L is to model the aspects of data, functionality and behaviour in a single hierarchical object framework. Modeling is based on objects (so-called abstract objects) instead of classes. Thus, we resolve modeling anomalies that occur in class models. Additionally, modeling with abstract objects is easier, more understandable and more precise than modeling with classes. Whole-part hierarchies are a key feature of Adora-L. Systems are decomposed into objects, which are components of other first-class objects with full object semantics. All aspect descriptions (like descriptions of behaviour, structure or functions) use this primary structure. All aspects are integrated and represented in this single integrated structure. In particular, the behaviour description is based on the statechart mechanism (Harel87) and therefore supports integrated behaviour modeling.
Providing powerful abstraction mechanisms is crucial for managing and understanding especially large-scale specifications. System decomposition through whole-part hierarchies has proven to be a convenient and powerful abstraction mechanism. It allows for the description of aspects like system structure or behaviour on different levels of abstraction. The usage of abstractions is a fundamental precondition to manage complex problem descriptions. Primarily, Adora-L is a graphical language: a graphical notation is used to represent the basic structure of a system, while descriptions on a detailed level are represented textually. Another key feature of Adora-L is the ability to model requirements with a variable degree of formalism. This enables the developer to adjust the description of requirements to cost and risk factors. Thus, it is up to the developer to model different aspects or parts of the system with an arbitrary degree of detail. |
|
Marcus Holthaus, Management der Informationssicherheit in Unternehmen, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2000. (Dissertation)
An Information-Oriented Examination of the Reference Object, the Subjects, Institutions, Instruments and Processes of Information Security.
Implementing information security in a business environment is complex and time-consuming. Many forces and influences contribute to the positive outcome of an information security project. Therefore, it is an advantage to know as many of them as early as possible. These influences are examined and formalised in this dissertation, arranged into a framework, and integrated into a process.
There is one paramount prerequisite for successfully managing information security: getting top management commitment to implement information security and to free the resources needed to do so. Initially, this requires management to agree on an appropriate, uniquely formulated information security goal. Actual information security related goals can then be deduced from the general business policy. Business environment developments must be considered as well as internal requirements. An information security policy - informal in the beginning - begins to develop. This policy must be communicated within the enterprise as early and as consistently as possible, and management must lead by example in living it.
Based on this policy, an enterprise-wide organisation must be established. It is needed to initiate and to support the execution of a corresponding information security project within the enterprise. This organisation, which can be led by an Information Security Delegate, has the following tasks:
* To formulate, to formalise and to spread the information security policy,
* to split the overall project into individual, manageable parts and subjects, e.g. in accordance with department boundaries,
* to promote the co-operation of information security activities between existing institutions inside the enterprise and their integration into business processes,
* to supply strategic and tactical methodologies and procedures to implement information security within the individual departments,
* to co-ordinate the information security activities, most importantly when problems must be solved at a level superior to an individual project,
* to take care of, to provide advice for and to promote the individual information security projects,
* to collect know-how and to pass it on,
* to supply tools for information security administration, awareness promotion etc., and
* to check on implementation and results.
Furthermore, a role model has to be defined. It should describe the information security responsibilities, functions and authorities of each individual person in the enterprise. This model must be formulated in such a way that each person fits into at least one of the roles.
In the proposal formulated in this dissertation, it is the duty of one person per department to co-ordinate local implementation of information security (this role is called Information Security Co-ordinator). This person must guide a process covering the following activities:
* To set the boundaries of the investigation target, concerning width and depth. Setting the width boundary is done by the express inclusion or exclusion of parts of the object investigated. Setting the depth boundary is done by limiting the investigation to specified kinds of objects (information, hardware, software, co-workers etc.) and by restricting the requirements to be considered (availability, confidentiality, obligation etc.),
* to identify and to administer protection objects, possibly collecting additional characteristics (requirements on the individual objects, important risks, object values etc.),
* to carry out a general risk analysis to identify major risks which the protection objects are subject to,
* to carry out specific risk analyses for the more exact consideration of important risks,
* to select measures to reduce the identified risks,
* to discuss and to promote decisions on measures to reduce the identified risks, considering costs and setting deadlines,
* to co-ordinate the implementation of measures, and
* to check on implementation, done by the person in charge, by those affected, by the Information Security Delegate and by other parties.
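The identification and ranking steps in the list above can be illustrated as a toy risk register. All object names, risks, and the 1-5 scales below are invented for illustration and are not part of the dissertation's method:

```python
# Protection objects with an assumed value rating and their security requirements.
protection_objects = {
    "customer-db": {"value": 9, "requirements": ["confidentiality", "availability"]},
    "mail-server": {"value": 5, "requirements": ["availability"]},
}

# General risk analysis: likelihood and impact on an assumed 1-5 scale.
risks = [
    {"object": "customer-db", "risk": "unauthorised access", "likelihood": 2, "impact": 5},
    {"object": "customer-db", "risk": "data loss",           "likelihood": 1, "impact": 4},
    {"object": "mail-server", "risk": "outage",              "likelihood": 3, "impact": 3},
]

def exposure(r):
    # Weight the raw risk score by the value of the protection object.
    return r["likelihood"] * r["impact"] * protection_objects[r["object"]]["value"]

# Rank risks; the top entries are candidates for specific risk analyses
# and for the selection of measures.
for r in sorted(risks, key=exposure, reverse=True):
    print(f'{r["object"]:12} {r["risk"]:20} exposure={exposure(r)}')
```

The ranked output tells the Information Security Co-ordinator where a specific risk analysis and concrete measures are most urgently needed.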
These components (goal, organisation, role model, process) make up the Information Security Management Framework, which is presented in this dissertation. A short description can be found in chapter 1. A broad view is presented at the beginning of chapter 3, and it is described in detail in chapters 3 to 6. The framework looks at information security as a management function. It is deduced from the approaches of Rühli (Rühli85) and Heinrich (Heinrich93). Parts I and II of this dissertation are structured according to these approaches:
* The foundation (chapter 2) identifies and defines terms and general information security goals and concepts,
* the reference object (chapter 3) defines which part of the enterprise must be selected for information security and how it must be split into parts,
* the elements of information security (chapter 4) cover institutions, motivations, specific goals and instruments,
* the activities of information security (chapter 5) describe the various information security subjects which can be applied to the parts of the reference model,
* the information security process (chapter 6) describes the procedure in four cycles with five phases each.
A framework like this has not been described yet in any known approach. The existing procedures, which have mostly proven useful in practice, are nevertheless subjected to a detailed analysis in order to identify their strengths and weaknesses. It is deduced from this analysis how a new procedure must be constituted if it were to combine the strengths and avoid the weaknesses.
This new procedure, called "ISIWAY 4", covers the four framework components and is defined step by step in chapter 8. ISIWAY 4 is the first of two ways in which the framework is put into concrete form. It defines the procedure to reach information security in four cycles (hence the 4 in ISIWAY 4), called Minimal Information Security, Appropriate Information Security, Risk-Related Information Security and Comprehensive Information Security. ISIWAY 4 will be subjected to the same detailed analysis as the existing procedures mentioned before, in order to explain how the requirements are fulfilled.
The requirements of the new procedure are divided into the groups Initiation, Organisation, Implementation and Content. ISIWAY 4 has been designed to be adaptable and scalable. Another required property of ISIWAY is ease of use. In order to achieve this, the procedure must not be too complex. The full ISIWAY 4 procedure is complex, so a simpler version was designed, which can be applied in smaller projects, or which may serve as an introduction to the information security process and should be easier to learn. ISIWAY 1.5 is such a simplification. It is introduced in chapter 10 and is the second way described here to put the Information Security Framework into concrete form. Additionally, this procedure is fully supported by a tool named ISIGO 1.5. This tool is presented in chapter 9 and supports the execution of each step of the ISIWAY 1.5 procedure. In addition, it considers some items of the ISIWAY 4 procedure presented in chapter 8 and is based on structures of an overall data model, which is presented in chapter 9 under the title of ISIGO CENTRAL.
Thus, this dissertation offers a structured, broadly supported and detailed analysis of the information security problem field, defines an information security management framework as a general solution, contains two procedures named ISIWAY 4 and ISIWAY 1.5 which differ in complexity, and it supplies a corresponding tool named ISIGO 1.5.
Therefore, all components necessary to implement information security efficiently and effectively in a business environment have been presented in this dissertation. In particular, the framework developed here (part II) can be used as a basis for further work in the information security management field. |
|
Anca Vaduva, Rule development for active database systems, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 2000. (Dissertation)
Active database management systems promise to provide an effective integration of database concepts with the rule paradigm. Their strength resides in the centralized representation of real-world semantics in the form of rules instead of hiding and replicating them in application programs. However, despite their incontestable advantages, active database management systems are not widely used in practice. One of the reasons is the lack of support for application development. This thesis analyzes specific support needs and proposes solutions, finally materialized as tools, for assisting the process of active application development. First, we provide a comprehensive overview of the life-cycle of active applications, focussing on the development of rules. Among the considered phases, we stress rule verification and validation, which have to cope with critical problems that are typical for rules, such as rule conflicts. In this context, we present a novel approach for termination analysis that significantly improves the accuracy of existing methods. By considering composite events, more precise results can be achieved for avoiding nontermination of rule execution. The presented solution is essential for the termination analysis of expressive rule languages, as provided by many advanced active DBMS. Another contribution of this thesis is in the area of rule testing. We present a new approach for dealing with rule-specific problems that have not been addressed until now. In particular, our work focuses on determining the existence of defects caused by conflicts and dependencies between rules. Finally, we introduce and evaluate a set of tools to assist application developers during their work. The toolset provides graphical interfaces supporting both static activities, such as rule editing, browsing, and termination analysis, and dynamic activities, such as testing and debugging. 
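Termination analysis of this kind is commonly based on a triggering graph: rules are nodes, and an edge r1 -> r2 means an action of r1 can raise an event that triggers r2; a cycle signals possible non-termination. The sketch below, with invented rule names and without the composite-event refinement that is the thesis's actual contribution, shows the basic cycle check:

```python
def may_not_terminate(triggering_edges):
    """Return True if the triggering graph contains a cycle (depth-first search)."""
    graph = {}
    for src, dst in triggering_edges:
        graph.setdefault(src, []).append(dst)
        graph.setdefault(dst, [])

    WHITE, GREY, BLACK = 0, 1, 2  # unvisited / on current path / finished
    colour = {node: WHITE for node in graph}

    def visit(node):
        colour[node] = GREY
        for succ in graph[node]:
            if colour[succ] == GREY:          # back edge: cycle found
                return True
            if colour[succ] == WHITE and visit(succ):
                return True
        colour[node] = BLACK
        return False

    return any(colour[n] == WHITE and visit(n) for n in graph)

# r_discount updates prices, which triggers r_audit; r_audit triggers nothing.
print(may_not_terminate([("r_discount", "r_audit")]))       # False
print(may_not_terminate([("r_a", "r_b"), ("r_b", "r_a")]))  # True
```

A purely syntactic check like this over-approximates: it may flag rule sets that in fact terminate, which is exactly the imprecision that analyses taking composite events into account can reduce.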
Static tools are used during the specification and design of active database systems, i.e., before the execution of applications. Dynamic tools assist the application developer at runtime, when the active database system is operational and rules are processed. |
|
Norbert E. Fuchs, Uta Schwertel, Sunna Torge, A Natural Language Front-End to Model Generation, Journal of Language and Computation, Vol. 1 (2), 2000. (Journal Article)
|
|
Martin Glinz, A Lightweight Approach to Consistency of Scenarios and Class Models, In: Proceedings of the Fourth IEEE International Conference on Requirements Engineering, 2000. (Conference or Workshop Paper)
|
|
Martin Glinz, Nancy Schett, Preliminary Validation of a Lightweight Approach to Consistency of Scenarios and Class Models, No. IFI-2011.0004, Version: 1, 2000. (Technical Report)
|
|
Johannes Ryser, Martin Glinz, SCENT - A Method Employing Scenarios to Systematically Derive Test Cases for System Test, No. IFI-2011.0005, Version: 1, 2000. (Technical Report)
|
|
Reto Schmid, Johannes Ryser, Stefan Berner, Martin Glinz, A Survey of Simulation Tools for Requirements Engineering, No. IFI-2011.0006, Version: 1, 2000. (Technical Report)
|
|
Uta Schwertel, Controlling Plural Ambiguities in Attempto Controlled English, In: Proceedings of the 3rd International Workshop on Controlled Language Applications, 2000. (Conference or Workshop Paper)
|
|
Martin Glinz, Problems and Deficiencies of UML as a Requirements Specification Language, In: Proceedings of the Tenth International Workshop on Software Specification and Design, 2000. (Conference or Workshop Paper)
|
|
Gregory S. Crawford, The impact of the 1992 Cable Act on household demand and welfare, RAND Journal of Economics, Vol. 31 (3), 2000. (Journal Article)
I measure the benefit to households of the 1992 Cable Act in light of strategic responses by cable systems to the regulations mandated by the act. A discrete-choice differentiated-product model of household demand for all offered cable television services forms the basis of the analysis. Aggregation over households and service combinations to the level of the data permits estimation on a cross-section of cable markets from before and after the act. The results indicate that while the regulations mandated price reductions of 10–17% for cable services, observed system responses yielded no change in household welfare. Post-act changes in cable prices are responsible for most of the difference. |
|
Johannes Ryser, Martin Glinz, A Scenario-Based Approach to Validating and Testing Software Systems Using Statecharts, In: 12th International Conference on Software and Systems Engineering and their Applications (ICSSEA’99), CNAM, Paris, 1999-12-08. (Conference or Workshop Paper published in Proceedings)
Scenarios (use cases) are used to describe the functionality and behavior of a (software) system from a user-centered perspective. As scenarios form a kind of abstract-level test case for the system under development, the idea of using them to derive test cases for system test is quite intriguing. Yet in practice, scenarios from the analysis phase are seldom used to create concrete system test cases. In this paper we present a procedure to create scenarios in the analysis phase and use those scenarios in system test to systematically determine test cases. This is done by formalizing scenarios into statecharts, annotating the statecharts with information helpful for test case creation/generation, and traversing paths in the statecharts to determine concrete test cases. |
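The final step - path traversal in the statechart to determine test cases - can be sketched as follows. The ATM-style states and events are invented for illustration and are not from the paper:

```python
def derive_test_cases(transitions, start, finals, max_depth=6):
    """Enumerate event sequences (abstract test cases) from the start state
    to a final state, bounded by max_depth to keep loops finite."""
    cases = []

    def walk(state, path):
        if state in finals and path:
            cases.append(path)
        if len(path) >= max_depth:
            return
        for (src, event, dst) in transitions:
            if src == state:
                walk(dst, path + [event])

    walk(start, [])
    return cases

# Statechart as (source state, event, target state) triples.
transitions = [
    ("Idle",       "insertCard", "CardIn"),
    ("CardIn",     "enterPIN",   "Authorised"),
    ("CardIn",     "cancel",     "Idle"),
    ("Authorised", "withdraw",   "Idle"),
]
for case in derive_test_cases(transitions, "Idle", {"Idle"}, max_depth=3):
    print(" -> ".join(case))
```

Each printed event sequence is an abstract test case; turning it into a concrete one would additionally use the annotations (test data, expected results) that the SCENT method attaches to the statechart.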
|
Peter Trommler, The Application Profile Model: A Security Model for Downloaded Executable Content, University of Zurich, Faculty of Economics, Business Administration and Information Technology, 1999. (Dissertation)
With the introduction of Java, interest in transferring executable code over the Internet has grown. First, the downloaded executable content paradigm was deployed for animations and active forms. Nowadays, with the introduction of the Network Computer, there is a tendency to deploy the downloaded executable content paradigm as a distribution mechanism for general application programs in the Internet.
Executing code downloaded from the Internet raises security issues beyond those found in operating systems. Therefore most systems that implement downloaded executable content offer additional security mechanisms to complement the mechanisms of the operating system.
In this thesis a novel security model for downloaded executable content is developed. The application profile model is defined to merely grant the set of access rights needed by the application, the application profile. Without breaching security, the definition can be relaxed, which results in the definition of the weak application profile model. The issue of application profile selection is addressed and an algorithm is presented to select the application profile in the weak application profile model dynamically at runtime of an application. A prototype implementation demonstrates the feasibility of this approach.
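The basic idea of the model - an application is granted merely the rights in its profile, and every other access is denied - can be illustrated with the following sketch; the class and right names are invented, not the thesis's actual formalism:

```python
class ProfileViolation(Exception):
    """Raised when an application requests a right outside its profile."""

class ApplicationProfile:
    def __init__(self, name, granted_rights):
        self.name = name
        self.granted = frozenset(granted_rights)  # exactly the rights the application needs

    def check(self, right):
        """Deny any access right not contained in the profile."""
        if right not in self.granted:
            raise ProfileViolation(f"{self.name}: {right!r} not in profile")
        return True

# A downloaded animation applet may read its own data files and open a
# connection to its home server, but nothing else.
applet = ApplicationProfile("animation", {"read:/applet/data", "connect:home.example"})
print(applet.check("read:/applet/data"))  # True
try:
    applet.check("write:/etc/passwd")
except ProfileViolation as e:
    print("denied:", e)
```

In the weak variant described above, such a profile would not be fixed in advance but selected dynamically at runtime from a set of candidate profiles.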
A method for code analysis to determine the set of access rights required for the execution of an application is developed as an alternative approach. Based on an analysis of the theoretical limitations of code analysis, methods to approximate the set of access rights are discussed. The method of generalized constants is developed and applied to Java. The application profile model and code analysis are compared and combinations of both approaches are discussed.
To define a security policy based on the application profile model a specification language is defined as an extension to the PLAS language, a general purpose policy language. The extension is defined in such a way that it can be integrated with other specification languages.
The new model is studied in the context of a company environment and management strategies for a security policy for downloaded executable content are developed and evaluated. |
|
Eckart Jäger, Exchange rates and Bertrand oligopoly, Journal of Economics, Vol. 70 (3), 1999. (Journal Article)
The impact of exchange-rate changes on industrial prices seems ambiguous. Incomplete and even "perverse" pass-through has been observed: the import prices in the depreciating country decrease while those in the appreciating country increase. To explain these "counterintuitive" price reactions we consider a situation of international Bertrand competition: two firms, based in different countries, are selling in both countries simultaneously. The profit-maximizing duopolists set the prices for their products in each of the two markets which are segmented on the demand side. We then study the qualitative effect of an exogenous exchange-rate change on the Bertrand-Nash equilibrium. Under the strong assumption of linear demand and cost functions we have "normal" exchange-rate pass-through. However, allowing for more general cost structures in this simple static model enables us to show that the import prices in both countries might move in counterintuitive directions. |
|
Johannes Ryser, Martin Glinz, A Practical Approach to Validating and Testing Software Systems Using Scenarios, In: QWE'99: Third International Software Quality Week Europe, November 1999. (Conference or Workshop Paper)
|
|