[1] John Brier, Lucia Rapanotti, and Jon Hall. Capturing change in socio-technical systems with problem frames. Technical Report 2005/01, 2005. [ bib | .pdf ]
Within organisations, business processes are increasingly captured and supported through socio-technical systems, which incorporate both people and technologies. In today's rapidly changing marketplace, such business processes are more and more complex and volatile, and so are the requirements for their supporting systems. In this paper we consider change in organisations, and propose novel techniques for the analysis of changing requirements in socio-technical systems. Our approach is based on Jackson's Problem Frames, a well-known framework for the representation and analysis of software problems. We extend the framework with a notation for the representation and analysis of change for socio-technical problems. We illustrate the approach on a real-world case study.

[2] Francis Chantree, Adam Kilgarriff, Anne De Roeck, and Alistair Willis. Using a distributional thesaurus to resolve coordination ambiguities. Technical Report 2005/02, 2005. [ bib | .pdf ]
We present a novel method for resolving coordination ambiguities. This type of ambiguity is one of the most pervasive and challenging. We test the hypothesis that the most likely reading of a coordination ambiguity can be indicated by the distributional similarity of terms. Our experiments show that words or phrases in a coordination which have distributional similarity also tend to have coordination-first characteristics.

[3] Francis Chantree, Bashar Nuseibeh, Anne De Roeck, and Alistair Willis. Nocuous ambiguities in requirements specifications. Technical Report 2005/03, 2005. [ bib | .pdf ]
In this paper we present a novel approach that automatically alerts authors of requirements specifications to the presence of potentially dangerous ambiguities in their text. We first establish the notion of 'nocuous' ambiguities, i.e. those that are likely to lead to misunderstandings. We focus on coordination ambiguity, which occurs when words such as 'and' and 'or' are used. Our starting point is a dataset of ambiguous phrases from a corpus of requirements specifications, and a collection of associated human judgements about their interpretation. We then use machine learning techniques combined with syntactic, semantic and word distribution heuristics to eliminate instances of text which people interpret easily. We report on a series of experiments and evaluate the performance of our approach against the collection of human judgements. Our machine learning algorithm has an accuracy of 75 percent, compared to a 59.6 percent baseline.

[4] Charles B. Haley, Robin C. Laney, and Bashar Nuseibeh. Validating security requirements using structured Toulmin-style argumentation. Technical Report 2005/04, 2005. [ bib | .pdf ]
This paper proposes using structured informal argumentation to assist with determining whether the security requirements for a system satisfy the security goals, and whether an eventual realized system can satisfy the security requirements. We call these arguments 'satisfaction arguments', and propose a systematic approach for their construction. A satisfaction argument is typically probabilistic and unique to the system in its context. We use the argument form proposed by Toulmin for evidence-based argumentation, consisting of claims, grounds, warrants, and rebuttals. Building on our earlier work on trust assumptions and security requirements, we show how using satisfaction arguments assists both in locating inconsistencies between security requirements and their respective goals, and in exposing tacit or inconsistent assumptions about the properties of domains and their possible effects on the eventual security of a system.

[5] Jon G. Hall and Lucia Rapanotti. A framework for software problem analysis. Technical Report 2005/05, 2005. [ bib | .pdf ]
The paper introduces a software problem calculus based on a view of requirements engineering proposed by Zave and Jackson, which we hope can underpin techniques and processes in requirements engineering and early software design. The approach is centred around the notions of problem and problem transformation. It is propositional in nature in that it allows for heterogeneous problem descriptions and correctness argumentation for the validation and verification of solutions. We present a number of fundamental rules to populate the calculus, including 'divide-and-conquer' problem decomposition, description reification, prior-solution reuse, and the use of architectures for solution structuring. In addition, and central to the foundational nature of the calculus, is the interpretation and subsequent incorporation into our framework of others' work as problem transformations. This is an on-going task: examples presented in this paper include interpretations of simple goal-oriented decompositions and viewpoints. The calculus also provides rich traceability of the requirements of an original problem through transformation to those of its solution. We use the design of a sluice gate as a simple running example throughout the paper.

[6] Adrian J. Hilton and Jon G. Hall. Developing critical systems with PLD components. Technical Report 2005/06, 2005. [ bib | .pdf ]
Understanding the roles that rigour and formality can have in the design of critical systems is essential to anyone wishing to contribute to their development. Whereas these issues are well understood in software development, in the use of hardware - specifically programmable logic devices (PLDs) and the combination of PLDs and software - they are less well known. Indeed, even in industry there are many differences between current and recommended practice, and engineering opinion differs on how to apply existing standards. This has led to gaps in the formal and rigorous treatment of PLDs in critical systems. In this paper we examine the range of and potential for formal specification and analysis techniques that address the requirements of verifiable PLD programs. We identify existing formalisms that may be used, and lay out the areas of contribution that academia can make to allow high-integrity PLD programming to be as feasible as high-integrity software development.

[7] Jon G. Hall and Lucia Rapanotti. Using PADL to specify AFrames. Technical Report 2005/07, 2005. [ bib | .pdf ]
In this short technical note we show how PADL - the process algebraic architectural description language of Bernardo, Ciancarini and Donatiello - can be used to specify AFrames. AFrames exist to structure the machine in a Problem Frames development, and are important in that framework as they allow architectural expertise to be captured and reused therein. Because of the closeness of PADL to other Architectural Description Languages, we assert that this work opens the Problem Frames framework to standard architectural abstractions.

[8] Robin Laney, Michael Jackson, and Bashar Nuseibeh. Composing problems: Deriving specifications from inconsistent requirements. Technical Report 2005/08, 2005. [ bib | .pdf ]
In this paper we demonstrate an approach to system development based on problem decomposition and subsequent (re)composition of sub-problem specifications. We illustrate the work using Problem Frames, an approach to the decomposition of problems that relates requirements, domain properties, and machine specifications. Having decomposed a problem, one approach to solving it is through a process of composing solutions to sub-problems. In this paper, we show that by formalizing system requirements and domain properties using an Event Calculus, we can both systematically derive machine specifications and solve composition problems. We add a prohibit predicate to the Event Calculus, which prohibits an event over a given time period. This allows a sub-solution to be formalized in a way that provides for run-time conflict resolution. We develop our earlier work on Composition Frames, an approach to composing inconsistent requirements, by adding systematic support and factoring out domain-dependent details. Throughout the paper we use a simple case study to illustrate and validate our ideas.

[9] Debra Trusso Haley, Pete Thomas, Anne De Roeck, and Marian Petre. A research taxonomy for latent semantic analysis-based educational applications. Technical Report 2005/09, 2005. [ bib | .pdf ]
The paper presents a taxonomy that summarises and highlights the major research into Latent Semantic Analysis (LSA) based educational applications. The taxonomy identifies five main research themes and emphasises the point that even after more than 15 years of research, much is left to be discovered to bring the LSA theory to maturity. The paper provides a framework for LSA researchers to publish their results in a format that is comprehensive, relatively compact, and useful to other researchers.

[10] Alistair Willis. Can online learning materials improve student access to digital libraries? Technical Report 2005/10, 2005. [ bib | .pdf ]
We present preliminary investigations into how text alignment techniques can be used to align the content of undergraduate textbooks against the journal papers from which they were developed, or which may be recommended as further reading. We propose that such methods could be used to improve student access to academic material, particularly in distance learning environments. We show that our initial techniques determine which passages of textbooks align against appropriate academic documents, and consider what techniques might be needed for the required finer-grained alignment.

This file was generated by bibtex2html 1.98.