School of Information Technology and Electrical Engineering Publications - UQ eSpace
http://espace.library.uq.edu.au/
The University of Queensland

A Fibred Tableau Calculus for Modal Logics of Agents
http://espace.library.uq.edu.au/view/UQ:13298
In previous work we showed how to combine propositional multimodal logics using Gabbay's fibring methodology. In this paper we extend that work by providing a tableau-based proof technique for the combined (fibred) logics. To this end we first compare two types of tableau proof systems, graph and path, with the help of a scenario (the Friend's Puzzle). We then show how to uniformly construct a tableau calculus for the combined logic using Governatori's labelled tableau system KEM. We conclude with a discussion of KEM's features.
2007-04-02T16:05:12Z
Padmanabhan, V.; Governatori, G.
A field study of aging in paper-oil insulation systems
http://espace.library.uq.edu.au/view/UQ:309767
The paper used to insulate the windings of power transformers is mostly made from wood pulp, a cellulosic material. Over decades the paper is slowly attacked by water, oxygen, oil acids, and high temperatures and eventually degrades to the point where it is no longer an effective insulator. The transformer is then likely to fail. Power utilities need to know when a transformer is nearing the end of its useful life in order to plan its replacement. However, a problem with monitoring the condition of the paper within a transformer is that it may be difficult to obtain a sample to test. Furthermore, a particular sample may not accurately reflect the overall paper condition. A power transformer operating in Australia failed in 2010. Thus we had the opportunity to study the paper condition at various points within the transformer and evaluate the validity of the current understanding of paper aging. In this article we discuss the mechanisms of cellulose degradation, and the associated equations, and apply them to the paper insulation in the failed transformer.
2013-09-20T09:00:43Z
Lelekakis, Nick; Guo, Wenyu; Martin, Daniel; Wijaya, Jaury; Susa, Dejan
A field study of two online dry-out methods for power transformers
http://espace.library.uq.edu.au/view/UQ:309772
Dry-out methods for transformers, using cellulose cartridges and molecular sieves, were compared. The latter are preferable for keeping a dry transformer dry, whereas the former are more efficient in drying a wet transformer. In this article, we compare the effectiveness of the cellulose cartridge filter and molecular sieve methods. Two identical transformers with similar insulation wetness were used. We estimated the water content of the paper (WCP) insulation from (1) the water activity of the oil, (2) the dielectric response of the transformer, and (3) the water content of individual paper samples determined by Karl Fischer titration. We also measured the acid number of the oil, its interfacial tension, dielectric strength, furan content, dissolved gas content, and particle count.
2013-09-20T09:10:58Z
Lelekakis, Nick; Martin, Daniel; Guo, Wenyu; Wijaya, Jaury; Lee, Meng
A finite-difference method for the design of biplanar transverse gradient coil in MRI
http://espace.library.uq.edu.au/view/UQ:240832
This paper presents a finite-difference method for the design of gradient coils in MRI. In this method, a linear matrix equation is formulated using a finite-difference approximation of the current density in the source domain; an optimization procedure is then carried out to solve the resulting inverse problem, and the coil winding pattern is found. The developed algorithm is tested with a transverse biplanar gradient coil design example. Compared with conventional design methods such as target-field, standard stream function or boundary element schemes, the new design approach is relatively easy to implement and offers flexibility in managing the local winding pattern for 2D or 3D geometries.
2011-05-13T18:24:57Z
Zhu, Minhua; Shou, Guofa; Xia, Ling; Li, Xia; Liu, Feng; Crozier, Stuart
A finite difference method for the design of gradient coils in MRI - an initial framework
http://espace.library.uq.edu.au/view/UQ:287792
2012-12-23T00:49:20Z
Zhu, Minhua; Xia, Ling; Liu, Feng; Zhu, Jianfeng; Kang, Liyi; Crozier, Stuart
A finite element model for the analysis of steady state heat transfer in disc coil transformer winding
http://espace.library.uq.edu.au/view/UQ:309782
A general formulation for the analysis of steady state heat transfer was developed and implemented in MATLAB, to become part of a coupled thermal-hydraulic model for solving the temperature and fluid flow distributions in a disc coil transformer winding. It uses the Finite Element Method and computes temperature distributions in the solid parts of the disc coil, i.e. in the conductors and insulating paper. The model takes a non-uniform heat generation rate and convective boundary conditions as inputs for solving the heat conduction problem. Axisymmetry of the coil geometry is used to reduce the problem to two dimensions. The FEM code developed was validated by comparing computation results with those obtained from the COMSOL Multiphysics software. As demonstrated, the results were in very good agreement with one another.
2013-09-20T09:40:28Z
Wijaya, Jaury; Czaszejko, Tadeusz; Lelekakis, Nick; Martin, Daniel; Susa, Dejan
A First Order Predicate Logic Formulation of the 3D Reconstruction Problem and its Solution Space
http://espace.library.uq.edu.au/view/UQ:77939
This paper defines the 3D reconstruction problem as the process of reconstructing a 3D scene from numerous 2D visual images of that scene. It is well known that this problem is ill-posed, and numerous constraints and assumptions are used in 3D reconstruction algorithms in order to reduce the solution space. Unfortunately, most constraints only work in a certain range of situations, and often constraints are built into the most fundamental methods (e.g. Area Based Matching assumes that all the pixels in the window belong to the same object). This paper presents a novel formulation of the 3D reconstruction problem, using a voxel framework and first order logic equations, which does not contain any additional constraints or assumptions. Solving this formulation for a set of input images gives all the possible solutions for that set, rather than picking a solution that is deemed most likely. Using this formulation, this paper studies the problem of uniqueness in 3D reconstruction and how the solution space changes for different configurations of input images. It is found that it is not possible to guarantee a unique solution, no matter how many images are taken of the scene, their orientation or even how much color variation is in the scene itself. Results of using the formulation to reconstruct a few small voxel spaces are also presented. They show that the number of solutions is extremely large for even very small voxel spaces (a 5 x 5 voxel space gives 10 to 10^7 solutions). This shows the need for constraints to reduce the solution space to a reasonable size. Finally, it is noted that because of the discrete nature of the formulation, the solution space size can be easily calculated, making the formulation a useful tool to numerically evaluate the usefulness of any constraints that are added.
2007-08-15T07:24:25Z
Robinson, M.; Kubik, K.; Lovell, B.
A flexible method for rapid force computation in elliptical magnets
http://espace.library.uq.edu.au/view/UQ:60950
The design of open-access elliptical cross-section magnet systems has recently come under consideration. Obtaining values for the forces generated within these unusual magnets is important to progress the designs towards feasible instruments. This paper presents a novel and flexible method for the rapid computation of forces within elliptical magnets. The method is demonstrated by the analysis of a clinical magnetic resonance imaging magnet of elliptical cross-section and open design. The analysis reveals the non-symmetric nature of the generated Maxwell forces, which are an important consideration, particularly in the design of superconducting systems.
2007-08-14T16:53:06Z
Snape-Jenkinson, C. J.; Crozier, S.; Forbes, L. K.
A flexible multiplication unit for an FPGA logic block
http://espace.library.uq.edu.au/view/UQ:96162
2007-08-24T00:17:39Z
Rajagopalan, K.; Sutton, P. R.
A folded-monopole model for electrically small NRI-TL metamaterial antennas
http://espace.library.uq.edu.au/view/UQ:262770
2011-12-02T16:21:34Z
Antoniades, M. A.; Eleftheriades, G. V.
A folded quarter-elliptical wideband antenna for portable devices
http://espace.library.uq.edu.au/view/UQ:231166
In the last decade, compact low-profile antennas featuring multiband operation have drawn considerable interest from designers of portable wireless devices. The demand for these antennas stems from the fact that modern portable devices have to access an increased number of wireless services across the frequency spectrum from approximately 880 MHz to 5850 MHz.
2011-03-04T13:43:27Z
Bialkowski, Marek E.; Razali, Ahmad Rashidy; Boldaji, Ashkan
A Formal Analysis of a Business Contract Language
http://espace.library.uq.edu.au/view/UQ:8412
This paper presents a formal system for reasoning about violations of obligations in contracts. The system is based on a formalism for the representation of contrary-to-duty obligations: obligations that come into force when other obligations are violated, as is typical of penalty clauses in contracts. The paper shows how this formalism can be mapped onto the key policy concepts of a contract specification language, called Business Contract Language (BCL), previously developed to express contract conditions for run-time contract monitoring. The aim of this mapping is to establish a formal underpinning for this key subset of BCL.
2006-05-16T00:00:00Z
Governatori, Guido; Milosevic, Zoran
A Formal Approach to Negotiating Agents Development
http://espace.library.uq.edu.au/view/UQ:9607
This paper presents a formal and executable approach to capture the behaviour of parties involved in a negotiation. A party is modeled as a negotiating agent composed of a communication module, a control module, a reasoning module, and a knowledge base. The control module is expressed as a statechart, and the reasoning module as a defeasible logic program. A strategy specification therefore consists of a statechart, a set of defeasible rules, and a set of initial facts. Such a specification can be dynamically plugged into an agent shell incorporating a statechart interpreter and a defeasible logic inference engine, in order to yield an agent capable of participating in a given type of negotiation. The choice of statecharts and defeasible logic over other formalisms is justified against a set of desirable criteria, and their suitability is illustrated through concrete examples of bidding and multi-lateral bargaining scenarios.
2005-04-11T00:00:00Z
Dumas, Marlon; Governatori, Guido; ter Hofstede, Arthur H. M.; Oaks, Philippa
A Formal Approach To Protocols And Strategies For (Legal) Negotiation
http://espace.library.uq.edu.au/view/UQ:9619
We propose a formal and executable framework for expressing protocols and strategies for automated (legal) negotiation. In this framework a party involved in a negotiation is represented through a software agent composed of four modules: (i) a communication module which manages the interaction with the other agents; (ii) a control module; (iii) a reasoning module specified as a defeasible theory; and (iv) a knowledge base which bridges the control and the reasoning modules, while keeping track of past decisions and interactions. The choice of defeasible logic is justified against a set of desirable criteria for negotiation automation languages. Moreover, the suitability of the framework is illustrated through two case studies.
2005-04-11T00:00:00Z
Governatori, Guido; Dumas, Marlon; ter Hofstede, Arthur H. M.; Oaks, Philippa
A formal basis for a program compilation proof tool
http://espace.library.uq.edu.au/view/UQ:83960
2007-08-14T13:23:59Z
Wildman, L. P.
A Formal Basis for a Program Compilation Proof Tool
http://espace.library.uq.edu.au/view/UQ:2473
This paper presents a case study in verified program compilation from high-level language programs to assembler code using the Cogito formal development system. A form of window-inference based on the Z schema is used to perform the compilation. Data-refinement is used to change the representation of integer variables to assembler word locations.
2006-04-11T17:21:35Z
Wildman, Luke
A formal denotational semantics of UML in Object-Z
http://espace.library.uq.edu.au/view/UQ:59394
2007-08-14T15:46:03Z
Kim, S.; Carrington, D. A.
A formal development approach for self-organising systems
http://espace.library.uq.edu.au/view/UQ:352734
2015-03-02T09:43:13Z
Li, Q.; Smith, G.
A formal framework for modelling and analysing mobile systems
http://espace.library.uq.edu.au/view/UQ:100426
This paper presents a formal framework for modelling and analysing mobile systems. The framework comprises a collection of models of the dominant design paradigms which are readily extended to incorporate details of particular technologies, i.e., programming languages and their run-time support, and applications. The modelling language is Object-Z, an extension of the well-known Z specification language with explicit support for object-oriented concepts, which makes it particularly suited to our task: the system structuring techniques offered by object orientation are well suited to modelling mobile systems, and inheritance and polymorphism allow us to exploit commonalities in mobile systems by defining more complex models in terms of simpler ones.
2007-08-23T19:28:48Z
Smith, G. P.
A formalism to describe design patterns based on role concepts
http://espace.library.uq.edu.au/view/UQ:203993
Design patterns are typically defined imprecisely using natural language descriptions with graphical annotations. It is also common to describe patterns using a concrete design example with implementation details. Several approaches have been proposed to describe design patterns abstractly based on role concepts. However, the notion of role differs in each approach, and the behavioral aspects of patterns are not addressed in the role-based approaches. This paper presents a rigorous approach to describing design patterns based on role concepts. Adopting metamodeling and formalism, our work defines an innovative framework where generic pattern concepts based on roles are precisely defined as a formal role metamodel using Object-Z. Individual patterns are specified using these generic role concepts in terms of pattern role models. Temporal behaviors of patterns are also specified using Object-Z and integrated in the pattern role models. Patterns described this way are abstract, separating pattern realization information from the pattern description. They are also precise, providing a rigorous foundation for reasoning about pattern properties. This paper also formalizes the properties that must be captured in a class model when a design pattern is deployed. These properties are defined generically in terms of role bindings from a pattern role model to a class model, and provide a precise basis for checking the validity of pattern utilisation in designs. Our work is supported by tools. We have developed an initial role metamodel using an existing modeling framework, the Eclipse Modeling Framework (EMF), and have transformed the metamodel to Object-Z using model transformation techniques. Complex constraints are added to the transformed Object-Z model. More importantly, we implement the role metamodel. Using this implementation, pattern authors can develop an initial pattern role model in the same modeling framework and convert the initial model to Object-Z using our transformation rules. The transformed Object-Z model is then enhanced with behavioral features of the pattern. This tool support significantly improves the practicability of applying formalism to design patterns.
2010-04-22T11:31:23Z
Kim, Soon Kyeong; Carrington, David
A formal mapping between UML models and Object-Z specifications
http://espace.library.uq.edu.au/view/UQ:147470
2008-06-06T13:32:14Z
Kim, S.; Carrington, D. A.
A formal metamodeling approach to a transformation between the UML state machine and Object-Z
http://espace.library.uq.edu.au/view/UQ:97290
A significant problem with currently suggested approaches for transforming between models in different languages is that the transformation is often described imprecisely, with the result that the overall transformation task may be imprecise, incomplete and inconsistent. This paper presents a formal metamodeling approach for transforming between UML and Object-Z. In the paper, the two languages are defined in terms of their formal metamodels, and a systematic transformation between the models is provided at the meta-level in terms of formal mapping functions. As a consequence, we can provide a precise, consistent and complete transformation between them.
2007-08-24T01:09:46Z
Kim, Soon-Kyeong; Carrington, David
A Formal Metamodeling Approach to a Transformation between Visual and Formal Modeling Techniques
http://espace.library.uq.edu.au/view/UQ:10542
Formal modeling notations and visual modeling notations can complement each other when developing software models. The most frequently adopted approach is to define transformations between the visual and formal models. However, a significant problem with the currently suggested approaches is that the transformation itself is often described imprecisely, with the result that the overall transformation task may be imprecise, incomplete and inconsistent. This paper presents a formal metamodeling approach to transform between UML and Object-Z. In the paper, the two languages are defined in terms of their formal metamodels, and a systematic transformation between the models is provided at the meta-level in terms of formal mapping functions. As a consequence, we can provide a precise, consistent and complete transformation between a visual model in UML and a formal model in Object-Z.
2004-04-08T00:00:00Z
Kim, Soon-Kyeong; Carrington, David
A Formal Model Of Cognitive Processes For An Air Traffic Control Task
http://espace.library.uq.edu.au/view/UQ:10573
This document describes a formal model of the cognitive processes involved in a simplified Air Traffic Control task. The model has been developed as part of the SafeHCI project, which is investigating detection and prevention of human error in safety-critical systems. It will serve as the basis for development of new techniques for prediction of error sources and classification of error types. The cognitive model is described in detail.
2004-05-19T00:00:00Z
Connelly, Simon; Lindsay, Peter; Neal, Andrew; Humphreys, Mike
A formal model of real-time program compilation
http://espace.library.uq.edu.au/view/UQ:62286
Program compilation can be formally defined as a sequence of equivalence-preserving transformations, or refinements, from high-level language programs to assembler code. Recent models also incorporate timing properties, but the resulting formalisms are intimidatingly complex. Here we take advantage of a new, simple model of real-time refinement, based on predicate transformer semantics, to present a straightforward compilation formalism that incorporates real-time constraints.
2007-08-14T17:46:28Z
Lermer, K.; Fidge, C.
A formal model of UML metamodel: The UML state machine and its integrity constraints
http://espace.library.uq.edu.au/view/UQ:97289
2007-08-24T01:09:43Z
Kim, S.; Carrington, D. A.
A formal object-oriented approach to defining consistency constraints for UML models
http://espace.library.uq.edu.au/view/UQ:100518
We discuss how integrity consistency constraints between different UML models can be precisely defined at the language level. In doing so, we introduce a formal object-oriented metamodeling approach in which integrity consistency constraints between UML models are defined in terms of invariants of the UML model elements used to define the models at the language level. Adopting a formal approach, the constraints are formally defined using Object-Z. We demonstrate how integrity consistency constraints for UML models can be precisely defined at the language level; once completed, the formal description of the consistency constraints will serve as a precise reference for checking the consistency of UML models as well as for tool development.
2007-08-23T19:32:38Z
Kim, S.; Carrington, D. A.
A Formal Ontology Reasoning with Individual Optimization: A Realization of the Semantic Web
http://espace.library.uq.edu.au/view/UQ:9048
Answering a query over a group of RDF data pages is a trivial process. However, the Semantic Web requires ontology technology. Consequently, OWL, a family of web ontology languages based on description logic, has been proposed for the Semantic Web. Answering a query over the Semantic Web is thus not trivial, but a deductive process. However, reasoning on OWL with data has an efficiency problem. We therefore introduce optimization techniques for the inference algorithm. This work demonstrates the techniques for instance checking and instance retrieval problems with respect to the ALC description logic, which covers certain parts of OWL.
2005-10-25T00:00:00Z
Pothipruk, Pakornpong; Governatori, Guido
A formal V&V framework for UML models based on model transformation techniques
http://espace.library.uq.edu.au/view/UQ:103130
2007-08-23T21:22:53Z
Kim, S. K.; Carrington, D. A.
A framework and tool support for the systematic testing of model-based specifications
http://espace.library.uq.edu.au/view/UQ:67052
Formal specifications can precisely and unambiguously define the required behavior of a software system or component. However, formal specifications are complex artifacts that need to be verified to ensure that they are consistent, complete, and validated against the requirements. Specification testing or animation tools exist to assist with this by allowing the specifier to interpret or execute the specification. However, currently little is known about how to do this effectively. This article presents a framework and tool support for the systematic testing of formal, model-based specifications. Several important generic properties that should be satisfied by model-based specifications are first identified. Following the idea of mutation analysis, we then use variants or mutants of the specification to check that these properties are satisfied. The framework also allows the specifier to test application-specific properties. All properties are tested for a range of states that are defined by the tester in the form of a testgraph, which is a directed graph that partially models the states and transitions of the specification being tested. Tool support is provided for the generation of the mutants, for automatically traversing the testgraph and executing the test cases, and for reporting any errors. The framework is demonstrated on a small specification and its application to three larger specifications is discussed. Experience indicates that the framework can be used effectively to test small to medium-sized specifications and that it can reveal a significant number of problems in these specifications.
2007-08-15T02:28:02Z
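The testgraph idea described in this abstract can be sketched in a few lines. The toy "specification" (a bounded counter), the node names, and the invariant below are all invented for illustration; they are not taken from the paper, which works with full model-based specifications and tool support.

```python
# Illustrative sketch of a testgraph: a directed graph that partially
# models the states and transitions of a specification under test.
# We traverse it, applying each operation and checking a generic
# property (here, an invariant) in every state reached.

MAX = 3
def inc(s): return min(s + 1, MAX)   # operation of the toy spec
def reset(s): return 0               # operation of the toy spec

def invariant(s):
    return 0 <= s <= MAX

# Testgraph: node -> list of (operation, successor node).
testgraph = {
    "empty": [(inc, "mid")],
    "mid":   [(inc, "full"), (reset, "empty")],
    "full":  [(inc, "full"), (reset, "empty")],
}

def traverse(node="empty", state=0, depth=4):
    """Depth-bounded walk of the testgraph, asserting the invariant."""
    assert invariant(state), f"invariant broken at {node}: {state}"
    if depth == 0:
        return
    for op, nxt in testgraph[node]:
        traverse(nxt, op(state), depth - 1)

traverse()
print("all reached states satisfy the invariant")
```

In the framework itself the traversal is driven by a tool and the checked properties include mutants of the specification; this sketch only shows the traversal-and-check skeleton.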
Miller, T.; Strooper, P.
Μίνθα: a framework for auto-programming and testing of railway controllers for varying clients
http://espace.library.uq.edu.au/view/UQ:274105
2012-05-15T14:41:13Z
Süß, Jörn Guy; Carrington, David; Robinson, Neil; Strooper, Paul
A framework for correctness criteria on weak memory models
http://espace.library.uq.edu.au/view/UQ:367091
The implementation of weak (or relaxed) memory models is standard practice in modern multiprocessor hardware. For efficiency, these memory models allow operations to take effect in shared memory in a different order from that in which they occur in a program. A number of correctness criteria have been proposed for concurrent objects operating on such memory models, each reflecting different constraints on the objects which can be proved correct. In this paper, we provide a framework in which correctness criteria are defined in terms of two components: the first defining the particular criterion (as it would be defined in the absence of a weak memory model), and the second defining the particular weak memory model. The framework facilitates the definition and comparison of correctness criteria, and encourages reuse of existing definitions. The latter enables properties of the criteria to be proved using existing proofs. We illustrate the framework via the definition of correctness criteria on the TSO (Total Store Order) weak memory model.
2015-08-11T02:53:30Z
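The TSO reordering the abstract mentions can be illustrated with a toy store-buffer simulation (an invented example, not code or notation from the paper): each thread's write sits in a private FIFO buffer before being flushed to shared memory, so both threads can read the other's variable as 0, an outcome impossible under sequential consistency.

```python
# Toy TSO illustration: per-thread store buffers delay writes to
# shared memory. With buffered (unflushed) stores, both threads read
# the initial value 0 for the other thread's variable.

from collections import deque

def run(flush_before_read):
    memory = {"x": 0, "y": 0}
    buffers = {1: deque(), 2: deque()}      # per-thread FIFO store buffers

    def write(tid, var, val):
        buffers[tid].append((var, val))     # buffered, not yet globally visible

    def read(tid, var):
        # a thread sees its own buffered stores first (store forwarding)
        for v, val in reversed(buffers[tid]):
            if v == var:
                return val
        return memory[var]

    def flush(tid):
        while buffers[tid]:
            var, val = buffers[tid].popleft()
            memory[var] = val

    write(1, "x", 1)
    write(2, "y", 1)
    if flush_before_read:
        flush(1); flush(2)
    r1 = read(1, "y")                       # thread 1 reads y
    r2 = read(2, "x")                       # thread 2 reads x
    return r1, r2

print(run(flush_before_read=True))   # -> (1, 1): sequentially consistent run
print(run(flush_before_read=False))  # -> (0, 0): TSO-only outcome
```

The paper's framework reasons about such behaviours abstractly via correctness criteria; this sketch only demonstrates why TSO executions differ from program order.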
Derrick, John; Smith, Graeme
A framework for data quality aware query systems
http://espace.library.uq.edu.au/view/UQ:259058
Data Quality (DQ) is becoming increasingly important as organizations and individuals rely on data available from various data sources. User satisfaction with query results is directly related to the quality of the data returned to the user. In this paper we present a framework for DQ-aware query systems focused on three key requirements: profiling DQ, capturing user preferences on DQ, and processing data quality aware queries.
2011-10-23T02:08:33Z
Yeganeh, Naiem K.; Sharaf, Mohamed A.
A framework for data quality aware query systems
http://espace.library.uq.edu.au/view/UQ:339487
2014-09-14T00:16:55Z
Yeganeh, Naiem K.; Sadiq, Shazia; Sharaf, Mohamed A.
A framework for electricity price spike analysis with advanced data mining methods
http://espace.library.uq.edu.au/view/UQ:23863
There are many techniques for electricity market price forecasting. However, most of them are designed for expected price analysis rather than price spike forecasting. An effective method of predicting the occurrence of spikes has not yet been reported in the literature. In this paper, a data mining based approach is presented to give a reliable forecast of the occurrence of price spikes. Combined with the spike value prediction techniques developed by the same authors, the proposed approach aims at providing a comprehensive tool for price spike forecasting. Feature selection techniques are first described to identify the attributes relevant to the occurrence of spikes. A brief introduction to the classification techniques is given for completeness. Two algorithms, a support vector machine and a probability classifier, are chosen as the spike occurrence predictors and are discussed in detail. Realistic market data are used to test the proposed model, with promising results.
2007-07-05T13:55:53Z
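The "probability classifier" idea for spike occurrence prediction can be sketched as follows. The feature names (`demand`, `reserve`), the bucketing, and the toy data are all invented here, not taken from the paper: estimate P(spike | feature bucket) from historical records and flag a spike when that probability crosses a threshold.

```python
# Hypothetical sketch of a probability classifier for spike occurrence:
# count historical spike frequency per feature bucket, with Laplace
# smoothing so unseen buckets get a non-degenerate probability.

from collections import defaultdict

def train(history):
    """history: list of (demand_level, reserve_level, spike) tuples."""
    counts = defaultdict(lambda: [0, 0])      # bucket -> [spike count, total]
    for demand, reserve, spike in history:
        c = counts[(demand, reserve)]
        c[0] += int(spike)
        c[1] += 1
    return counts

def spike_probability(counts, demand, reserve):
    spikes, total = counts[(demand, reserve)]
    return (spikes + 1) / (total + 2)         # Laplace smoothing

history = [
    ("high", "low", True), ("high", "low", True), ("high", "low", False),
    ("low", "high", False), ("low", "high", False), ("low", "high", False),
]
model = train(history)
print(spike_probability(model, "high", "low"))  # -> 0.6
print(spike_probability(model, "low", "high"))  # -> 0.2
```

The paper pairs such a classifier with feature selection and a support vector machine over real market data; this sketch only shows the counting-based probability estimate at its core.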
Zhao, J. H.; Dong, Z. Y.; Li, Xue; Wong, K. P.
A framework for industry-relevant ontology development
http://espace.library.uq.edu.au/view/UQ:261079
Ontology has been widely used to represent many real world aspects and is prominently used as a tool to facilitate shared understanding (and knowledge sharing) in a particular domain. Ensuring that such an ontology is relevant to a particular domain, however, remains a challenging task for the ontology developer. This paper introduces a framework that guides industry-relevant ontology development. The framework follows a typical ontology development cycle and details the incremental steps that need to be taken to assure industry-relevance of the ontology. To provide a thorough discussion of the framework, we utilise a previously completed ontology development project that followed the developed framework. The project was specifically aimed at developing an industry-relevant ontology for the compliance management domain and was based on three main inputs, namely, scholarly articles, industry expert/practitioner input, and industry reports. Our experience indicates that the application of the developed framework promotes ontology development that utilises industry and academic inputs to assure the developed ontology is relevant to its domain.
2011-11-10T18:39:33Z
Abdullah, Norris Syed; Sadiq, Shazia; Indulska, Marta
A Framework for Information Processing in the Diagnosis of Sleep Apnea
http://espace.library.uq.edu.au/view/UQ:175691
2009-04-14T18:28:06Z
Abeyratne, U. A framework for lab-based real-time video analysis on distributed camera networks
http://espace.library.uq.edu.au/view/UQ:316440
2013-11-27T15:44:21Z
Dadgostar, Farhad; Bigdeli, Abbas; Mau, Sandra; Smith, Terence; Lovell, Brian A framework for ranking and KNN queries in a probabilistic skyline model
http://espace.library.uq.edu.au/view/UQ:362579
Skyline computation has gained a lot of attention in recent years. According to the definition of skyline, objects that belong to the skyline cannot be ranked among themselves because they are incomparable. This constraint limits the application of skyline. Fortunately, due to the recently proposed probabilistic skyline model, skyline objects which contain multiple elements can now be compared with each other. Different from the traditional skyline model, where each object either is or is not a skyline object, in the probabilistic skyline model each object is assigned a skyline probability to denote its likelihood of being a skyline object. Under this model, two simple but important questions naturally arise: (1) Given an object, which objects are its K nearest neighbors based on their skyline probabilities? (2) Given an object, what is the ranking of the objects whose skyline probabilities are greater than that of the given object? To the best of our knowledge, no existing work can effectively answer these two questions. Yet, answering them is not trivial. For a medium-size dataset (e.g. 10,000 objects), it may take more than an hour to compute the skyline probabilities of all objects. In this paper, we propose a novel framework to answer the above two questions on the fly efficiently. Our proposed work is based on a bounding-pruning-refining strategy. We first compute the skyline probabilities of the target object and all its elements. For the rest of the objects, instead of computing their accurate skyline probabilities, we compute upper and lower bounds on their skyline probabilities using the elements of the target object. Based on these bounds, objects which cannot be in the result are pruned. For those objects for which it is still unknown whether they are in the result, we refine their bounds. The refinement strategy is based on space partitioning. Specifically, we first partition the whole dataspace into several subspaces based on the distribution of elements in the target object; as we iteratively refine the bounds, we apply the partitioning strategy within each subspace. To implement this framework, a novel tree, called the Space Partition Tree (SPTree), is proposed to index the objects and their elements. We evaluate our proposed work using three synthetic datasets and one real-life dataset, and report all our findings in this paper.
2015-06-09T01:46:37Z
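The skyline probability that this abstract builds on can be computed naively as follows (the data points below are invented for illustration; the paper's contribution is precisely avoiding this exact computation by bounding and pruning): an uncertain object is a set of element points, each equally likely, and its skyline probability is the average chance that an element is not dominated by any element of any other object.

```python
# Minimal sketch of the probabilistic skyline model (smaller is better
# in every dimension). This is the expensive exact computation that the
# paper's bounding-pruning-refining framework is designed to avoid.

def dominates(p, q):
    """p dominates q when p <= q in every dimension and < in at least one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline_probability(obj, others):
    """Average, over obj's equally likely elements, of the probability
    that no element of any other object dominates that element."""
    total = 0.0
    for u in obj:
        prob = 1.0
        for other in others:
            dominated = sum(dominates(v, u) for v in other)
            prob *= 1 - dominated / len(other)   # chance `other` does not dominate u
        total += prob
    return total / len(obj)

A = [(1, 1), (4, 4)]   # one element dominates everything, one is dominated
B = [(2, 2), (3, 3)]
print(skyline_probability(A, [B]))  # -> 0.5
print(skyline_probability(B, [A]))  # -> 0.5
```

This exact computation is quadratic in the total number of elements, which is why the paper replaces it with upper/lower bounds for all non-target objects.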
Li, Jianguo; Fung, Gabriel Pui Cheong; Zhou, Wei; Huang, Weiping
A framework for reliability assessment of software components
http://espace.library.uq.edu.au/view/UQ:9887
This paper proposes a conceptual framework for the reliability assessment of software components that incorporates test case execution and output evaluation. Determining an operational profile and test output evaluation are two difficult and important problems that must be addressed in such a framework. Determining an operational profile is difficult, because it requires anticipating the future use of the component. An expected result is needed for each test case to evaluate the test result, and a test oracle is used to generate these expected results. The framework combines statistical testing and test oracles implemented as self-checking versions of the implementations. The framework is illustrated using two examples that were chosen to identify the issues that must be addressed to provide tool support for the framework.
2005-02-18T00:00:00Z
Shukla, Rakesh; Strooper, Paul; Carrington, David A framework for specification-based testing
http://espace.library.uq.edu.au/view/UQ:57378
Test templates and a test template framework are introduced as useful concepts in specification-based testing. The framework can be defined using any model-based specification notation and used to derive tests from model-based specifications; in this paper, it is demonstrated using the Z notation. The framework formally defines test data sets and their relation to the operations in a specification and to other test data sets, providing structure to the testing process. Flexibility is preserved, so that many testing strategies can be used. Important application areas of the framework are discussed, including refinement of test data, regression testing, and test oracles.2007-08-13T16:36:12Z
Stocks, P; Carrington, D A framework for statistical testing of software components
http://espace.library.uq.edu.au/view/UQ:130717
Statistical testing involves the testing of software by selecting test cases from a probability distribution that is intended to represent the software's operational usage. In this paper, we describe and evaluate a framework for statistical testing of software components that incorporates test case execution and output evaluation. An operational profile and a test oracle are essential for the statistical testing of software components because they are used for test case generation and output evaluation respectively. An operational profile is a set of input events and their associated probabilities of occurrence expected in actual operation. A test oracle is a mechanism that is used to check the results of test cases. We present four types of operational profiles and three types of test oracles, and empirically evaluate them using the framework by applying them to two software components. The results show that while simple operational profiles may be effective for some components, more sophisticated profiles are needed for others. For the components that we tested, the fault-detecting effectiveness of the test oracles was similar.2008-02-18T15:45:08Z
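The statistical-testing loop described above, drawing test cases from an operational profile and judging the outputs with an oracle, can be sketched as follows. The counter component, its seeded fault, and the profile probabilities are hypothetical examples, not the components or profiles studied in the paper.

```python
import random

def statistical_test(component, oracle, profile, n_cases, seed=0):
    # Draw operations from the operational profile (name -> probability),
    # run each against both the component under test and the oracle (a
    # trusted self-checking reference), and record any mismatch.
    rng = random.Random(seed)
    ops, weights = zip(*sorted(profile.items()))
    failures = []
    for i in range(n_cases):
        op = rng.choices(ops, weights=weights)[0]
        arg = rng.randint(0, 9)
        got, expected = component(op, arg), oracle(op, arg)
        if got != expected:
            failures.append((i, op, arg, got, expected))
    return failures

def make_counter(buggy=False):
    # Toy component: a running counter; the buggy variant adds one extra
    # on every 'add' (a hypothetical seeded fault).
    state = {"v": 0}
    def step(op, arg):
        if op == "add":
            state["v"] += arg + (1 if buggy else 0)
        elif op == "reset":
            state["v"] = 0
        return state["v"]
    return step
```

A correct component produces no failures against the oracle, while the seeded fault is reported as soon as a profile-drawn case exercises it.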
Shukla, Rakesh; Strooper, Paul; Carrington, David A framework for subsystem-based configuration management
http://espace.library.uq.edu.au/view/UQ:83872
2007-08-14T13:20:48Z
Lindsay, P. A.; Macdonald, A. J.; Staples, M.; Strooper, P. A. A framework for symbolic LTL model checking
http://espace.library.uq.edu.au/view/UQ:318656
2013-12-03T09:28:21Z
Kromodimoeljo, Sentot A Framework for Systematic Specification Animation
http://espace.library.uq.edu.au/view/UQ:10538
Specification animation allows users to pose questions about specifications that can be answered quickly and automatically. This paper presents a framework for systematically animating specifications. Several generic properties are identified to check on specifications. A method is presented that uses variants of the specification to check these properties using an animation tool, and also uses testgraphs (directed graphs that partially model the specification being animated) to check the properties for a large number of interesting states. Tool support for all of the above is also discussed. The framework is demonstrated on a small specification and its application to two larger specifications is discussed. Experience with the framework indicates that it can be used to effectively animate small to medium-sized specifications and that it can reveal a significant number of problems in these specifications.2004-04-08T00:00:00Z
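The testgraph idea, a directed graph that partially models the specification and drives the animation to many interesting states, can be sketched as follows. The counter states, operations, and properties are invented examples, not the specifications or the tool discussed in the paper.

```python
def animate(testgraph, start, init, step, check):
    # Depth-first walk of a testgraph (label -> [(operation, next_label)]),
    # computing the animated state along each edge and checking a property
    # at every label reached. Returns the labels where the check failed.
    failures, seen = [], set()
    def visit(label, state):
        if label in seen:
            return
        seen.add(label)
        if not check(state):
            failures.append(label)
        for op, nxt in testgraph.get(label, []):
            visit(nxt, step(state, op))
    visit(start, init)
    return failures
```

For a toy counter specification with labels "zero", "one", and "two", an invariant like non-negativity holds everywhere, while a deliberately wrong property is reported at the state that violates it.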
Miller, Tim; Strooper, Paul A framework for table driven testing of Java classes
http://espace.library.uq.edu.au/view/UQ:62654
With the advent of object-oriented languages and the portability of Java, the development and use of class libraries has become widespread. Effective class reuse depends on class reliability, which in turn depends on thorough testing. This paper describes a class testing approach based on modeling each test case with a tuple and then generating large numbers of tuples to thoroughly cover an input space with many interesting combinations of values. The testing approach is supported by the Roast framework for the testing of Java classes. Roast provides automated tuple generation based on boundary values, unit operations that support driver standardization, and test case templates used for code generation. Roast produces thorough, compact test drivers with low development and maintenance cost. The framework and tool support are illustrated on a number of non-trivial classes, including a graphical user interface policy manager. Quantitative results are presented to substantiate the practicality and effectiveness of the approach. Copyright (C) 2002 John Wiley & Sons, Ltd.2007-08-14T18:01:04Z
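The core idea of modeling each test case as a tuple of boundary values can be sketched in a few lines. Roast itself is a Java framework; this Python sketch illustrates only the tuple-generation concept and is not Roast's actual API.

```python
import itertools

def boundary_tuples(*param_boundaries):
    # Cross-product of per-parameter boundary values: each resulting tuple
    # models one test case for an n-parameter operation, covering every
    # combination of interesting values.
    return list(itertools.product(*param_boundaries))

# Hypothetical boundaries for a two-parameter operation, e.g. size and index
sizes = [0, 1, 100]
indices = [-1, 0, 99]
cases = boundary_tuples(sizes, indices)   # 9 tuples, one per test case
```

Three boundary values per parameter already yield nine combinations; the combinatorial growth is exactly why tool support for generation and driver standardization matters.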
Daley, N.; Hoffman, D.; Strooper, P. A Framework for Transmission Planning in a Competitive Electricity Market
http://espace.library.uq.edu.au/view/UQ:8227
In this paper, a framework for optimal transmission system expansion planning in a competitive electricity market environment is proposed. Open access transmission has created a deregulated power market and brought new challenges to system planning. The goal of transmission planning is to determine an optimal planning strategy for the transmission company. From the planner's view, planning is the process of balancing multiple conflicting objectives under many constraints. The primary objective of transmission planning is to ensure reliable supply to the demand as economically as possible. The new approach in this paper is formulated to minimize the Expected Energy Not Supplied (EENS) and the investment cost, and to maximize the benefit-cost ratio, subject to power flow and security constraints. CRUSE, a computer program for reliability evaluation of bulk power systems, is used to perform reliability evaluation of the transmission system with predetermined outages. An advanced genetic algorithm (GA) is utilized to solve the multi-objective optimisation problem. The advantages of the new approach are that 1) it achieves the highest possible reliability at lower cost; 2) it maximizes cost efficiency, which increases the competitive advantage of a transmission company; and 3) the resulting plans incorporate the planner's preferences, which are easy to adjust. The planning approach has been illustrated on the Roy Billinton Test System (RBTS).2006-06-15T00:00:00Z
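The GA-based search over candidate expansion plans can be sketched with a minimal generational GA and a weighted-sum fitness. The candidate-line costs, capacities, demand figure, and objective weighting below are invented toy numbers; they stand in for, but are not, the paper's CRUSE-based reliability evaluation.

```python
import random

def run_ga(fitness, n_bits, pop_size=30, gens=40, p_mut=0.05, seed=1):
    # Minimal generational GA minimizing `fitness` over bit-strings:
    # tournament selection, one-point crossover, bit-flip mutation,
    # and elitist tracking of the best plan seen so far.
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(gens):
        nxt = []
        while len(nxt) < pop_size:
            a = min(rng.sample(pop, 3), key=fitness)   # tournament selection
            b = min(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)             # one-point crossover
            child = [bit ^ (rng.random() < p_mut)      # bit-flip mutation
                     for bit in a[:cut] + b[cut:]]
            nxt.append(child)
        pop = nxt
        best = min(pop + [best], key=fitness)
    return best

LINE_COST = [10, 14, 8, 20]   # hypothetical investment cost per candidate line
LINE_CAP  = [30, 45, 25, 60]  # hypothetical capacity each line adds
DEMAND = 90

def plan_fitness(plan):
    # Weighted-sum surrogate for the multi-objective problem: unserved
    # demand stands in for EENS, added to the investment cost.
    cost = sum(c for c, built in zip(LINE_COST, plan) if built)
    served = sum(c for c, built in zip(LINE_CAP, plan) if built)
    return 5 * max(0, DEMAND - served) + cost
```

Each chromosome is a build/don't-build decision per candidate line; the weight on unserved demand plays the role of the planner's adjustable preference between reliability and cost.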
Lu, M.; Dong, Z. Y.; Saha, T. K. A Framework for Unified Design of Fault Detection and Isolation and Optimal Maintenance Policies
http://espace.library.uq.edu.au/view/UQ:8534
Fault detection and isolation (FDI) and design of optimal maintenance policies have been traditionally studied separately by the control community and domain experts on the one hand and the operations research community on the other. The objective of this paper is to provide a unified approach where maintenance decisions are driven by real-time FDI signals. Such an approach allows systematic analysis and design of FDI with the objective of minimizing the overall costs of operations and maintenance (O&M). Our approach relies on the following steps. First, the information about the assets, their likely failure modes (as generated by Failure Modes and Effects Analysis or from historical service data), service business processes, and costs associated with fixing the assets are captured from designers or practitioners. The Unified Modeling Language (UML) is used as an expressive way to capture and display such information. Next, this information is used to arrive at a representation of the asset degradation and maintenance process as a Markov process. Finally, the asset management problem is formulated as an optimal control over the Markov process. We show how the fundamental properties of FDI drive the O&M costs and the solution to the control problem through their impact on transition probabilities of the Markov process. We illustrate the approach by a numerical example for maintaining proper refrigerant charge levels in Rankine Cycle equipment.2006-03-27T00:00:00Z
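The final step, optimal control over the Markov process, can be sketched with value iteration on a small degradation chain. The three states, cost figures, and the FDI detection probability below are hypothetical numbers for illustration, not the paper's refrigerant-charge example.

```python
def value_iteration(P, cost, gamma=0.95, tol=1e-8):
    # Solve a finite MDP by value iteration. P[a][s] is a list of
    # (next_state, prob) pairs for action a in state s; cost[a][s] is the
    # immediate cost. Returns (values, policy) minimizing expected
    # discounted cost.
    n_actions, n_states = len(cost), len(cost[0])
    V = [0.0] * n_states
    while True:
        Q = [[cost[a][s] + gamma * sum(p * V[t] for t, p in P[a][s])
              for s in range(n_states)] for a in range(n_actions)]
        newV = [min(Q[a][s] for a in range(n_actions)) for s in range(n_states)]
        if max(abs(x - y) for x, y in zip(V, newV)) < tol:
            policy = [min(range(n_actions), key=lambda a: Q[a][s])
                      for s in range(n_states)]
            return newV, policy
        V = newV

RUN, MAINTAIN = 0, 1
DETECT = 0.9  # hypothetical FDI detection probability in the degraded state
# States: 0 = healthy, 1 = degraded, 2 = failed
P = [
    # run: the asset degrades on its own; failure is absorbing
    [[(0, 0.9), (1, 0.1)], [(1, 0.8), (2, 0.2)], [(2, 1.0)]],
    # maintain: repair of a degraded asset succeeds only if FDI flags it
    [[(0, 1.0)], [(0, DETECT), (1, 1 - DETECT)], [(0, 1.0)]],
]
cost = [[0, 1, 10], [5, 5, 20]]  # hypothetical O&M costs per action and state
values, policy = value_iteration(P, cost)
```

With these numbers the optimal policy runs the healthy asset and maintains it once degraded; raising or lowering DETECT changes the transition probabilities and hence the O&M cost of the optimal policy, which is the coupling the paper analyzes.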
Sadegh, Payman; Concha, Julio; Stricevic, Slaven; Thompson, Adrian; Kootsookos, Peter A framework of generation investment analysis in a deregulated electricity market using financial price model
http://espace.library.uq.edu.au/view/UQ:104241
2007-08-23T22:11:52Z
Limbu, T. R.; Saha, T K A framework of Ontology Guided Data Linkage for evidence based knowledge extraction and information sharing
http://espace.library.uq.edu.au/view/UQ:317791
There has been a surge of interest in developing probabilistic techniques for linking semantically equivalent datasets. The key objective is to transform the structure of the induced data into a concise synopsis. Current techniques primarily focus on performing pair-wise attribute matching and pay little attention to discovering direct and weighted correlations among ontological clusters through multi-faceted classification. In this research, we introduce a novel Ontology Guided Data Linkage (OGDL) framework for self-organising and discovering schema structures by constructing hierarchical cluster mapping trees. Furthermore, we extend our OGDL framework by introducing a novel faceted search engine for semantic interoperability of data and subsequent decision support analysis, supporting fast cluster browsing, user-friendly querying, and semantic reasoning.2013-11-28T18:11:14Z
Gollapalli, Mohammed; Li, Xue