Markus Aderhold: Synthesis of Induction Axioms for Procedures with Second-Order Recursion
Many algorithms on data structures such as terms (finitely branching trees) are naturally implemented by second-order recursion: A first-order procedure 'f' passes itself as an argument to a second-order procedure like 'map', 'forall', 'foldl', 'foldr', etc. to recursively apply 'f' to the direct subterms of a term. The synthesis of suitable induction axioms for such algorithms is essential to automate the verification of properties of these algorithms. We describe how our implementation synthesizes induction axioms from terminating procedures that are defined by second-order recursion. We show how information obtained from termination analysis can be used to optimize (i.e., generalize) the resulting induction axioms.
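As a purely illustrative sketch of the recursion scheme described above (all names here are hypothetical, not taken from the talk), second-order recursion can be pictured as a first-order procedure that passes itself to a second-order procedure such as 'map' in order to reach the direct subterms of a term:

```python
# Illustrative sketch only: the first-order procedure `mirror` passes itself
# to the second-order procedure `map_terms` to apply itself recursively to
# the direct subterms of a finitely branching term.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Term:
    op: str
    args: List["Term"]

def map_terms(f: Callable[[Term], Term], ts: List[Term]) -> List[Term]:
    # second-order procedure: takes the procedure `f` as an argument
    return [f(t) for t in ts]

def mirror(t: Term) -> Term:
    # second-order recursion: `mirror` hands itself to `map_terms`
    return Term(t.op, list(reversed(map_terms(mirror, t.args))))
```

A suitable induction axiom for a procedure like `mirror` would, in its step case, allow assuming the claim for every direct subterm in `t.args`.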
David Aspinall, Christoph Lüth: Towards a generic hierarchical proof theory.
Previous work introduced HiTac, a generic tactic programming language based on Denney's hi-proofs. This talk explores some extensions of HiTac: namely, a notion of assumption (contexts), and how to treat meta-variables.
Serge Autexier: DocTip - System Architecture
Starting with the general goals of the DOCTIP system, we use a specific classical formal specification example to illustrate an instance of the DOCTIP system that mediates between user interfaces, the Hets system, the theorem provers and a generic management-of-change system. We present a walk-through of an example to show how the information flow through the different components is organised during proof development, as well as how changes are propagated. We conclude with the general architecture of DOCTIP and the main features it will provide once implemented.
Alan Bundy: The Architecture of the GALILEO Project
We outline the proposed architecture for the GALILEO Project. This has three main parts:
1. Ontology Repair Plans: These are encoded in some polymorphic, higher-order programming language, such as LambdaProlog or ML, but we will continue to explore other options. Each repair is triggered by the derivation of its trigger formula. It then manipulates some old ontologies and creates some new ones. It communicates via an API with a development graph representation of physics knowledge.
2. Development Graph: We have a choice to use HETS/DocTip or locales to encode a large number of inter-related ontologies using the nodes and morphisms of a development graph. In the first case the encoding must be in CASL or HasCASL, in the second case in Isabelle/HOL's formalism.
3. Higher-Order Theorem Prover: A higher-order theorem prover will be used to derive formulae matching the trigger formulae of the ontology repair plans. HETS and DocTip have a variety of attached provers. Locales come with the Isabelle prover.
Michael Chan: Enabling the Implementation of GALILEO
In this talk, I will discuss the issues arising from the proposed architecture and the implementation of GALILEO. These include: the translation of the logical formalisation of the ontologies into LambdaProlog and CASL; the use of Huet's zippers to switch between shallow and deep embeddings of the logic; and the manipulation of development graphs as a way to demonstrate the effect of repairs.
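As a rough, hypothetical illustration of the zipper idea mentioned above (this toy binary-tree version is my own sketch, not GALILEO's representation of logical terms): a zipper keeps a focused subtree together with a path recording everything left behind, so local edits and navigation need no mutation of the whole structure.

```python
# Minimal sketch of Huet's zipper on binary trees (illustrative only).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    value: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None

# A location is (focus, path); each path step records the part we left behind.
def go_left(loc):
    node, path = loc
    return (node.left, ("L", node.value, node.right, path))

def go_up(loc):
    node, path = loc
    side, value, sibling, rest = path
    if side == "L":
        return (Node(value, node, sibling), rest)
    return (Node(value, sibling, node), rest)

def replace(loc, new_node):
    _, path = loc
    return (new_node, path)
```

For example, descending to the left child, replacing it, and going back up rebuilds the tree with only that subtree changed.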
Lassaad Cheikhrouhou and Werner Stephan: Inductive Verification of Password Protocols
A major issue of password protocols is their resistance against offline dictionary attacks. In our talk we present an inductive verification approach for the absence of this kind of attack. It is based on an algebraic Dolev-Yao model that has so far been used for debugging in tools like AVISPA. The abilities of an intruder are given by equalities of a message algebra. Our inductive method, including proofs of resistance against offline dictionary attacks, is based on the initial model induced by these equations.
Dominik Dietrich: Specification and Compilation of Deep Inference Rules
Deep inference deductive systems for classical logic provide exponentially shorter proofs than sequent calculus systems, while increasing the number of possible matchings and thus of applicable inference rules at each step. In practical applications one thus always has to consider the trade-off between shorter proofs and higher nondeterminism. In my talk I will present a specification language for annotated deep inference rules and show how the specified inferences can be compiled to native Lisp code.
Lucas Dixon: From logic to DAGs with an eye to changing notation
I will talk about how directed acyclic graphs can be used to provide a notion of logic and to accommodate changes to notation. In particular, I'll consider supporting changes to data types while maintaining a consistent theory. Other non-monotonic operations might also be touched upon.
Lilia Georgieva: First-order logic for program validation
We propose an approach for validating that pointer programs do not corrupt their datastructures. The approach is applicable to imperative programs with destructive update of pointer fields and uses first-order axiomatisations of shape types. We show how to reason effectively about pointer programs using first-order theorem provers and compare their performance.
Mark Guhe: Local contexts in human and machine intelligence
The main difference between human and machine intelligence is that machines are good at small-scale, well-defined problems, while the strength of humans is to behave intelligently in a complex, information-rich environment. My hypothesis is that the difference between the two is not the fundamental ability of logical reasoning (humans are much better at this than they are usually given credit for), but the human ability to establish a local context for their reasoning – an ability that machines lack. By using local contexts, the (usually) large amount of knowledge that an intelligent system has is reduced so that it becomes possible to obtain a result in `real' time. Computational cognitive models can show how such local contexts are established (e.g., associative memories, rational analysis), and Information Flow theory can provide a principled approach to connecting the different local contexts (local logics).
Dieter Hutter: DocTip - Management of Change
In this talk we introduce the general methodology and techniques for the management of change in the future DocTip system. DocTip will support the development process of critical software and allows one to intertwine the development of specifications with proof (or consistency analysis) activities. The change management, which will support a variety of different document types, is split into three phases. The first phase is concerned with the detection of semantic differences, based on existing XDiff techniques. The second phase analyses which results of previous analyses remain invariant under the performed changes, while the last phase deals with the analysis and replay of those previous activities that are potentially affected by the changes.
Anne Kayem: Data Outsourcing in Service Oriented Architectures: Mitigating Security Challenges.
The paradigm of data outsourcing has become an important and cost-effective way of handling data management in service oriented architectures. Basically, the concept of outsourced data management allows organisations to transfer data management to third party organisations. Since these data transfers raise issues pertaining to privacy and security, effective security architectures that handle these scenarios are needed to mitigate the dangers of information exposure. We present a model for establishing access control and discuss some of the challenges involved in ensuring information flow control in the outsourced data scenario.
Manfred Kerber: Normalizations for Testing Heuristics in Propositional Logic
Typically it is considered a strength of a language that the same situation can be described in different ways. However, when a human or a program is to check whether two representations are essentially the same, it is much easier to deal with normal forms. For instance, (infinitely many) different sets of formulae may normalize to the same clause set. In the case of propositional logic formulae with a fixed number of boolean variables, the class of all clause sets is finite. However, since the number of clause sets grows doubly exponentially, it is not feasible to construct the complete class even for only four boolean variables. Hence further normalizations are necessary when the full class is to be studied. Such normalizations make it possible to systematically test heuristics on all problems for small numbers of propositional variables, and to answer the question whether or not a free lunch theorem for heuristics holds on the whole class of problems.
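The growth rate mentioned above can be made concrete with a small sketch (my own illustration, not from the talk): in a clause over n boolean variables, each variable occurs positively, negatively, or not at all, giving 3^n possible clauses and hence 2^(3^n) clause sets.

```python
# Illustration of the doubly exponential growth of the class of clause sets.
from itertools import product

def all_clauses(n: int):
    # encode a clause as a tuple: +1 = positive literal, -1 = negative, 0 = absent
    return list(product((+1, -1, 0), repeat=n))

for n in range(1, 5):
    num_clauses = len(all_clauses(n))   # 3**n possible clauses
    num_clause_sets = 2 ** num_clauses  # every subset of clauses is a clause set
    print(n, num_clauses, num_clause_sets)
```

Already for n = 4 this yields 2^81 (more than 10^24) clause sets, which is why the complete class is out of reach without further normalization.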
Christopher Krauss: Property preserving refinement in hybrid automata
Embedded systems are ubiquitous (cars, airplanes, mobile devices, ...). They are often large and complex, as they are combinations of many distributed control units. Formal verification helps in detecting errors and makes it possible to provide correctness guarantees early in the development process. Formal verification of embedded systems is often based on hybrid automata, which can describe discrete state changes as well as the continuous behavior of a system. Following a top-down approach, system behavior is typically described at a very abstract level. In my talk I present a refinement technique for hybrid automata that allows one to transfer behavioral properties to more concrete models and thus to concrete implementations on concrete hardware. My talk will start with a short overview of hybrid automata. I will continue with an introduction to property-preserving refinement on hybrid automata and talk about different refinement techniques. I will then concentrate on refinement by rules, that is, providing a set of pre-proven transformation rules that, if applied exclusively, result in property-preserving refinements. I conclude my talk with a brief introduction to our new research project IsReal and the integration of hybrid automata in this project.
Christoph Lange: Integrating Web Services into Active Mathematical Documents
Jos Lehmann: A Case Study of Ontology Evolution in Atomic Physics as the Basis of the Open Structure Ontology Repair Plan
In the GALILEO project a number of Ontology Repair Plans are being developed and implemented in higher-order logic. These plans resolve a contradiction between two or more ontologies that represent the domain of physics. In this talk, the transition from Thomson's plum pudding model of the atom to Rutherford's planetary model is used as the main inspiration of a new ontology repair plan called Open Structure.
Roy McCasland: Theorem-Producing Rules -- One Year On
This talk will describe the progress made in the past year regarding theorem-producing rules, which were introduced during CIAO-08. The talk will highlight some of the advantages and disadvantages of using these rules, and will present some of the results produced by the rules.
Till Mossakowski: The heterogeneous tool set
Heterogeneous specification is becoming more and more important, because complex systems are often specified using multiple viewpoints, involving multiple formalisms. Moreover, a formal software development process may lead to a change of formalism during the development. However, current research in integrated formal methods only deals with ad-hoc integrations of different formalisms. The heterogeneous tool set (Hets) is a parsing, static analysis and proof management tool that combines various such tools for individual specification languages, thus providing a tool for heterogeneous multi-logic specification. Hets is based on a graph of logics and languages (formalized as so-called institutions), their tools, and their translations. This provides a clean semantics of heterogeneous specifications, as well as a corresponding proof calculus. For proof management, the calculus of development graphs (known from other large-scale proof management systems) has been adapted to heterogeneous specification. Development graphs provide an overview of the (heterogeneous) specification module hierarchy and the current proof state, and thus may be used for monitoring the overall correctness of a heterogeneous development. Heterogeneous proof scripts allow for replaying heterogeneous developments.
Alison Pease: Metaphor and Analogy in Mathematics
I shall outline various characteristics of metaphors and analogy and discuss some examples from mathematics.
Florian Rabe: A Module System for a Logical Framework
Module systems for proof assistants provide administrative support for large developments when mechanizing the meta-theory of programming languages and logics. We describe a module system for the logical framework LF. It is based on two main primitives, signatures and signature morphisms, which provide a semantically transparent module level and permit representing logic translations as homomorphisms. Modular LF is a conservative extension of LF, and we define an elaboration of modular signatures into core LF signatures. We have implemented our design in the Twelf system and used it to modularize large parts of the Twelf example library.
Scheme-based theorem discovery and concept invention
Mathematical theories usually have infinitely many theorems (and concepts), and an Automatic Theorem Discovery (ATD) system is expected to address the generation and identification of the interesting ones. This research project outlines an automated reasoning system that helps a user to navigate through a theory and discover theorems that mathematicians generally find interesting. The project focuses on the processes of inventing mathematical concepts, and of inventing (and verifying) conjectures about concepts, by using a scheme-based approach.
Marvin Schiller: Putting Adaptive Modelling of Proof Step Size for Mathematics Tutoring to the Empirical Test
In tutorial dialogs on mathematical proofs, intermediate proof steps are often skipped when this seems appropriate. The adequate step size depends on the tutoring context and the knowledge of the student. Based on the assertion-level inference mechanism in the mathematical assistant system Omega, we classify whether a (multi-inference) proof step is of appropriate size within a proof attempt. These granularity classifiers are learnt from proof samples annotated with the help of expert tutors. Besides the assessment of step size in a student's proof attempt, our modeling technique also allows us to generate granularity-adapted proof presentations from formal proofs in Omega. In this talk, we present our automated framework for collecting annotated proof samples from expert tutors, for generating classifiers using standard machine-learning tools, and for evaluating the performance of these classifiers.
Lutz Schröder: TBox Reasoning in Coalgebraic Hybrid Logic
It has been recognised that the expressivity of ontology languages benefits from the introduction of non-standard modal operators beyond the usual existential restrictions and the number restrictions already featured by many description logics. Such operators serve to support notions such as uncertainty, defaults, agency, obligation, or evidence, which are hard to capture using only the standard operators, and whose semantics often goes beyond relational structures. We work in a unified theory for logics that combine non-standard modal operators and nominals, a feature of established description logics that provides the necessary means for reasoning about individuals; in particular, the logics of this framework allow for internalisation of ABoxes. We reinforce the general framework by proving decidability in EXPTIME of concept satisfiability over general TBoxes; moreover, we discuss example instantiations in various probabilistic logics with nominals.
Alan Smaill: A variant of signature morphisms.
The literature on institutions usually assumes that signature morphisms are arity-preserving maps on the relevant grammatical categories of atomic syntax. However it is possible to allow maps that drop arguments or permute arguments to functions or predicates. This is useful in certain circumstances, and also gives a way to characterise some forms of symmetry in logics.
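A hypothetical sketch of such a non-arity-preserving morphism (my own illustration, using an ad-hoc term encoding rather than anything from the talk): each function symbol is mapped to a target symbol together with the list of source-argument positions that are kept, so that arguments can be dropped or permuted.

```python
# Terms are encoded as (symbol, [subterms]); a morphism maps each symbol to
# (target symbol, positions of the source arguments to keep, in order).

def translate(term, morphism):
    op, args = term
    new_op, positions = morphism[op]
    return (new_op, [translate(args[i], morphism) for i in positions])

# f(x, y) |-> g(y, x) permutes arguments; h(x, y) |-> k(x) drops one.
morphism = {
    "f": ("g", [1, 0]),
    "h": ("k", [0]),
    "a": ("a", []),
    "b": ("b", []),
}
```

Under this morphism, translate maps the term f(a, b) to g(b, a) and h(a, b) to k(a); an arity-preserving morphism is the special case where every position list is the identity permutation.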
Frank Theiss: Higher-Order and First-Order TPTP in Practice
I will report on experience with the TPTP language(s) and infrastructure gained in recent projects. The focus of the talk will be on features of the TPTP language and their practical aspects. The applications we will present include the cooperation between different systems and logics, as realized in the higher-order automated theorem prover Leo-II, and the processing of proof objects, as in the SigmaKEE ontology environment.