Verification Workshop - VERIFY'02

What are the verification problems? What are the deduction techniques?

Copenhagen, Denmark, July 25-26, 2002

In connection with FLoC 2002

Accepted papers


  1. A. W. Appel, N. G. Michael, A. Stump, R. Virga: A Trustworthy Proof Checker, Joint with FCS

    Abstract: Proof-Carrying Code (PCC) and other applications in computer security require machine-checkable proofs of properties of machine-language programs. The main advantage of the PCC approach is that the amount of code that must be explicitly trusted is very small: it consists of the logic in which predicates and proofs are expressed, the safety predicate, and the proof checker. We have built a minimal-TCB checker, and we explain its design principles and the representation issues of the logic, safety predicate, and safety proofs. We show that the trusted code in such a system can indeed be very small. In our current system the TCB is less than 2,700 lines of code (an order of magnitude smaller even than other PCC systems), which adds to our confidence in its correctness.
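
    As an editorial illustration of the "small trusted checker" idea (a toy in Python, not the authors' LF-based checker for machine-language safety proofs), the entire trusted base of the sketch below is the formula representation, two axiom schemata, and the checking loop:

        # A minimal sketch: a proof checker for a Hilbert-style propositional logic
        # with axiom schemata K and S and the modus ponens rule.  The "TCB" is just
        # this file: the formula representation, the schemata, and the checking loop.
        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Imp:            # the formula "a -> b"
            a: object
            b: object

        def is_K(f):
            # K:  a -> (b -> a)
            return isinstance(f, Imp) and isinstance(f.b, Imp) and f.a == f.b.b

        def is_S(f):
            # S:  (a -> (b -> c)) -> ((a -> b) -> (a -> c))
            if not (isinstance(f, Imp) and isinstance(f.a, Imp) and isinstance(f.a.b, Imp)
                    and isinstance(f.b, Imp) and isinstance(f.b.a, Imp) and isinstance(f.b.b, Imp)):
                return False
            a, b, c = f.a.a, f.a.b.a, f.a.b.b
            return f.b.a == Imp(a, b) and f.b.b == Imp(a, c)

        def check(proof, goal):
            """Each step is ('K', f), ('S', f) or ('MP', i, j, f), meaning: from
            step i (some formula x) and step j (which must be x -> f) conclude f."""
            derived = []
            for step in proof:
                if step[0] in ('K', 'S') and (is_K if step[0] == 'K' else is_S)(step[1]):
                    derived.append(step[1])
                elif step[0] == 'MP':
                    _, i, j, f = step
                    if 0 <= i < len(derived) and 0 <= j < len(derived) and derived[j] == Imp(derived[i], f):
                        derived.append(f)
                    else:
                        return False
                else:
                    return False
            return bool(derived) and derived[-1] == goal

        # Example: the classic derivation of  p -> p  from K, S and modus ponens.
        p = 'p'
        s1 = Imp(Imp(p, Imp(Imp(p, p), p)), Imp(Imp(p, Imp(p, p)), Imp(p, p)))  # S instance
        s2 = Imp(p, Imp(Imp(p, p), p))                                          # K instance
        s3 = Imp(Imp(p, Imp(p, p)), Imp(p, p))                                  # from s2 and s1
        s4 = Imp(p, Imp(p, p))                                                  # K instance
        proof = [('S', s1), ('K', s2), ('MP', 1, 0, s3), ('K', s4), ('MP', 3, 2, Imp(p, p))]
        assert check(proof, Imp(p, p))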

  2. A. Armando, M.-P. Bonacina, S. Ranise, M. Rusinowitch, A. K. Sehgal: High-performance deduction for verification: a case study in the theory of arrays

    Abstract: We outline an approach to using ordering-based theorem-proving strategies as satisfiability procedures for certain decidable theories. We report on experiments with synthetic benchmarks in the theory of arrays with extensionality, showing that a theorem prover - the E system - compares favorably with the state-of-the-art validity checker CVC.
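
    For background on the benchmark theory (the axioms below are the standard select/store axiomatisation, stated here editorially rather than quoted from the paper), the theory of arrays with extensionality is usually presented as:

        \forall a, i, e.\; select(store(a, i, e), i) = e
        \forall a, i, j, e.\; i \neq j \rightarrow select(store(a, i, e), j) = select(a, j)
        \forall a, b.\; (\forall i.\; select(a, i) = select(b, i)) \rightarrow a = b

    The idea sketched in the abstract is that saturating these axioms together with a ground negated query, under a suitable term ordering, can serve as a satisfiability procedure for such queries.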

  3. A. Armando, L. Compagna: Automatic SAT-Compilation of Protocol Insecurity via Reduction to Planning, Joint with FCS

    Abstract: We provide a fully automatic translation from security protocol specifications into propositional logic which can be effectively used to find attacks on protocols. Our approach results from the combination of a reduction of protocol insecurity problems to planning problems and well-known SAT-reduction techniques developed for planning. We also propose and discuss a set of transformations on protocol insecurity problems whose application has a dramatic effect on the size of the propositional encoding obtained with our SAT-compilation technique. We describe a model-checker for security protocols based on our ideas and show that attacks on a set of well-known authentication protocols are quickly found by state-of-the-art SAT solvers.
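
    For readers unfamiliar with the planning-to-SAT step, one standard linear encoding (given here for orientation; the paper's encoding and its size-reducing transformations may differ in detail) introduces, for a fixed bound n, a propositional variable f_t for every fluent f and step t <= n and a variable a_t for every action a and step t < n, and asserts:

        a_t \rightarrow \bigwedge_{p \in pre(a)} p_t
        a_t \rightarrow \bigwedge_{q \in add(a)} q_{t+1} \;\wedge\; \bigwedge_{r \in del(a)} \neg r_{t+1}
        (\neg f_t \wedge f_{t+1}) \rightarrow \bigvee_{a \,:\, f \in add(a)} a_t
        I_0 \wedge G_n

    Here I_0 fixes the initial state and G_n asserts the goal - for protocol insecurity, the attack condition - at step n, so any satisfying assignment found by the SAT solver corresponds to an attack of length at most n.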

  4. B. Beckert, U. Keller, P. H. Schmitt: Translating the Object Constraint Language into First-order Predicate Logic

    Abstract: In this paper we define a translation of UML class diagrams with OCL constraints into first-order predicate logic. The goal is logical reasoning about UML models, realized by an interactive theorem prover. We put an emphasis on usability of the formulas resulting from the translation, and we have developed optimisations and heuristics to enhance the efficiency of the theorem proving process. The translation has been implemented as part of the KeY system, but our implementation can also be used stand-alone.
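
    To give a flavour of such a translation (the class and attribute names below are illustrative; the translation in the paper additionally handles OCL's type hierarchy, collections and undefinedness), an invariant such as

        context Account inv: self.balance >= 0

    can be rendered in first-order logic as

        \forall x.\; Account(x) \rightarrow balance(x) \geq 0

    and a collection constraint such as  context Company inv: self.employees->forAll(e | e.age >= 18)  as

        \forall c.\; Company(c) \rightarrow (\forall e.\; in(e, employees(c)) \rightarrow age(e) \geq 18)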

  5. C. Benzmüller, C. Giromini, A. Nonnengart, J. Zimmer: Reasoning Services in MathWeb-SB for symbolic verification of Hybrid Systems

    Abstract: We propose to apply mathematical service systems developed in the context of the Calculemus initiative to support the verification of hybrid systems. For this we want to identify and analyse different kinds of mathematical subtasks occurring in industrial-strength examples of hybrid system verification. These kinds of subtasks, suitable for being tackled by reasoning specialists, can be modeled as mathematical service requests to a network of service systems like the MathWeb Software Bus. For the deductive model checking approach we have identified the following candidates for mathematical service requests: the solution of differential equations, and the subsumption and solution of sets of constraints. A further candidate is the elimination of second-order set variables with higher-order theorem provers.

  6. E. Cohen: Proving Protocols Safe from Guessing, Joint with FCS

    Abstract: We describe how to prove cryptographic protocols secure against a Dolev-Yao attacker that can also engage in idealized offline guessing attacks. Our method is based on constructing a first-order invariant that bounds, in every reachable state, both the information available to an attacker and the steps of guessing attacks starting from this information. We have implemented the method as an extension to the protocol verifier TAPS, making it the first mechanical verifier to prove protocols secure against guessing attacks in an unbounded model.
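
    For orientation, the "information available to an attacker" in this setting is usually the Dolev-Yao closure of the observed messages, generated by the standard derivation rules below (stated editorially; the paper's invariant is built over such a closure, extended to bound offline guessing steps):

        \frac{S \vdash m_1 \quad S \vdash m_2}{S \vdash (m_1, m_2)} \qquad \frac{S \vdash (m_1, m_2)}{S \vdash m_1} \qquad \frac{S \vdash (m_1, m_2)}{S \vdash m_2}
        \frac{S \vdash m \quad S \vdash k}{S \vdash \{m\}_k} \qquad \frac{S \vdash \{m\}_k \quad S \vdash k^{-1}}{S \vdash m}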

  7. A. L. Herzog, J. Guttman: Eager Formal Methods for Security Management

    Abstract: Controlling complexity is a core problem in information security. Achieving a security goal in a networked system requires the cooperation of many devices, such as routers, firewalls, virtual private network gateways, and individual host operating systems. Different devices may require different configurations, depending on their purposes and network locations. Many information security problems may be solved given models of these devices and their interactions. We have focused for several years on these problems, using eager formal methods as our approach. The eager formal methods approach front-loads the contribution of formal methods to problem-solving. The focus is on modeling devices, their behavior as a function of configurations, and the consequences of their interactions. A class of practically important security goals must also be expressible in terms of these models. In eager formal methods, the models suggest algorithms that take as input information about system configuration and return the security goals satisfied in that system. In some cases, we can also derive algorithms to generate configurations that satisfy given security goals. The formal models provide a rigorous justification of soundness. By contrast, the algorithms are implemented as ordinary computer programs requiring no logical expertise to use. Resolving practical problems then requires little time and no formal methods specialists. We have applied this approach to several problems. In this extended abstract, we briefly describe two problems and the modeling frameworks that lead to solutions. The first is the distributed packet filtering problem, in which filtering routers or firewalls are located at various points in a network with complex topology. The problem is to constrain the flow of different types of packets through the network. The second problem concerns configuring gateways for the IP security protocols (IPsec); the problem is to ensure that authentication and confidentiality goals are achieved for specific types of packets traversing particular paths through the network. Solutions to these problems have been published and implemented. We also describe how to unify the two solutions, so that packet filtering goals and IPsec authentication and confidentiality goals are jointly enforced on a network.
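
    The distributed packet filtering problem admits a very compact model. The sketch below (a hypothetical topology and policy in Python, not the authors' published algorithm) treats each filtering device interface as a set of admitted packet types and checks a reachability-style security goal by path exploration:

        # Topology as a directed graph: node -> set of neighbours.
        TOPOLOGY = {
            'internet': {'fw'},
            'fw':       {'dmz', 'lan'},
            'dmz':      {'fw'},
            'lan':      {'fw'},
        }

        # Each edge carries a filter: the set of packet types it lets through.
        FILTERS = {
            ('internet', 'fw'): {'http', 'smtp'},
            ('fw', 'dmz'):      {'http', 'smtp'},
            ('fw', 'lan'):      {'http'},          # no smtp into the internal LAN
            ('dmz', 'fw'):      {'http', 'smtp'},
            ('lan', 'fw'):      {'http', 'smtp'},
        }

        def reachable(src, dst, pkt):
            """Can a packet of type pkt flow from src to dst along some path?"""
            seen, stack = set(), [src]
            while stack:
                node = stack.pop()
                if node == dst:
                    return True
                if node in seen:
                    continue
                seen.add(node)
                for nxt in TOPOLOGY.get(node, ()):
                    if pkt in FILTERS.get((node, nxt), set()):
                        stack.append(nxt)
            return False

        # Security goal: smtp from the internet must never reach the LAN ...
        assert not reachable('internet', 'lan', 'smtp')
        # ... while http is still allowed end-to-end.
        assert reachable('internet', 'lan', 'http')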

  8. D. Kröning: Application Specific Higher Order Logic Theorem Proving

    Abstract: Theorem proving allows the formal verification of the correctness of very large systems. In order to increase the acceptance of theorem proving systems during the design process, we implemented higher-order logic proof systems for ANSI-C and Verilog within a framework for application-specific proof systems. Furthermore, we implement the language of the PVS theorem prover as a well-established higher-order specification language. The tool allows the verification of designs in these languages against a PVS specification, as well as the verification of hardware designs using a C program as the specification. We implement powerful decision procedures using model checkers and satisfiability checkers. We provide experimental results that compare the performance of our tool with PVS on large industrial-scale hardware examples.

  9. C. Meadows: Identifying Potential Type Confusion in Authenticated Messages, Joint with FCS

    Abstract: A type confusion attack is one in which a principal accepts data of one type as data of another. Although it has been shown by Heather et al. that there are simple formatting conventions that will guarantee that protocols are free from simple type confusions in which fields of one type are substituted for fields of another, it is not clear how well they defend against more complex attacks, or against attacks arising from interaction with protocols that are formatted according to different conventions. In this paper we show how type confusion attacks can arise in realistic situations even when the types are explicitly defined in at least some of the messages, using examples from our recent analysis of the Group Domain of Interpretation Protocol. We then develop a formal model of types that can capture potential ambiguity of type notation, and describe a procedure for determining whether or not the types of two messages can be confused.
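
    A deliberately simplified illustration of why such confusions arise (an editorial toy, not the formal model developed in the paper): if fields are not distinguished by explicit tags, two message formats can be confused whenever some choice of field lengths makes their byte strings line up.

        from itertools import product

        def total_lengths(fmt):
            """All achievable total lengths of a format given as a list of
            (field_name, set_of_possible_byte_lengths) pairs."""
            return {sum(choice) for choice in product(*[lengths for _, lengths in fmt])}

        def confusable(fmt_a, fmt_b):
            # Untagged formats can be confused if some length choice coincides.
            return bool(total_lengths(fmt_a) & total_lengths(fmt_b))

        # A 16-byte nonce followed by a 16-byte key has the same length as a single
        # 32-byte nonce, so a parser expecting the latter may accept the former:
        # the seed of a type confusion attack.
        print(confusable([('nonce', {16}), ('key', {16})], [('nonce', {32})]))   # True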

  10. G. Steel, A. Bundy, E. Denney: Finding Counterexamples to Inductive Conjectures and Discovering Security Protocol Attacks, Joint with FCS

    Abstract: We present an implementation of a method for finding counterexamples to universally quantified conjectures in first-order logic. Our method uses the proof by consistency strategy to guide a search for a counterexample and a standard first-order theorem prover to perform a concurrent check for inconsistency. We explain briefly the theory behind the method, describe our implementation, and evaluate results achieved on a variety of incorrect conjectures from various sources. Some work in progress is also presented: we are applying the method to the verification of cryptographic security protocols. In this context, a counterexample to a security property can indicate an attack on the protocol, and our method extracts the trace of messages exchanged in order to effect this attack. This application demonstrates the advantages of the method: quite complex side conditions determine whether a particular sequence of messages is possible, and using a theorem prover provides a natural way of dealing with them. Some early results are presented and we discuss future work.
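
    To make the goal concrete (the toy below merely enumerates small ground instances and therefore illustrates the kind of witness sought, not the proof-by-consistency strategy itself), refuting a false universally quantified conjecture amounts to exhibiting a ground counterexample:

        from itertools import product

        def rev(xs):
            return list(reversed(xs))

        # False conjecture:  forall xs, ys.  rev(xs ++ ys) = rev(xs) ++ rev(ys)
        def conjecture(xs, ys):
            return rev(xs + ys) == rev(xs) + rev(ys)

        def find_counterexample(max_len=2, atoms=(0, 1)):
            for n, m in product(range(max_len + 1), repeat=2):
                for xs in product(atoms, repeat=n):
                    for ys in product(atoms, repeat=m):
                        if not conjecture(list(xs), list(ys)):
                            return list(xs), list(ys)
            return None

        # ([0], [1]): rev([0, 1]) = [1, 0]  but  rev([0]) + rev([1]) = [0, 1]
        print(find_counterexample())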

  11. V. Vanackère: The TRUST protocol analyser: Automatic and efficient verification of cryptographic protocols

    Abstract: The paper presents TRUST, a verifier for cryptographic protocols. In our framework, a protocol is modeled as a finite number of processes interacting with a hostile environment; the security properties expected from the protocol are specified by inserting logical assertions on the environment knowledge in the process. Our analyser relies on an exact symbolic reduction method combined with several techniques aiming to reduce the number of interleavings that have to be considered. We argue that our verifier is able to perform a full analysis on up to 3 parallel (interleaved) sessions of most protocols. Moreover, authentication and secrecy properties are specified in a very natural way, and whenever an error is found, our tool produces an attack against the protocol.


Serge Autexier
Created: Mon Dec 11 10:17:33 MET 2000
Last modified: Fri Jun 14 14:02:06 MEST 2002