The British Logic Colloquium, BLC 2016, will be held at the University of Edinburgh on 7th and 8th September, preceded by the BLC PhD day on 6th September, and followed by a symposium in celebration of Gordon Plotkin's 70th birthday, GDP70, on 9th September.
Maurizio Lenzerini  Metaquerying in Ontology-based Data Access

Ontology-based data access is a new paradigm for managing data through the lens of a conceptualization of the domain of interest, called an ontology. This new paradigm provides several interesting features, many of which have already proved effective in managing complex information systems. On the other hand, several important issues remain open and constitute stimulating challenges for the research community. One of them is metaquerying, i.e., answering queries in which variables can be used to denote classes and relations in the ontology. In this talk we first provide an introduction to ontology-based data access, illustrating the main ideas and techniques for using an ontology to access the data layer of an information system, and then discuss recent results on metaquerying.
Orna Kupferman  From Reachability to Temporal Specifications in Game Theory 
Multi-agent games are extensively used for modelling settings in which different entities share resources. For example, the setting in which entities need to route messages in a network is modeled by network-formation games: the network is modeled by a graph, and each agent has to select a path satisfying his reachability objective. In practice, the objectives of the entities are often more involved than reachability. The need to specify and reason about rich specifications has been extensively studied in the context of verification and synthesis of reactive systems. The talk describes a lifting of ideas from formal methods to multi-agent games. For example, in network-formation games with regular objectives, the edges of the graph are labeled by alphabet letters and the objective of each player is a regular language over the alphabet of labels. Thus, beyond reachability, a player may restrict attention to paths that satisfy certain properties, referring, for example, to the providers of the traversed edges, the actions associated with them, their quality of service, security, etc. The talk assumes no previous knowledge of game theory. We will introduce the basic concepts and problems, describe how their extension to games with more expressive specifications of objectives poses new challenges, and study the resulting games. Joint work with Guy Avni and Tami Tamir.
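As a small illustration (not taken from the talk; the network, nodes, and labels below are invented), a player's regular objective over an edge-labelled graph can be checked by collecting the label words of simple paths and matching them against a regular expression; plain reachability is the special case of the objective ".*":

```python
import re

# Hypothetical edge-labelled network: node -> list of (label, successor).
# A label might encode the provider or quality of service of an edge.
NETWORK = {
    "s": [("a", "u"), ("b", "v")],
    "u": [("a", "t")],
    "v": [("b", "t"), ("a", "u")],
    "t": [],
}

def label_words(graph, src, dst, visited=None):
    """Yield the label word of every simple path from src to dst."""
    visited = visited or {src}
    if src == dst:
        yield ""
        return
    for letter, nxt in graph.get(src, []):
        if nxt not in visited:
            for rest in label_words(graph, nxt, dst, visited | {nxt}):
                yield letter + rest

words = set(label_words(NETWORK, "s", "t"))
print(sorted(words))                              # ['aa', 'baa', 'bb']
# A richer regular objective: use only provider-"a" edges.
print(any(re.fullmatch("a*", w) for w in words))  # True: path "aa" qualifies
```

Here the player whose objective is the language a* must route via u, while a player content with reachability may pick any of the three paths.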

Sylvy Anscombe  Measure and dimension in model theory 
The notion of a ‘measurable structure’ was introduced by Macpherson and Steinhorn, motivated by the work on pseudofinite fields by Chatzidakis, van den Dries, and Macintyre. In ongoing joint work between the speaker, Macpherson, Steinhorn, and Wolf we propose the broader framework of ‘generalised measurable structures’. Both the new definition and the original aim to assign a ‘measure’ and a ‘dimension’ to definable sets, nearly uniformly in parameters. Originally, measures were positive real numbers and dimensions were natural numbers; but in the newer framework we combine measure and dimension into one element of a more general algebraic object, which we call a measuring semiring. Consequently, there is a much richer zoo of examples and we have more exotic model theory. In this talk I will introduce these notions and discuss a range of key examples, including pseudofinite fields, the random graph, vector spaces with bilinear forms, and the generic triangle-free graph.
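For context (a standard statement, not part of the abstract): the pseudofinite-field example rests on the counting estimate of Chatzidakis, van den Dries, and Macintyre, which is what attaches a dimension d and a measure μ to each definable set:

```latex
% Chatzidakis--van den Dries--Macintyre: for each formula \varphi(x,\bar y)
% there are finitely many pairs (d,\mu), with d a natural number and \mu a
% positive rational, and a constant C, such that for every finite field
% \mathbb{F}_q and every parameter tuple \bar a, the set defined by
% \varphi(x,\bar a) is empty or satisfies, for one of these pairs,
\bigl|\, |\varphi(\mathbb{F}_q,\bar a)| \;-\; \mu\, q^{d} \,\bigr|
  \;\le\; C\, q^{\,d - \frac{1}{2}}.
```
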
Diego Figueira  Semantically Tractable Conjunctive Queries
Conjunctive Queries (CQs) form one of the most basic and most studied query languages in database theory, corresponding to the primitive positive fragment of first-order logic. The evaluation problem for CQs is known to be NP-complete in combined complexity and W[1]-hard in parameterized complexity. However, some syntactic fragments, such as acyclic CQs and CQs of bounded treewidth, are known to be tractable: CQs from these fragments can be evaluated in polynomial time. A natural optimisation problem is whether a CQ can be rewritten into an equivalent CQ from a tractable fragment. In this talk I will report on some of the recent advances in this area for classes of acyclic and bounded-treewidth CQs, also in the presence of data constraints (e.g., functional dependencies, tgds, etc.).
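As a hedged illustration of why acyclicity helps (the relations and join tree below are invented for the example), a Boolean acyclic CQ can be decided by a single bottom-up pass of semijoins over a join tree, in the style of Yannakakis's algorithm, giving polynomial-time evaluation:

```python
# Relations as lists of tuples, each tuple a dict from attribute to value.
R = [{"x": 1, "y": 2}, {"x": 2, "y": 3}]
S = [{"y": 2, "z": 5}, {"y": 9, "z": 7}]
T = [{"z": 5}]

def semijoin(rel, other):
    """Keep the tuples of rel that join with some tuple of other."""
    shared = set(rel[0]) & set(other[0]) if rel and other else set()
    return [t for t in rel
            if any(all(t[a] == u[a] for a in shared) for u in other)]

# Join tree R - S - T (acyclic). One bottom-up semijoin pass decides the
# Boolean query  exists x,y,z. R(x,y) & S(y,z) & T(z)  without ever
# materializing a full join.
S1 = semijoin(S, T)    # S tuples compatible with T
R1 = semijoin(R, S1)   # R tuples compatible with the reduced S
print(bool(R1))        # True: the query has an answer
```

For a cyclic CQ no such join tree exists, which is the gap the rewriting question in the abstract aims to bridge.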

Alexandra Silva  Cantor meets Scott: Domain-Theoretic Foundations for Probabilistic Network Programming
ProbNetKAT is a probabilistic extension of NetKAT with a denotational semantics based on Markov kernels. The language is expressive enough to generate continuous distributions, which raises the question of how to effectively compute in the language. In this talk, we will present an alternative characterization of ProbNetKAT's semantics using domain theory, which provides the foundations needed to build a practical implementation. The new semantics demonstrates that it is possible to analyze ProbNetKAT programs precisely using approximations of fixpoints and distributions with finite support. We develop an implementation and show how to solve a variety of practical problems, including characterizing the expected performance of traffic engineering schemes based on randomized routing and reasoning probabilistically about properties such as loop freedom.
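The role of finite-support approximations can be sketched in miniature (the toy network and kernel below are invented and are not ProbNetKAT syntax): iterating a Markov kernel from a finite-support distribution gives a monotone sequence of approximations to the delivery probability of a randomized routing scheme:

```python
# Finite-support distributions as dicts node -> probability.
# A toy random-routing kernel: at each hop, forward uniformly to one of
# the listed next hops; "t" is the absorbing destination.
HOPS = {"s": ["u", "v"], "u": ["t", "v"], "v": ["t"], "t": ["t"]}

def step(dist):
    """Push a finite-support distribution through one hop of routing."""
    out = {}
    for node, p in dist.items():
        for n in HOPS[node]:
            out[n] = out.get(n, 0.0) + p / len(HOPS[node])
    return out

# Approximate the fixpoint: mass on "t" after k hops is a monotone
# sequence converging to the delivery probability.
dist = {"s": 1.0}
for _ in range(20):
    dist = step(dist)
print(round(dist.get("t", 0.0), 6))   # 1.0: every packet is delivered
```

In this acyclic example the approximation is exact after three hops; in general one stops once the remaining mass bounds the error.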

Alex Wilkie  Diophantine properties of definable sets: applications of model theory to number theory 
Marcelo Fiore  An Algebraic Combinatorial Approach to the Abstract Syntax of Opetopic Structures 

The starting point of the talk will be the identification of structure common to tree-like combinatorial objects, exemplifying the situation with abstract syntax trees (as used in formal languages) and with opetopes (as used in higher-dimensional algebra). The emerging mathematical structure will then be formalized in a categorical setting, unifying the algebraic aspects of the theory of abstract syntax of [2,3] and the theory of opetopes of [5]. This realization conceptually allows one to transport viewpoints between these, now bridged, mathematical theories, and I will explore it here in the direction of higher-dimensional algebra, giving an algebraic combinatorial framework for a generalisation of the slice construction of [1] for generating opetopes. The technical work will involve setting up a microcosm principle for near-semirings and subsequently exploiting it in the cartesian closed bicategory of generalised species of structures [4]. Connections to (cartesian and symmetric monoidal) equational theories, lambda calculus, and algebraic combinatorics will be mentioned in passing.


Antonin Delpeuch & Alex Simpson  Relating point-free and algorithmic randomness
We relate two contrasting approaches to randomness. Algorithmic randomness classifies infinite sequences as random if they pass a collection of randomness tests specified in computability-theoretic terms. An alternative point-free approach considers instead a generalised space (a locale) of random sequences, which is non-trivial even though, in this approach, individual random sequences do not exist (the locale has no points). To connect the two, we give a constructive presentation of the point-free approach, and interpret it in Hyland's effective topos Eff. In Eff, generalised spaces of random sequences characterise, via their classical points, notions of algorithmically random sequence.

Johannes Marti  Duality for Non-monotonic Consequence Relations and Antimatroids
Jules Hedges & Mehrnoosh Sadrzadeh  Quantification and Scope in Categorical Compositional Distributional Semantics 
Categorical compositional distributional semantics is a model of natural language which combines the statistical vector space models of words with the compositional models of grammar. We formalise in this model the generalised quantifier theory of natural language, due to Barwise and Cooper. The underlying setting is a compact closed category with bialgebras. We start from a generative grammar formalisation and develop an abstract categorical compositional semantics for it, then instantiate the abstract setting to sets and relations and to finite-dimensional vector spaces and linear maps. We prove the equivalence of the relational instantiation to the truth-theoretic semantics of generalised quantifiers. The vector space instantiation formalises the statistical usage of words and enables us, for the first time, to reason compositionally about quantified phrases and sentences, and about quantifier scope, in distributional semantics.
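The truth-theoretic side can be illustrated with the textbook Barwise–Cooper clauses for "every" and "some" over sets (a standard sketch, not the paper's categorical construction; the example sets are invented):

```python
# Truth-theoretic semantics of generalised quantifiers over sets, in the
# style of Barwise and Cooper: "every A B" holds iff A is a subset of B,
# and "some A B" holds iff A and B overlap.
def every(A, B):
    return A <= B            # all A are B

def some(A, B):
    return bool(A & B)       # some A is B

dogs = {"fido", "rex"}
barkers = {"fido", "rex", "mia"}
print(every(dogs, barkers))   # True: all dogs bark
print(some(dogs, {"rex"}))    # True: some dog is Rex
print(every(dogs, {"rex"}))   # False: Fido is a counterexample
```

The paper's relational instantiation recovers exactly this semantics, while the vector space instantiation replaces the sets by distributional vectors.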
Leslie Lamport  It Isn't So Hard to Prophesize 

Even about the future. 

Philip Scott  Some Recent Directions in MV and Effect Algebras 
MV-algebras are algebraic structures that arose from Łukasiewicz many-valued logics in the 1950s. They form a rich branch of mathematics, from the work of D. Mundici, with, amongst others, deep categorical connections to AF C*-algebras and their classification (K-theory). The related area of Effect Algebras arose independently in the 1990s, in quantum measurement and quantum probability theory. Both subjects have come under increasing investigation by categorists using duality theory and monadic methods. In a different direction, M. Lawson and P. Scott recently developed a "Coordinatization Program" for MV-algebras, inspired by von Neumann's Continuous Geometry and non-commutative Stone Duality in inverse semigroup theory. We characterized countable MV-algebras as lattices of ideals of certain Boolean Inverse Semigroups (AF Inverse Monoids). We survey some of the above recent results arising from concrete AF C*-algebras, as well as some recent results in Effect Algebras of interest in mixed probabilistic/nondeterministic algebraic theories.

Glynn Winskel  Strategies with Parallel Causes 
In a distributed game we imagine a team Player engaging a team Opponent in a distributed fashion. Such games and their strategies have been formalised within event structures. However there are limitations in founding strategies on traditional event structures. Sometimes a probabilistic distributed strategy relies on certain benign races where, intuitively, several members of team Player may race each other to make a common move. Although there are event structures which support such parallel causes, in which an event is enabled in several compatible ways, they do not support an operation of hiding central to the composition of strategies; nor do they support probability adequately. An extension of traditional event structures is devised which supports parallel causes and hiding, as well as the mix of probability and nondeterminism needed to account for probabilistic distributed strategies. The extension is tested in the construction of a bicategory of probabilistic distributed strategies with parallel causes. [This is based on joint work with Marc de Visme, ENS Paris] 
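For background (a toy example, invented here): in a traditional prime event structure each event has a unique causal history, so the configurations are exactly the conflict-free, causally closed sets of events. This is precisely the setting the talk extends, so that an event may instead be enabled in several compatible ways:

```python
from itertools import chain, combinations

# A toy prime event structure: causal predecessors and binary conflict.
CAUSES = {"a": set(), "b": {"a"}, "c": set()}   # b causally requires a
CONFLICT = [{"b", "c"}]                         # b and c are in conflict

def configurations():
    """All causally closed, conflict-free sets of events."""
    events = list(CAUSES)
    subsets = chain.from_iterable(
        combinations(events, r) for r in range(len(events) + 1))
    return {
        frozenset(s)
        for s in map(set, subsets)
        if all(CAUSES[e] <= s for e in s)        # causally closed
        and not any(f <= s for f in CONFLICT)    # conflict-free
    }

print(len(configurations()))                     # 5 configurations
print(frozenset({"a", "b"}) in configurations()) # True
print(frozenset({"b"}) in configurations())      # False: b needs a first
```

In a structure with parallel causes, b could be enabled by either of two compatible causes, a situation this prime format cannot express without duplicating b.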

Peter D. Mosses  Modular SOS for Control Operators 
This talk presents a modular technique for specifying call/cc and delimited control operators in small-step SOS.
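To fix intuitions about the operator being specified (this is a common one-shot, escape-only encoding of call/cc via exceptions, not the talk's SOS rules, and the names here are invented):

```python
# One-shot, escape-only call/cc: invoking the captured continuation k
# aborts the rest of the body and returns k's argument as the result
# of the whole callcc expression.
class _Escape(Exception):
    def __init__(self, tag, value):
        self.tag, self.value = tag, value

def callcc(body):
    tag = object()                    # unique tag for this capture
    def k(value):
        raise _Escape(tag, value)     # jump back to the callcc site
    try:
        return body(k)
    except _Escape as e:
        if e.tag is tag:
            return e.value
        raise                         # escape belonging to an outer callcc

print(1 + callcc(lambda k: 10))           # 11: continuation unused
print(1 + callcc(lambda k: k(41) + 99))   # 42: k(41) skips the "+ 99"
```

Full call/cc also allows the continuation to be resumed more than once, which is where structural small-step specifications such as the talk's earn their keep.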

Paul Blain Levy  Trace semantics of well-founded processes via commutativity
Gordon Plotkin showed that the set of well-founded processes that perform I/O and finite nondeterminism, taken modulo trace equivalence, is initial among semilattice algebras whose algebra operations commute with binary join. We see that similar phenomena arise for finite probabilistic choice and for unbounded nondeterminism. Joint work with Nathan Bowler (Hamburg).
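A minimal sketch of the objects involved (invented example, actions only, no I/O): well-founded processes as finite trees, with trace equivalence identifying trees that differ only in where nondeterministic choice occurs relative to action prefixing:

```python
# Well-founded processes as finite trees: a process is a list of
# (action, continuation) branches; the empty list is termination.
def traces(proc):
    """The set of completed action traces of a well-founded process."""
    if not proc:
        return {()}
    return {(a,) + t for a, cont in proc for t in traces(cont)}

# a.(b + c)  versus  a.b + a.c : distinct trees with equal trace sets,
# i.e. action prefixing commutes with choice modulo traces.
p = [("a", [("b", []), ("c", [])])]
q = [("a", [("b", [])]), ("a", [("c", [])])]
print(traces(p) == traces(q))   # True
```

The commutation visible here is the finite shadow of the algebraic condition in the abstract: operations commuting with binary join.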

Philippa Gardner  Reasoning about Concurrent Programs 
Consider the increment program.

Eugenio Moggi  Models, Overapproximations and Robustness 
Hybrid systems, and related formalisms, have been successfully used to model Cyber-Physical Systems. However, mathematical models are always a simplification of the system they are meant to describe, and one must be aware of this mismatch when using these models for system analyses. In safety analysis it is acceptable to use overapproximations of the system behavior; indeed, they are the bread and butter of counterexample-guided abstraction refinement (CEGAR). We propose a notion of system behavior robust with respect to arbitrarily small overapproximations, and argue that it is particularly appropriate for safety analysis.
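The idea can be sketched in one dimension (a toy system invented for illustration, not the talk's formal notion): check safety of an interval overapproximation of the reachable states, and call the verdict robust if it survives an eps-inflation of the overapproximation:

```python
# Interval over-approximation of the discrete system x' = 0.5*x + 1,
# started from [0, 1]; "safe" means the reachable states stay in [0, 3].
def step(lo, hi):
    # The map is monotone, so mapping the endpoints over-approximates
    # the image of the whole interval.
    return 0.5 * lo + 1.0, 0.5 * hi + 1.0

def safe(lo, hi, steps, eps=0.0):
    """Safety of the eps-inflated over-approximation of the run."""
    for _ in range(steps):
        lo, hi = step(lo - eps, hi + eps)   # inflate, then step
        if not (0.0 <= lo and hi <= 3.0):
            return False
    return True

print(safe(0.0, 1.0, 50))             # True: the exact model is safe
print(safe(0.0, 1.0, 50, eps=0.1))    # True: the verdict is robust
print(safe(0.0, 1.0, 50, eps=3.0))    # False: a gross over-approximation fails
```

A system that is safe only for eps = 0 would be exactly the non-robust case the abstract argues against relying on.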

Robert Harper  Computational Higher Type Theory 
Formal constructive type theory has proved to be an effective language for mechanized proof. By avoiding non-constructive principles, such as the law of the excluded middle, type theory admits sharper proofs and broader interpretations of results. From a computer science perspective, interest in type theory arises from its applications to programming languages. Standard constructive type theories used in mechanization admit computational interpretations based on metamathematical normalization theorems. These proofs are notoriously brittle; any change to the theory potentially invalidates its computational meaning. As a case in point, Voevodsky's univalence axiom raises questions about the computational meaning of proofs. We consider the question: can higher-dimensional type theory be construed as a programming language? We answer this question affirmatively by providing a direct, deterministic operational interpretation for a representative higher-dimensional dependent type theory with higher inductive types and an instance of univalence. Rather than being a formal type theory defined by rules, it is instead a computational type theory in the sense of Martin-Löf's meaning explanations and of the NuPRL semantics. The definition of the type theory starts with programs, and defines types as specifications of program behavior. The main result is a canonicity theorem, the first of its kind, stating that closed programs of boolean type evaluate to true or false. Joint work with Carlo Angiuli and Todd Wilson.