This workshop will take place on Friday, 8 January 2021, from 08:00 to 12:10 (UTC). It will be an online event. The proceedings are currently in preparation.
08:00 - 08:10 | Opening and Welcome | |
08:10 - 09:00 | Invited Talk by Michael Spranger (more details below) | Logic Tensor Network - A next generation framework for Neural-Symbolic Computing |
09:00 - 09:30 | Christian Chiarcos, Thierry Declerck and Maxim Ionov | Embeddings for the Lexicon: Modelling and Representation |
09:30 - 10:00 | Coffee break | |
10:00 - 10:20 | Anna Breit | Introduction to the Target Sense Verification for Word in Context (WiC-TSV) Challenge |
10:20 - 10:50 | Jose Moreno, Elvys Linhares Pontes and Gaël Dias | CTLR@WiC-TSV: Target Sense Verification using Marked Inputs and Pre-trained Models |
10:50 - 11:10 | Pierre-Yves Vandenbussche, Tony Scerri and Ron Daniel Jr. | Word Sense Disambiguation with Transformer Models |
11:10 - 11:40 | Alireza Mohammadshahi and James Henderson | Graph-to-Graph Transformer for Transition-based Dependency Parsing (Findings of EMNLP paper) |
11:40 - 12:10 | Jose Moreno, Antoine Doucet and Brigitte Grau | Relation Classification via Relation Validation |
Invited talk: "Logic Tensor Network - A next generation framework for Neural-Symbolic Computing"
Abstract. Artificial Intelligence has long been characterized by competing approaches to intelligence. Some researchers have focussed on symbolic reasoning, while others have had important successes using learning from data. While state-of-the-art learning from data typically uses sub-symbolic distributed representations, reasoning is normally useful at a higher level of abstraction, with a first-order logic language for knowledge representation. However, this dichotomy may actually be detrimental to progress in the field. Consequently, there has been a growing interest in neural-symbolic integration. In the talk I will present Logic Tensor Networks (LTN), a recently revised neural-symbolic formalism that supports learning and reasoning through the introduction of a many-valued, end-to-end differentiable first-order logic, called Real Logic. The talk will introduce LTN using examples that combine learning and reasoning in areas as diverse as: data clustering, multi-label classification, relational learning, logical reasoning, query answering, semi-supervised learning, regression and learning of embeddings.
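To make the idea of a many-valued, differentiable first-order logic concrete, the sketch below grounds formulas in truth values in [0, 1] using standard fuzzy-logic operators (product t-norm, probabilistic sum, Reichenbach implication) and a sigmoid-based predicate. This is an illustration of the general technique only, not the actual LTN library API; the operator and aggregator choices, the `Smokes`/`Cough` predicates, and their weights are assumptions of this example.

```python
import numpy as np

def neg(a):
    # Fuzzy negation: NOT a = 1 - a
    return 1.0 - a

def conj(a, b):
    # Product t-norm: a AND b = a * b (differentiable everywhere)
    return a * b

def disj(a, b):
    # Probabilistic sum: a OR b = a + b - a*b
    return a + b - a * b

def implies(a, b):
    # Reichenbach implication: a -> b = 1 - a + a*b
    return 1.0 - a + a * b

def forall(truths, p=2):
    # Universal quantifier as a smooth p-mean-error aggregator over the
    # per-instance truth values (one of several possible aggregators).
    truths = np.asarray(truths, dtype=float)
    return 1.0 - np.mean((1.0 - truths) ** p) ** (1.0 / p)

def predicate(x, w, b):
    # A trainable predicate grounded as a sigmoid of a linear function,
    # so its truth value is differentiable in the parameters w and b.
    return 1.0 / (1.0 + np.exp(-(np.dot(x, w) + b)))

# Example: evaluate "forall x: Smokes(x) -> Cough(x)" on toy 1-D data,
# with hypothetical weights for the two predicates.
xs = np.array([[0.2], [1.5], [3.0]])
w_s, b_s = np.array([2.0]), -1.0   # hypothetical Smokes parameters
w_c, b_c = np.array([1.0]), 0.0    # hypothetical Cough parameters
truth = forall([implies(predicate(x, w_s, b_s),
                        predicate(x, w_c, b_c)) for x in xs])
print(round(float(truth), 3))
```

Because every operator is differentiable, the truth value of such a formula can serve directly as a training objective: gradient descent on `1 - truth` would adjust the predicate parameters so the logical constraint is better satisfied by the data.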