Hello! I am Assistant Professor of Neurosymbolic AI at the Institute for Logic, Language and Computation (ILLC), University of Amsterdam.
Quick links: Google Scholar; DBLP
How are humans able to combine concepts to form new ones? Why do unexpected or ‘emergent’ attributes arise in these combinations? I use computational linguistics, conceptual spaces theory, quantum theory, and category theory to try to answer questions like these.
I was most recently a Lecturer in the School of Engineering Mathematics and Technology at the University of Bristol. Before Bristol, I held a Veni fellowship at the ILLC, University of Amsterdam. Before that, I was a postdoc in the Quantum Group in the Department of Computer Science, University of Oxford, working on the project ‘Algorithmic and Logical Aspects of Meaning’, funded by AFOSR. I did my PhD at the University of Bristol, in the Bristol Centre for Complexity Sciences, and before that completed the Evolutionary and Adaptive Systems (EASy) MSc at the University of Sussex.
Research Interests
I take interdisciplinary approaches to modelling language and concepts. My research falls into three broad areas: compositional approaches to language and meaning, applications of quantum theory to modelling language, and evolutionary approaches to language.
Compositional approaches
Humans are able to generate and to understand completely novel combinations of concepts. How can we do this? I look at ways to integrate symbolic composition with statistical or fuzzy representations of concepts. Examples include integrating grammar with conceptual spaces, a hierarchical approach to concept composition, and logical structure in vector spaces.
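To give a flavour of the compositional vector-based setting, here is a minimal sketch, my own illustration rather than code from the papers above, in which an adjective acts as a linear map on a noun space. The words, vectors, and matrix are invented for the example; in practice such representations are learned from corpus data.

```python
# A toy tensor-based composition: nouns live in a vector space,
# and an adjective is a linear map (matrix) on that space.
import numpy as np

# Hypothetical 3-dimensional noun space; the values are illustrative.
dog = np.array([0.9, 0.1, 0.3])
fish = np.array([0.1, 0.8, 0.5])

# An adjective modelled as a matrix (hand-picked here, learned in practice).
large = np.array([
    [1.2, 0.0, 0.0],
    [0.0, 1.1, 0.0],
    [0.0, 0.0, 0.4],
])

# Composition is function application: 'large dog' = large @ dog.
large_dog = large @ dog

def cosine(u, v):
    """Cosine similarity, the usual measure of semantic relatedness."""
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

print(cosine(large_dog, dog))   # the composite stays close to 'dog'
print(cosine(large_dog, fish))  # and further from an unrelated noun
```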
I have also recently started working on metaphor, and my student Xiaoyu Tong has developed a comprehensive metaphor paraphrase dataset, which we are currently extending.
Applications of quantum theory
The compositional framework I work in has its roots in quantum theory, and a wide range of work examines quantum approaches in cognitive science; I provide a thorough review of this research. Density matrices, a notion taken from quantum theory, are what I use to model hyponymy and entailment within a compositional vector-based model of meaning. The same framework is used to model lexical ambiguity: we show how to learn density matrix representations of words from a corpus, and how these representations disambiguate in the process of composition.
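As a rough illustration of the idea, the sketch below (my own toy example, not the implementation from the papers) represents a word as a positive semidefinite, trace-one matrix, builds ‘pet’ as a mixture of its hyponyms, and tests entailment with a simple thresholded version of the Loewner order; the graded measures used in the actual work are more refined.

```python
# Toy density-matrix model of hyponymy: a word is a positive
# semidefinite matrix with trace 1; a hypernym is a mixture of
# the pure states of its hyponyms.
import numpy as np

def pure(v):
    """Rank-1 density matrix |v><v| for a (normalised) vector v."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v)

# Illustrative vectors for two kinds of pet.
dog = pure(np.array([1.0, 0.2, 0.0]))
cat = pure(np.array([0.1, 1.0, 0.3]))

# 'pet' as an even mixture of its hyponyms.
pet = 0.5 * dog + 0.5 * cat

def entails(a, b, tol=1e-9):
    """Crude entailment test: does b dominate a in the Loewner order
    up to a factor of 2, i.e. is 2*b - a positive semidefinite?
    (Equivalent to graded hyponymy with threshold k = 0.5.)"""
    return bool(np.all(np.linalg.eigvalsh(2 * b - a) >= -tol))

print(entails(dog, pet))  # True: 'dog' entails 'pet' in this toy model
print(entails(pet, dog))  # False: the reverse direction fails
```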
Evolutionary approaches
A key aspect of human concept use is that concepts and words evolve over time. I have examined how shared concepts can emerge in a community of artificial agents, and the effect of linguistic hedges and concept conjunction on the resulting concepts.
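The flavour of such models is captured by the classic naming game, sketched below. This is a standard model in the field rather than the specific one used in my papers: agents repeatedly pair up, and a shared word for an object emerges purely from local interactions.

```python
# Minimal naming game: a population converges on one word for one object.
import random

N_AGENTS = 20
agents = [set() for _ in range(N_AGENTS)]  # each agent's word inventory

def new_word():
    return "w%d" % random.randrange(10**6)

for step in range(20000):
    speaker, hearer = random.sample(range(N_AGENTS), 2)
    if not agents[speaker]:
        agents[speaker].add(new_word())  # invent a word if none known
    word = random.choice(sorted(agents[speaker]))
    if word in agents[hearer]:
        # Success: both agents collapse their inventories onto this word.
        agents[speaker] = {word}
        agents[hearer] = {word}
    else:
        # Failure: the hearer adds the word to its inventory.
        agents[hearer].add(word)

# After enough interactions the population typically shares a single word.
print(set().union(*agents))
```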
Work in progress: integrating a compositional vector-based model of meaning with an evolutionary model… watch this space!