

One of the deepest questions in linguistics is how the meaning of a sentence is obtained from the specific meanings of its words. The meaning of a word is sometimes ambiguous, so the meaning of a sentence arises from adapting the meaning of each word to the meanings of the other words in the sentence. However, when two words are combined, new meanings may emerge: the meaning of the word combination is not deducible from the meanings of the component words. This phenomenon is called word non-compositionality.

Early approaches to sentence meaning rooted in classical logic assume that meaning is compositional, i.e. that the meaning of a sentence can be built by considering only the meanings of its component words. However, it has been shown more recently that meaning is not compositional in general. Indeed, it has been proven that no model rooted in classical logic or classical probability can describe the emergence of meaning in non-compositional word combinations.

Quantum-Like Models

It has been shown that a quantum-like model of word combinations is suitable for explaining the emergence of meaning. Words are modeled as vectors in a Hilbert space, and word combinations are modeled as elements of the tensor product of the spaces corresponding to each word. This procedure is analogous to the modeling of composite quantum systems, and meaning emergence is analogous to the appearance of non-separable (entangled) states of the composite system.
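As a toy illustration of this modeling (a sketch with invented two-dimensional word vectors, not any published model), the following shows how a tensor-product combination of two word vectors can be separable, while a superposition of such products can be non-separable, the analogue of an entangled state:

```python
import numpy as np

# Two words modeled as unit vectors in two-dimensional Hilbert spaces.
# The vectors and dimensions are invented for illustration; real models
# derive them from data.
boat = np.array([1.0, 0.0])
house = np.array([0.6, 0.8])

# A compositional (separable) combination is a tensor product of the
# two word vectors.
separable = np.kron(boat, house)

# A non-separable combination cannot be factored into any tensor
# product of single-word vectors. Here: an equal superposition of two
# product states.
non_separable = (np.kron([1.0, 0.0], [1.0, 0.0])
                 + np.kron([0.0, 1.0], [0.0, 1.0])) / np.sqrt(2)

def is_separable(state, dims=(2, 2)):
    # Reshape the combined vector into a matrix; it factors as an
    # outer product of two word vectors exactly when its rank is 1.
    return np.linalg.matrix_rank(state.reshape(dims)) == 1

print(is_separable(separable))      # True
print(is_separable(non_separable))  # False
```

The rank test mirrors the Schmidt decomposition used for quantum states: rank greater than one means no pair of single-word vectors reproduces the combination.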

P. Bruza, K. Kitto, B. Ramm, L. Sitbon, S. Blomberg, D. Song, Quantum-like non-separability of concept combinations, emergent associates and abduction. Logic Journal of the IGPL (in press), 2011.

P. Bruza, Concept combination, emergence and abduction. Proceedings 2010 International Conference on Information Retrieval and Knowledge Management, 1-5, 2010.

Compositional Vector Semantics More Generally

Recent years have seen accelerating growth in research on compositional distributional semantics. Much of this work uses vector models and operators whose uses are well established in quantum theory, though the mathematics is often so general that the "quantum" nature of these models is open to reasonable question.

Some of the earliest work in this area was in artificial intelligence, in particular the use of tensor products for composition. 
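One early tensor-product technique from this line of work is role-filler binding: a filler vector is bound to a role vector via an outer product, and a structure is the superposition of its bindings. The vectors below are invented for illustration; this is a minimal sketch, not a reconstruction of any specific system.

```python
import numpy as np

# Orthonormal role vectors (invented for illustration).
agent = np.array([1.0, 0.0])
patient = np.array([0.0, 1.0])

# Filler vectors for two words (invented for illustration).
dog = np.array([0.9, 0.1, 0.2])
cat = np.array([0.2, 0.8, 0.1])

# "dog chases cat": bind dog to the agent role and cat to the patient
# role via outer products, then superpose the bindings by addition.
structure = np.outer(dog, agent) + np.outer(cat, patient)

# With orthonormal roles, a filler is recovered by contracting the
# structure with the corresponding role vector (unbinding).
recovered = structure @ agent
print(np.allclose(recovered, dog))  # True
```

The key design point is that binding (outer product) and composition (addition) are both linear, so structures of different shapes live in one vector space and fillers remain recoverable as long as the role vectors stay orthonormal.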

More recently, computational linguists have made progress in this area, using empirical observations of natural language corpora to build models.
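A minimal sketch of this empirical approach (with a tiny invented corpus, not real data) builds a word vector from co-occurrence counts within a context window and compares words by cosine similarity:

```python
import numpy as np

# Invented toy corpus; real models use large natural language corpora.
corpus = ("the cat sat on the mat the dog sat on the rug "
          "the cat chased the dog").split()

vocab = sorted(set(corpus))
index = {w: i for i, w in enumerate(vocab)}
window = 2  # count co-occurrences within two positions on either side

counts = np.zeros((len(vocab), len(vocab)))
for i, w in enumerate(corpus):
    for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
        if j != i:
            counts[index[w], index[corpus[j]]] += 1

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# Words appearing in similar contexts ("cat"/"dog") get similar vectors.
print(cosine(counts[index["cat"]], counts[index["dog"]]))
```

Each row of `counts` is a distributional vector; compositional models then combine such vectors (by addition, pointwise product, tensor product, or learned operators) to represent phrases.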

There are by now whole projects, seminars and conferences on distributional compositional semantics. These include: