LTN: Logic Tensor Networks
Simari, G. I.
2023-01-01
Abstract
In this chapter, we provide an overview of Logic Tensor Networks (LTNs, for short), a formalism that makes use of tensor embeddings—n-dimensional vector representations—of elements tied to a logical syntax, and which has seen traction in the NSR literature in the past few years. After briefly recalling Real Logic, the underlying language of LTNs, we discuss the representation of different kinds of knowledge in the formalism and the three main tasks that can be addressed with it (learning, reasoning, and query answering), and finally describe several use cases that have shown the usefulness of LTNs in many tasks that are central to the construction of intelligent systems.


