John Sowa shows clear signs of coping problems. Linguists say that LLMs cannot be a language model:
- Tensors do not make the linguistic information explicit.
- They do not distinguish syntax, semantics, and ontology.
- GPT cannot use the 60+ years of AI research and development.
We just get his new Prolog system nevertheless. WTF! Basically unfounded in my opinion. Just check out these papers; they are more in the tradition of CYC by Douglas Lenat.
Hi,
Maybe one can get a better grip on this intimate
relationship simply by some hands-on work?
Linear Algebraic Approaches to Logic Programming
Katsumi Inoue (National Institute of Informatics, Japan)
Abstract: Integration of symbolic reasoning and machine
learning is important for robust AI. Realization of
symbolic reasoning based on algebraic methods is promising
to bridge between symbolic reasoning and machine learning,
since algebraic data structures have been used in machine
learning. To this end, Sakama, Inoue and Sato have defined
notable relations between logic programming and linear
algebra and have proposed algorithms to compute logic
programs numerically using tensors. This work has been
extended in various ways, to compute supported and stable
models of normal logic programs, to enhance the efficiency
of computation using sparse methods, and to enable abduction
for abductive logic programming. A common principle in
this approach is to formulate logical formulas as vectors/
matrices/tensors, and linear algebraic operations are
applied on these elements for computation of logic programming.
Partial evaluation can be realized in parallel and by
self-multiplication, showing the potential for exponential
speedup. Furthermore, the idea to represent logic programs
as tensors and matrices and to transform logical reasoning
to numeric computation can be the basis of the differentiable
methods for learning logic programs.
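To make the abstract above concrete, here is a minimal sketch (pure Python, no libraries) of the core idea: encode a definite logic program as a matrix Q and compute the immediate consequence operator T_P as a thresholded matrix-vector product, iterating to the least model. The example program, the helper names, and the restriction to one rule per head atom are my own simplifications of the Sakama-Inoue-Sato construction, not the authors' actual code.

```python
# Atoms are indexed; an interpretation is a 0/1 vector over the atoms.
# A rule h :- b1,...,bk contributes Q[h][bi] = 1/k, so (Q v)[h] >= 1
# exactly when every body atom of the rule is true in v.
ATOMS = ["true", "p", "q", "r"]
IDX = {a: i for i, a in enumerate(ATOMS)}

def program_matrix(rules, n):
    """Build Q for a definite program with at most one rule per head."""
    Q = [[0.0] * n for _ in range(n)]
    for head, body in rules:
        for b in body:
            Q[IDX[head]][IDX[b]] = 1.0 / len(body)
    return Q

def tp(Q, v):
    """One T_P step: threshold(Q v) -- head fires iff its whole body holds."""
    return [1.0 if sum(Q[i][j] * v[j] for j in range(len(v))) >= 1.0 else 0.0
            for i in range(len(Q))]

def least_model(Q, n):
    v = [0.0] * n
    v[IDX["true"]] = 1.0          # facts are rules with body "true"
    while True:
        w = tp(Q, v)
        if w == v:                # fixpoint = least model
            return {ATOMS[i] for i in range(n) if v[i] == 1.0} - {"true"}
        v = w

# Program:  q.  r.  p :- q, r.
rules = [("true", ["true"]), ("q", ["true"]),
         ("r", ["true"]), ("p", ["q", "r"])]
Q = program_matrix(rules, len(ATOMS))
print(sorted(least_model(Q, len(ATOMS))))   # ['p', 'q', 'r']
```

The "self-multiplication" mentioned in the abstract corresponds, in the restricted linear case, to squaring Q so that one multiplication composes two derivation steps; repeated squaring is where the potential exponential speedup of partial evaluation comes from.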
https://www.iclp24.utdallas.edu/invited-speakers/
Bye
Post by Mild Shock
Hi,
Introducing NVIDIA Jetson Orin™ Nano Super
http://youtu.be/S9L2WGf1KrM
Create a tensor flow Domain Specific Language (DSL).
- Run the tensor flow DSL locally in your Prolog system, interpreted.
- Run the tensor flow DSL locally in your Prolog system, compiled.
- Run the tensor flow DSL locally on your Tensor Processing Unit (TPU).
- Run the tensor flow DSL remotely on a compute server.
- What else?
Maybe also support some ONNX file format?
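As a hypothetical sketch of what such a tensor DSL could look like under the "interpreted" option above: tensor expressions as ground terms plus a small evaluator. Every name here is made up for illustration (the post does not specify a design); tensors are plain nested lists and only a few rank-1/rank-2 operations are shown.

```python
def eval_expr(expr):
    """Evaluate a tensor expression given as nested tuples."""
    op = expr[0]
    if op == "lit":                       # ("lit", value) -- a literal tensor
        return expr[1]
    if op == "add":                       # elementwise vector addition
        a, b = eval_expr(expr[1]), eval_expr(expr[2])
        return [x + y for x, y in zip(a, b)]
    if op == "dot":                       # vector dot product
        a, b = eval_expr(expr[1]), eval_expr(expr[2])
        return sum(x * y for x, y in zip(a, b))
    if op == "matvec":                    # matrix-vector product
        m, v = eval_expr(expr[1]), eval_expr(expr[2])
        return [sum(r * x for r, x in zip(row, v)) for row in m]
    raise ValueError("unknown op: %r" % (op,))

# ("matvec", M, V) plays the role of a Prolog goal like matvec(M, V, R).
e = ("matvec",
     ("lit", [[1, 0], [0, 2]]),
     ("add", ("lit", [1, 1]), ("lit", [2, 3])))
print(eval_expr(e))   # [[1,0],[0,2]] * ([1,1]+[2,3]) = [3, 8]
```

The other list items would reuse the same term representation while swapping the evaluator: a compiler emits native code from the terms, a TPU or remote backend replaces eval_expr with a device call or RPC stub, and ONNX support would be a translation from ONNX graph nodes into these terms.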
Bye