Discussion:
NVIDIA Jetson Orin controlled by Prolog
Mild Shock
2025-01-03 21:21:19 UTC
Permalink
Hi,

OK, this one is only 250 bucks for a TPU:

Introducing NVIDIA Jetson Orin™ Nano Super
http://youtu.be/S9L2WGf1KrM

Now I am planning to do the following:

Create a tensor flow Domain Specific Language (DSL).

With these use cases:

- Run the tensor flow DSL locally in
your Prolog system interpreted.

- Run the tensor flow DSL locally in
your Prolog system compiled.

- Run the tensor flow DSL locally on
your Tensor Processing Unit (TPU).

- Run the tensor flow DSL remotely
on a compute server.

- What else?

Maybe also support some ONNX file format?
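
As a first sketch of such a DSL (all names hypothetical, nothing
of this exists yet; assuming SWI-Prolog), the interpreted variant
could represent tensor expressions as plain Prolog terms and
dispatch on a backend argument:

% eval(+Backend, +Expr, -Tensor): interpret a tensor expression.
% The Backend argument is what would later route the same term
% to a compiled, TPU or remote backend.
eval(interp, lit(T), T).
eval(interp, add(A, B), T) :-
    eval(interp, A, TA),
    eval(interp, B, TB),
    maplist(maplist(sum2), TA, TB, T).   % elementwise add on matrices
eval(interp, matmul(A, B), T) :-
    eval(interp, A, TA),
    eval(interp, B, TB),
    transpose_(TB, TBt),
    maplist(row_times(TBt), TA, T).

sum2(X, Y, Z) :- Z is X + Y.
row_times(Cols, Row, Out) :- maplist(dotprod(Row), Cols, Out).
dotprod(Xs, Ys, S) :- foldl(fma, Xs, Ys, 0, S).
fma(X, Y, A0, A) :- A is A0 + X*Y.

% plain list transpose, to avoid pulling in library(clpfd)
transpose_([[]|_], []) :- !.
transpose_(M, [Col|Rest]) :-
    maplist(head_tail, M, Col, Tails),
    transpose_(Tails, Rest).
head_tail([H|T], H, T).

% ?- eval(interp, matmul(lit([[1,2],[3,4]]), lit([[1,0],[0,1]])), T).
% T = [[1,2],[3,4]].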

Bye
Mild Shock
2025-01-03 21:34:08 UTC
Permalink
Hi,

Maybe one can get a better grip on the intimate
relationship between logic programming and linear
algebra simply by getting hands-on?

Linear Algebraic Approaches to Logic Programming

Katsumi Inoue (National Institute of Informatics, Japan)

Abstract: Integration of symbolic reasoning and machine
learning is important for robust AI. Realization of
symbolic reasoning based on algebraic methods is promising
to bridge between symbolic reasoning and machine learning,
since algebraic data structures have been used in machine
learning. To this end, Sakama, Inoue and Sato have defined
notable relations between logic programming and linear
algebra and have proposed algorithms to compute logic
programs numerically using tensors. This work has been
extended in various ways, to compute supported and stable
models of normal logic programs, to enhance the efficiency
of computation using sparse methods, and to enable abduction
for abductive logic programming. A common principle in
this approach is to formulate logical formulas as vectors/
matrices/tensors, and linear algebraic operations are
applied on these elements for computation of logic programming.
Partial evaluation can be realized in parallel and by
self-multiplication, showing the potential for exponential
speedup. Furthermore, the idea to represent logic programs
as tensors and matrices and to transform logical reasoning
to numeric computation can be the basis of the differentiable
methods for learning logic programs.

https://www.iclp24.utdallas.edu/invited-speakers/
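
To make the core idea concrete, here is a toy reconstruction
of it (my own sketch, not the authors' code; assuming
SWI-Prolog): a definite program becomes a matrix, the T_P
operator becomes a thresholded matrix-vector product, and its
least fixpoint is the least model.

% Toy program over atoms [p,q,r]:  p :- q, r.   q :- r.   r.
% Row i carries weight 1/k for each of the k body atoms of the
% rule for atom i; facts go into a bias vector.
program_matrix([[0.0, 0.5, 0.5],    % p :- q, r.
                [0.0, 0.0, 1.0],    % q :- r.
                [0.0, 0.0, 0.0]]).  % r has no rule body
fact_vector([0.0, 0.0, 1.0]).       % r. is a fact

% One T_P step: V1 = theta(M*V0 + B), with theta(X) = 1 iff X >= 1.
tp(V0, V1) :-
    program_matrix(M), fact_vector(B),
    maplist(dot(V0), M, P),
    maplist(step, P, B, V1).

dot(V, Row, S) :- foldl(fma, Row, V, 0.0, S).
fma(X, Y, A0, A) :- A is A0 + X*Y.
step(P, B, 1.0) :- P + B >= 1.0, !.
step(_, _, 0.0).

% Iterate to the least fixpoint, i.e. the least model {p,q,r}.
lfp(V, V) :- tp(V, V), !.
lfp(V0, V) :- tp(V0, V1), lfp(V1, V).

% ?- lfp([0.0,0.0,0.0], M).   % M = [1.0,1.0,1.0]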

Bye
Mild Shock
2025-01-05 19:22:04 UTC
Permalink
John Sowa shows clear signs of coping problems. We just
have an instance of “The Emperor’s New Clothes”: some
companies have become naked with the advent of GPT.

I don’t think it is productive to postulate, as his slides do:

Linguists say that LLMs cannot be a language model.
- Tensors do not make the linguistic information explicit.
- They do not distinguish the syntax, semantics, and ontology.
- GPT cannot use the 60+ years of AI research and development.

http://youtu.be/6K6F_zsQ264

Then in the next slide he embraces tensors for
his new Prolog system nevertheless. WTF! Basically
this is a very narrow narrative, which is totally
unfounded in my opinion. Just check out these papers:

GRIN: GRadient-INformed MoE
https://arxiv.org/abs/2409.12136

A Survey on Mixture of Experts
https://arxiv.org/abs/2407.06204

This paints a totally different picture of LLMs; it seems
they are more in the tradition of CYC by Douglas Lenat.
Mild Shock
2025-01-05 19:22:35 UTC
Permalink
What’s also interesting: the recent physics
Nobel Prize recipient Geoffrey Hinton also has
a roughly 30-year-old paper about MoE:

Adaptive Mixtures of Local Experts
https://www.cs.toronto.edu/~fritz/absps/jjnh91.pdf
Mild Shock
2025-01-05 20:42:53 UTC
Permalink
Douglas Lenat died on August 31, 2023.
I don’t know whether CYC and Cycorp
will make a dent in the future. CYC
addressed the common knowledge
bottleneck, and so do LLMs. I
am using CYC mainly as a historical reference.
The “common knowledge bottleneck” in AI is
a challenge that plagued early AI systems.
This bottleneck stems from the difficulty
of encoding vast amounts of everyday,
implicit human knowledge: things we take
for granted but computers historically
struggled to understand. Currently LLMs
by design focus more on shallow knowledge,
whereas systems such as CYC might exhibit
more deep knowledge in certain domains,
making them possibly more suitable when
the stakeholders expect more reliable
analytic capabilities.

The problem is not explainability,
the problem is intelligence.
Mild Shock
2025-01-05 20:46:33 UTC
Permalink
Notice John Sowa calls the LLM the “store”
of GPT. This could be a misconception that
matches what Permion did for their cognitive
memory. But matters are a little bit more
complicated, to say the least, especially
since OpenAI insists that GPT itself is also
an LLM. What might highlight the situation is
Fig. 6 of this paper, postulating two Mixture
of Experts (MoE) placements, one on the
attention mechanism and one on the
feed-forward layer:

A Survey on Mixture of Experts
https://arxiv.org/abs/2407.06204
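
For intuition, a toy sketch of the gating principle behind such
a MoE layer (hypothetical code of mine, not from the paper;
assuming SWI-Prolog): a gate scores every expert, only the top-K
are evaluated, and their outputs are mixed by the normalized
gate weights.

% moe(+Experts, +GateScores, +K, +Input, -Output)
moe(Experts, Scores, K, X, Y) :-
    pairs_keys_values(Pairs, Scores, Experts),
    msort(Pairs, Asc), reverse(Asc, Desc),
    length(TopK, K), append(TopK, _, Desc),   % keep the K best-scored experts
    pairs_keys(TopK, Ws), sum_list(Ws, Z),
    foldl(apply_expert(X, Z), TopK, 0.0, Y).

apply_expert(X, Z, W-E, Acc0, Acc) :-
    call(E, X, YE),                  % run one expert on the input
    Acc is Acc0 + (W / Z) * YE.      % weight by normalized gate score

% Two toy "experts": scale by 2, scale by 10.
double(X, Y) :- Y is 2*X.
tenfold(X, Y) :- Y is 10*X.

% ?- moe([double, tenfold], [0.9, 0.1], 1, 3, Y).
% Y = 6.0   (the gate routes everything to `double`)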

Disclaimer: a pity Marvin Minsky didn’t describe
these things already in his Society of Mind!
It would make them easier to understand now…