Hi,
Large Reasoning Models (LRMs) seem to be moving
from Foundations of Mathematics (FOM) to
Theoretical Computer Science (TCS). FOM typically
gives you "white science" mathematics, with sets
and infinity, and if you are lucky a little
recursion theory. Fun fact: TCS is even more "white".
An interesting paper in this regard:
Lean Meets Theoretical Computer Science:
Scalable Synthesis of Theorem Proving Challenges
in Formal-Informal Pairs
Terry Jingchen Zhang et al. - 2025
https://arxiv.org/abs/2508.15878v1
One swallow does not make a summer?
But it's probably a necessary step. The above
paper uses Busy Beaver and Integer Constraints
as examples. Which logical frameworks even
apply? Is it enough to have a "total function"
theory layer, or does TCS need more? TCS can
be heavy on all sorts of discrete and
non-discrete mathematics.
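To make the Busy Beaver example concrete, here is a minimal sketch (my own toy code, not taken from the paper) of a Turing machine simulator running the well-known 2-state, 2-symbol Busy Beaver champion, which halts after 6 steps leaving 4 ones on the tape:

```python
# Minimal Turing machine simulator; the encoding and names
# are my own, chosen only to illustrate the problem class.
def run(tm, max_steps=10_000):
    """Run a TM given as {(state, symbol): (write, move, next_state)}."""
    tape, pos, state, steps = {}, 0, 'A', 0
    while state != 'H' and steps < max_steps:
        write, move, state = tm[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += 1 if move == 'R' else -1
        steps += 1
    return steps, sum(tape.values())

# The 2-state Busy Beaver champion: halts after 6 steps with 4 ones.
bb2 = {
    ('A', 0): (1, 'R', 'B'), ('A', 1): (1, 'L', 'B'),
    ('B', 0): (1, 'L', 'A'), ('B', 1): (1, 'R', 'H'),
}
print(run(bb2))  # (6, 4)
```

The `max_steps` cutoff is exactly where the "total function" question bites: the unbounded loop is not total, which is why a proof assistant needs either fuel-style bounds or a halting proof.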
Bye
Mild Shock schrieb:
Hi,
Geoffrey E. Hinton, the Nobel Prize winner
for AI, was already beating the drums
for ReLU in 2010:
Rectified Linear Units Improve Restricted Boltzmann Machines
Geoffrey E. Hinton & Vinod Nair - 2010 https://www.cs.toronto.edu/~fritz/absps/reluICML.pdf
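The contrast between the two activations is easy to see in a few lines (a plain sketch, nothing library-specific):

```python
import math

def logistic(x):
    # Classic sigmoid activation used in early ANNs.
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # Rectified linear unit popularized by Nair & Hinton (2010).
    return max(0.0, x)

for x in (-2.0, 0.0, 3.0):
    print(x, logistic(x), relu(x))
```

The logistic squashes everything into (0, 1) and saturates at both ends, while ReLU is identity on the positive side, which keeps gradients from vanishing in deep stacks.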
Because ANNs (Artificial Neural Networks) were originally
designed with other functions, e.g. with Logistic function:
An artificial neuron is a mathematical function conceived
as a model of a biological neuron in a neural network. https://en.wikipedia.org/wiki/Artificial_neuron
If you populate additive factor graphs with log P,
you basically get multiplicative factor graphs.
So an ANN can express belief networks, right?
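The log-P trick is just the identity exp(sum of logs) = product, sketched here with a toy list of factor probabilities (the numbers are arbitrary):

```python
import math

# Three independent "factors" as probabilities.
p = [0.5, 0.25, 0.125]

# Multiplicative factor graph: product of the factors.
product = math.prod(p)

# Additive factor graph populated with log P: sum of the logs.
log_sum = sum(math.log(x) for x in p)

# exp(sum of log P) recovers the product, so an additive
# network over log-probabilities expresses the same belief.
assert abs(math.exp(log_sum) - product) < 1e-12
print(product)  # 0.015625
```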
Bye
P.S.: What is all the hype about Causal AI, and
the Ladder of Causation à la Judea Pearl?
Causal AI – the next gen AI
Prof. Sotirios A. Tsaftaris - 2025 https://www.youtube.com/watch?v=IelslFzdsYw