• 2026: The Year Prolog **plonked** Itself [Strawberry Prolog is back]

    From Mild Shock@janburse@fastmail.fm to sci.logic on Wed Feb 4 02:21:40 2026
    From Newsgroup: sci.logic

    Hi,

    Just a quick take on the Prolog pulse:

    - SWI-Prolog: Dead Horse, LISP / MeTTa is it.

    - Scryer: Dead Horse, no core commits

    - Ciao Prolog: Dead Horse, no core commits

    - Who else?

    So I guess it's back to Strawberry Prolog for teaching!

    They are also building AGI:

    https://dobrev.com/AI/Think_in_mind.pdf

    Bye
    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to sci.logic on Wed Feb 4 11:19:44 2026
    From Newsgroup: sci.logic

    Hi,

    Spies and Playboys: Don't use insecure E-mail.
    It's too much of a laugh for everybody:

    A woman's history of vaginal orgasm is discernible from her walk. https://www.jmail.world/thread/EFTA02421406?view=inbox

    Kudos to whoever did that(*), it even has PDF
    backlinks, like this one:

    A woman's history of vaginal orgasm is discernible from her walk https://assets.getkino.com/documents/EFTA02421406.pdf

    P.S.: Who did it:

    The Jmail Suite
    https://www.jmail.world/about#built-by

    P.P.S.: Europeans, be careful using USA E-mail.

    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Tristan Wibberley@tristan.wibberley+netnews2@alumni.manchester.ac.uk to sci.logic on Wed Feb 4 23:03:11 2026
    From Newsgroup: sci.logic

    On 04/02/2026 01:21, Mild Shock wrote:
    Hi,

    Just a quick take on the Prolog pulse:

    - SWI-Prolog: Dead Horse, LISP / MeTTa is it.

    - Scryer: Dead Horse, no core commits

    - Ciao Prolog: Dead Horse, no core commits

    - Who else?

    Why are commits relevant? That just tracks the rate of /change/.
    --
    Tristan Wibberley

    The message body is Copyright (C) 2026 Tristan Wibberley except
    citations and quotations noted. All Rights Reserved except that you may,
    of course, cite it academically giving credit to me, distribute it
    verbatim as part of a usenet system or its archives, and use it to
    promote my greatness and general superiority without misrepresentation
    of my opinions other than my opinion of my greatness and general
    superiority which you _may_ misrepresent. You definitely MAY NOT train
    any production AI system with it but you may train experimental AI that
    will only be used for evaluation of the AI methods it implements.

    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From olcott@polcott333@gmail.com to sci.logic on Wed Feb 4 17:25:40 2026
    From Newsgroup: sci.logic

    On 2/4/2026 5:03 PM, Tristan Wibberley wrote:
    On 04/02/2026 01:21, Mild Shock wrote:
    Hi,

    Just a quick take on the Prolog pulse:

    - SWI-Prolog: Dead Horse, LISP / MeTTa is it.

    - Scryer: Dead Horse, no core commits

    - Ciao Prolog: Dead Horse, no core commits

    - Who else?

    Why are commits relevant? That just tracks the rate of /change/.


    Exactly. Not having been changed recently could
    just mean it has reached a sufficient degree of perfection.
    --
    Copyright 2026 Olcott

    My 28 year goal has been to make
    "true on the basis of meaning expressed in language"
    reliably computable for the entire body of knowledge.

    This required establishing a new foundation
    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to sci.logic on Thu Feb 5 09:08:00 2026
    From Newsgroup: sci.logic

    Well if you say so, we can correct it:

    - XXX Prolog: Perfectly Dead Horse

    LoL

    There are different options now:

    1. Buying a stronger whip.
    2. Changing riders.
    3. Threatening the horse with termination.
    4. Appointing a committee to study the horse.
    5. Arranging to visit other countries to see how others ride dead horses.
    6. Lowering the standards so that dead horses can be included.
    7. Re-classifying the dead horse as "living impaired".
    8. Hiring outside contractors to ride the dead horse.
    9. Harnessing several dead horses together to increase the speed.
    10. Providing additional funding and/or training to increase the dead horse's performance.
    11. Doing a productivity study to see if lighter riders would improve the dead horse's performance.
    12. Declaring that as the dead horse does not have to be fed, it is less costly, carries lower overhead, and therefore contributes substantially more to the bottom line of the economy than do some other horses.
    13. Re-writing the expected performance requirements for all horses.
    14. Promoting the dead horse to a supervisory position of hiring another horse.

    https://www.chrisdunnconsulting.co.uk/flogging-a-dead-horse/

    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to sci.logic on Thu Feb 5 09:54:52 2026
    From Newsgroup: sci.logic


    Hi,

    How it started:

    TPTP World Anthem [The TPTP World Needs Money] https://drive.google.com/file/d/1otJ8FxCiwVTIMoEDMW98puuTKIUH7BYF/view?pli=1

    How it's going:

    ANTHEM 2.0: Automated Reasoning for Answer Set Programming https://www.cambridge.org/core/journals/theory-and-practice-of-logic-programming/article/anthem-20-automated-reasoning-for-answer-set-programming/69D4B6430617A334906A99F787FA2727

    Bye

    P.S.: It would be fine if the TPTP sublanguages were
    lean, just some operator definitions, and not some
    bloated nonsense. Going such a Lean path would

    allow staying in the ISO core standard, instead of
    inventing one new language after the other, Gringo,
    who knows what, with horrible eval semantics of

    (=)/2 and silly (=>)/2 rules for example in Picat.
    Also, translating ASP to FOL, when there are no
    cardinality heads, is rather trivial.
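
That last point can be illustrated with a small sketch (my own illustrative helper, not part of any of the tools mentioned): for normal rules without cardinality heads, the Clark completion turns a rule set like `a :- b, not c.` and `a :- d.` into a single FOL equivalence per head.

```python
# Minimal sketch of Clark completion for propositional normal rules
# (illustrative only; assumes no cardinality heads and a ground program).

def completion(rules):
    """rules: dict mapping head -> list of bodies, where a body is a
    list of literals like 'b' or 'not c'. Returns FOL-style strings."""
    out = []
    for head, bodies in rules.items():
        def fmt(body):
            # 'not x' becomes classical negation '~x' in the completion
            lits = ["~" + l[4:] if l.startswith("not ") else l for l in body]
            return " & ".join(lits) if lits else "true"
        rhs = " | ".join("(" + fmt(b) + ")" for b in bodies)
        out.append(f"{head} <-> {rhs}")
    return out

# Example: a :- b, not c.   a :- d.
print(completion({"a": [["b", "not c"], ["d"]]}))
# -> ['a <-> (b & ~c) | (d)']
```

This is only the trivial fragment the post alludes to; loops and cardinality heads need loop formulas or stronger translations.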

    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to sci.logic on Sat Feb 14 09:55:34 2026
    From Newsgroup: sci.logic

    Hi,

    Just a quick take on the golang pulse:

    How it started:

    A Comparative Study of Language Implementations https://www.arxiv.org/abs/2502.01651

    How it's going:

    Co-Creator of Go Language is Rightly Furious https://itsfoss.com/news/rob-pike-furious/

    Rob Pike going bonkers over AI.

    Bye

    See especially, Go is ca. a factor of 4x-5x behind:

    Go < Julia < Rust < C < Zig < Mojo < C++
    (a): Average tokens per second for stories15M.bin model

    Go < Julia < Rust < Zig < C < C++ < Mojo
    (c): Average tokens per second for stories42M.bin model

    So I guess it's back to Fortran.

    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to sci.logic on Sat Feb 14 12:52:53 2026
    From Newsgroup: sci.logic

    Hi,

    So the Rob Pike account shows the full
    extent of how people become schizophrenic
    or get amnesia in the age of the AI Boom.

    How it started:

    "Kernighan and Pike listed Mark V. Shaney
    in the acknowledgements in The Practice
    of Programming,[12] noting its roots in
    Mitchell's markov, which, adapted as shaney,
    [13] was used for "humorous deconstructionist
    activities" in the 1980s.
    https://en.wikipedia.org/wiki/Mark_V._Shaney

    How it's going:

    "Fuck you people. Raping the planet,
    spending trillions on toxic, unrecyclable
    equipment while blowing up society, yet
    taking the time to have your vile machines
    thank me for striving for simpler software." https://bsky.app/profile/robpike.io/post/3matwg6w3ic2s

    Or was it one of the many academic frauds,
    where Rob Pike even appeared talking about
    the early AI bot, to get false credit?

    Bye

    P.S.: Has anybody seen Julio Di Egidio recently?
    He posted the same vile-machines stance on
    SWI-Prolog shortly between 2025 and 2026,

    probably got banned and disappeared [forever?].
    Well, good riddance that he *plonked* himself,
    a fucking moron who didn't understand

    rational trees, yet harassed me.

    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to sci.logic on Tue Feb 17 11:38:07 2026
    From Newsgroup: sci.logic


    Hi,

    Geoffrey E. Hinton, the Nobel Prize winner
    for AI, was already beating the drum
    for ReLU in 2010:

    Rectified Linear Units Improve Restricted Boltzmann Machines
    Geoffrey E. Hinton & Vinod Nair - 2010 https://www.cs.toronto.edu/~fritz/absps/reluICML.pdf

    Because ANNs (Artificial Neural Networks) were originally
    designed with other functions, e.g. with the logistic function:

    An artificial neuron is a mathematical function conceived
    as a model of a biological neuron in a neural network. https://en.wikipedia.org/wiki/Artificial_neuron

    If you populate additive factor graphs with log P,
    you basically get multiplicative factor graphs.
    So an ANN can express belief networks, right?
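
A minimal numerical sketch of that remark (plain NumPy, my own illustration, not from the cited papers): summing the factors of an additive graph over log P is the same as multiplying the original probability factors.

```python
import numpy as np

# Factors of a toy multiplicative factor graph (e.g. a belief network).
p = np.array([0.9, 0.5, 0.25])

# Additive factor graph over log P: the factors just add up ...
additive = np.sum(np.log(p))

# ... and exponentiating recovers the product of the original factors.
multiplicative = np.prod(p)

assert np.isclose(np.exp(additive), multiplicative)
print(round(float(multiplicative), 4))  # 0.1125
```

Whether an ANN with its nonlinearities can represent the message passing of a belief network is a further question this identity alone does not settle.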

    Bye

    P.S.: What is all the hype about Causal AI, and
    the Ladder of Causation à la Judea Pearl?

    Causal AI – the next gen AI
    Prof. Sotirios A. Tsaftaris - 2025
    https://www.youtube.com/watch?v=IelslFzdsYw
    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to sci.logic on Thu Feb 19 20:01:46 2026
    From Newsgroup: sci.logic

    Hi,

    Looks like the Causal AI viewpoint on neural
    networks makes people comfortable. Some
    work from Pittsburgh:

    Causal Representation Learning and Generative AI
    by Dr Kun Zhang #CausalNeSyAI
    https://www.youtube.com/watch?v=UwcitNXphog

    Kathleen M. Carley: Understanding Influence
    A Network Science + AI Approach – IC2S2 2025 Keynote https://www.youtube.com/watch?v=iHjFXtIeXnw

    So Gen Alpha will grow up with Causal AI,
    while Gen Z just botched Scryer Prolog; it's
    definitely a dead horse now?

    Bye

    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to sci.logic on Fri Feb 27 10:23:20 2026
    From Newsgroup: sci.logic

    Hi,

    Large Reasoning Models (LRM) seem to move from
    Foundation of Mathematics (FOM) to Theoretical
    Computer Science (TCS). FOM typically gives

    you "white science" mathematics, with sets and
    infinity, if you are lucky a little recursion
    theory. Fun fact: TCS is even more "white".

    An interesting paper in this regard:

    Lean Meets Theoretical Computer Science:
    Scalable Synthesis of Theorem Proving Challenges
    in Formal-Informal Pairs
    Terry Jingchen Zhang et. al. - 2025
    https://arxiv.org/abs/2508.15878v1

    One swallow does not make a summer?

    But it's probably a necessary step. The above
    paper uses Busy Beaver and Integer Constraints
    as examples. Which logical frameworks even

    apply: is it enough to have a "total function"
    theory layer, or does TCS need more? TCS can
    be heavy on all sorts of discrete and

    non-discrete mathematics.

    Bye

    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Mild Shock@janburse@fastmail.fm to sci.logic on Sat Feb 28 16:38:18 2026
    From Newsgroup: sci.logic

    Hi,

    Now "white science" might lean more
    towards Causal AI. But sometimes I have the
    feeling Generative AI is the "yellow science"

    now. DeepSeek left-handedly inventing the
    notation M(a,b) for the matrix representation of
    a dual number a + b e:

    M(a,b) = a I + b E

    And then discussing variants like:

        E = [[ 0  1 ]
             [ 0  0 ]]

    Or this variant:

        E = [[ 1 -1 ]
             [ 1 -1 ]]
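
A quick sanity check of those representations (my own NumPy sketch, not from the cited PDF): both candidate E matrices square to zero, so M(a,b) = a I + b E multiplies exactly like a dual number, (a + b e)(c + d e) = ac + (ad + bc) e.

```python
import numpy as np

I = np.eye(2)
variants = [
    np.array([[0, 1], [0, 0]]),    # the standard nilpotent E
    np.array([[1, -1], [1, -1]]),  # the second variant from the text
]

def M(a, b, E):
    """Matrix representation of the dual number a + b*e."""
    return a * I + b * E

for E in variants:
    # e^2 = 0: both variants are nilpotent
    assert np.allclose(E @ E, 0)
    # (a + b e)(c + d e) = ac + (ad + bc) e, since the E^2 term vanishes
    a, b, c, d = 2.0, 3.0, 5.0, 7.0
    assert np.allclose(M(a, b, E) @ M(c, d, E), M(a * c, a * d + b * c, E))
```

So any nilpotent 2x2 matrix E works; the two variants differ only in the change of basis.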

    I am in mild shock! Then I read the small print:

    Prof. Cai Wen, Institute of Extenics and Innovative Methods, Guangzhou, https://fs.unm.edu/DualNumbers.pdf

    What is Extenics? Any ideas?

    Have Fun!

    Bye

    --- Synchronet 3.21d-Linux NewsLink 1.2