• Big Worries About 'AI' - 1863

    From c186282@21:1/5 to All on Sat Jun 7 21:42:22 2025
    XPost: talk.politics.misc, alt.politics, alt.politics.usa
    XPost: alt.fan.rush-limbaugh

    "We refer to the question: What sort of creature man’s next
    successor in the supremacy of the earth is likely to be.
    We have often heard this debated; but it appears to us that
    we are ourselves creating our own successors; we are daily
    adding to the beauty and delicacy of their physical organisation;
    we are daily giving them greater power and supplying by all
    sorts of ingenious contrivances that self-regulating,
    self-acting power which will be to them what intellect has
    been to the human race. In the course of ages we shall
    find ourselves the inferior race."

    Samuel Butler - letter to The Press, NZ, 1863

    https://en.wikipedia.org/wiki/Darwin_among_the_Machines

    Try also his later little book:
    https://en.wikipedia.org/wiki/Erewhon
    "I regret that reviewers have in some cases been inclined to
    treat the chapters on Machines as an attempt to reduce Mr
    Darwin's theory to an absurdity. Nothing could be further
    from my intention, and few things would be more distasteful
    to me than any attempt to laugh at Mr Darwin."

    . . .

    ALREADY a grasp of things to come even THAT long ago.

    Now, stories of draw-downs at Microsoft and other tech
    companies. Not the drudges, but the white-collar business
    and programming staff. The AIs, as many predicted (even
    Musk), are making human skill less and less relevant.

    FIRST stage is letting fewer humans do more work.
    SECOND stage - why do we need any expensive humans
    at ALL anymore?

    https://www.dailymail.co.uk/yourmoney/article-14785245/bloodbath-tearing-middle-class-US-economy.html

    Note there is NO PLAN from any govt I'm aware of
    as to WHAT TO DO with all the obsolete humans.

    Soylent Green on a cracker, anyone?

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)