• Re: Rewriting SSA. Is This A Chance For GNU/Linux?

    From Richard Kettlewell@21:1/5 to sc@fiat-linux.fr on Sun Apr 6 10:14:43 2025
    Stéphane CARPENTIER <sc@fiat-linux.fr> writes:
    On 03-04-2025, c186282 <c186282@nnada.net> wrote:

    It's even worse now, seriously worse. It means nobody
    becomes an "expert" in the usual sense of the word.

    I don't know why it's like that in the entire world, but in France, the
    reason is obvious. Companies refuse to take technical skills into
    account. If you want to increase your salary, you have to switch to
    management. So, as nobody wants to become the most important guy in the
    company with the lowest salary, there are no more experts.

    I think the issue is general, and so are the exceptions to it. On the
    one hand I’ve heard similar complaints about UK and US employers. On the other hand when we were owned by a French company they were very clear
    about their grade structure branching into management and technical
    tracks past a certain point, and they did actually implement this; at no
    point did they make the mistake of asking me to do any line management.

    And most importantly, things evolve very fast, so if you become an
    expert in something which disappears, you switch very fast from
    much-needed guy to useless guy. So, before becoming an expert, you
    need to be sure your skills will stay useful until you retire. Which
    is difficult if you are young.

    Specializing in the wrong thing is certainly a risk. For any given
    technology it’s useful to be able to look past what its boosters (and detractors) say about it to whether it does anything useful in reality.

    --
    https://www.greenend.org.uk/rjk/

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From Stéphane CARPENTIER@21:1/5 to All on Sun Apr 6 08:52:37 2025
    On 03-04-2025, c186282 <c186282@nnada.net> wrote:

    It's even worse now, seriously worse. It means nobody
    becomes an "expert" in the usual sense of the word.

    I don't know why it's like that in the entire world, but in France, the
    reason is obvious. Companies refuse to take technical skills into
    account. If you want to increase your salary, you have to switch to
    management. So, as nobody wants to become the most important guy in the
    company with the lowest salary, there are no more experts.

    And most importantly, things evolve very fast, so if you become an
    expert in something which disappears, you switch very fast from
    much-needed guy to useless guy. So, before becoming an expert, you
    need to be sure your skills will stay useful until you retire. Which
    is difficult if you are young.

    --
    If you have some time to waste:
    https://scarpet42.gitlab.io

  • From Stéphane CARPENTIER@21:1/5 to All on Sun Apr 6 11:13:49 2025
    On 06-04-2025, Richard Kettlewell <invalid@invalid.invalid> wrote:
    Stéphane CARPENTIER <sc@fiat-linux.fr> writes:
    On 03-04-2025, c186282 <c186282@nnada.net> wrote:

    It's even worse now, seriously worse. It means nobody
    becomes an "expert" in the usual sense of the word.

    I don't know why it's like that in the entire world, but in France, the
    reason is obvious. Companies refuse to take technical skills into
    account. If you want to increase your salary, you have to switch to
    management. So, as nobody wants to become the most important guy in the
    company with the lowest salary, there are no more experts.

    I think the issue is general, and so are the exceptions to it. On the
    one hand I’ve heard similar complaints about UK and US employers.

    I'm not well informed about US employers. I've heard one can be a
    programmer in the US with a very good salary, which is impossible in
    France. I mean compared with a manager, not compared with a low-paying
    job. Now, I know that in the US salaries are higher than in France,
    which is very good for a young guy without kids. For someone with
    kids, the difference in salary isn't as good. And for an older guy
    with health issues, the difference in salary isn't as good either.

    So, I don't really know, and it's very difficult to compare salaries
    between France and the US because the taxes are very different.

    And most importantly, things evolve very fast, so if you become an
    expert in something which disappears, you switch very fast from
    much-needed guy to useless guy. So, before becoming an expert, you
    need to be sure your skills will stay useful until you retire. Which
    is difficult if you are young.

    Specializing in the wrong thing is certainly a risk. For any given
    technology it’s useful to be able to look past what its boosters (and detractors) say about it to whether it does anything useful in reality.

    It's very difficult to know how to specialize, and today more than
    ever because of AI. One can know what AI can do today and what it
    can't do. But one can't know what it will be able to do tomorrow, or
    how it will be used tomorrow. One can have guesses, but one can't
    know. One thing is sure: some jobs will be impacted. For some jobs the
    impact may be easy to guess, but it remains a guess, which can be
    proven wrong as AI evolves.

    I'm not saying AI is good or bad. I'm just saying AI will impact jobs.
    And I'm saying the impact is impossible to know for sure. Some may
    have strong opinions about it; it still remains a guess impossible to
    prove.

    --
    If you have some time to waste:
    https://scarpet42.gitlab.io

  • From Stéphane CARPENTIER@21:1/5 to All on Sun Apr 6 12:17:46 2025
    XPost: comp.os.linux.advocacy

    [The "Followup-To:" header is set to comp.os.linux.misc.]
    On 06-04-2025, Farley Flud <ff@linux.rocks> wrote:
    On 06 Apr 2025 08:59:33 GMT, Stéphane CARPENTIER wrote:

    On 05-04-2025, Farley Flud <ff@linux.rocks> wrote:

    No, but we can move to quantum computing, which may become
    a reality before too long.

    I heard about that before I was born.

    In the US, the NIST is already researching algorithms for "post-quantum cryptography:"

    https://csrc.nist.gov/projects/post-quantum-cryptography

    Yes, the algorithms are farther away from the computers. Doesn't that
    ring a bell?

    Quantum computing is definitely going to happen.

    Yes, I know. Soon. Very soon. It's almost there. I heard that before I was born.

    --
    If you have some time to waste:
    https://scarpet42.gitlab.io

  • From rbowman@21:1/5 to All on Sun Apr 6 16:58:22 2025
    On 06 Apr 2025 08:52:37 GMT, Stéphane CARPENTIER wrote:

    And most importantly, things evolve very fast, so if you become an
    expert in something which disappears, you switch very fast from
    much-needed guy to useless guy. So, before becoming an expert, you
    need to be sure your skills will stay useful until you retire. Which
    is difficult if you are young.

    You need to become an expert at keeping up.

  • From Richard Kettlewell@21:1/5 to sc@fiat-linux.fr on Sun Apr 6 18:38:19 2025
    Stéphane CARPENTIER <sc@fiat-linux.fr> writes:
    On 06-04-2025, Farley Flud <ff@linux.rocks> wrote:
    On 06 Apr 2025 08:59:33 GMT, Stéphane CARPENTIER wrote:
    On 05-04-2025, Farley Flud <ff@linux.rocks> wrote:

    No, but we can move to quantum computing, which may become
    a reality before too long.

    I heard about that before I was born.

    In the US, the NIST is already researching algorithms for "post-quantum
    cryptography:"

    https://csrc.nist.gov/projects/post-quantum-cryptography

    Yes, the algorithms are farther away from the computers. Doesn't that
    ring a bell?

    Not quite sure what the argument is here, but “already researching” is severely behind the times. Multiple PQC algorithms are well past the
    research stage, with finalized standards published in August and a
    couple more on the way. Adaptation of higher-level standards (APIs, PKI,
    etc) and adoption leading to deployment is well underway.

    Quantum computing is definitely going to happen.

    Yes, I know. Soon. Very soon. It's almost there. I heard that before I
    was born.

    I’m not sure anyone thinks quantum computing is “almost there” in the sense of e.g. a quantum computer big enough to break RSA existing this
    year. However the risk is real enough to be worth actively mitigating,
    both because we may not know when the first one is deployed and due to
    the “harvest now, decrypt later” strategy.

    --
    https://www.greenend.org.uk/rjk/

  • From Stéphane CARPENTIER@21:1/5 to All on Sun Apr 6 20:23:42 2025
    On 06-04-2025, Richard Kettlewell <invalid@invalid.invalid> wrote:
    Stéphane CARPENTIER <sc@fiat-linux.fr> writes:
    On 06-04-2025, Farley Flud <ff@linux.rocks> wrote:
    On 06 Apr 2025 08:59:33 GMT, Stéphane CARPENTIER wrote:
    On 05-04-2025, Farley Flud <ff@linux.rocks> wrote:

    No, but we can move to quantum computing, which may become
    a reality before too long.

    I heard about that before I was born.

    In the US, the NIST is already researching algorithms for "post-quantum
    cryptography:"

    https://csrc.nist.gov/projects/post-quantum-cryptography

    Yes, the algorithms are farther away from the computers. Doesn't that
    ring a bell?

    Not quite sure what the argument is here,

    I mean the algorithms are very well advanced. They just need a computer
    to switch from ready to usable.

    but “already researching” is
    severely behind the times. Multiple PQC algorithms are well past the
    research stage, with finalized standards published in August and a
    couple more on the way. Adaptation of higher-level standards (APIs, PKI,
    etc) and adoption leading to deployment is well underway.

    Yep. The algorithms. For the big computers, it's another story.

    Quantum computing is definitely going to happen.

    Yes, I know. Soon. Very soon. It's almost there. I heard that before I
    was born.

    I’m not sure anyone thinks quantum computing is “almost there” in the sense of e.g. a quantum computer big enough to break RSA existing this
    year.

    The one I was responding to does.

    However the risk is real enough to be worth actively mitigating,
    both because we may not know when the first one is deployed and due to
    the “harvest now, decrypt later” strategy.

    It depends on what you want to protect. Sometimes the information only
    needs to stay secret for a short period. Sometimes you never want
    anyone to know something, so yes, it's better to assume the quantum
    computer is already there; that way there won't be a surprise later.

    But quantum computers aren't designed only to break RSA. And for other
    ways of using them, I'm far from sure waiting for them is always the
    best solution.

    --
    If you have some time to waste:
    https://scarpet42.gitlab.io

  • From Richard Kettlewell@21:1/5 to sc@fiat-linux.fr on Sun Apr 6 23:14:16 2025
    Stéphane CARPENTIER <sc@fiat-linux.fr> writes:
    On 06-04-2025, Richard Kettlewell <invalid@invalid.invalid> wrote:
    Stéphane CARPENTIER <sc@fiat-linux.fr> writes:
    On 06-04-2025, Farley Flud <ff@linux.rocks> wrote:
    On 06 Apr 2025 08:59:33 GMT, Stéphane CARPENTIER wrote:
    On 05-04-2025, Farley Flud <ff@linux.rocks> wrote:
    No, but we can move to quantum computing, which may become
    a reality before too long.

    I heard about that before I was born.

    In the US, the NIST is already researching algorithms for "post-quantum cryptography:"

    https://csrc.nist.gov/projects/post-quantum-cryptography

    Yes, the algorithms are farther away from the computers. Doesn't that
    ring a bell?

    Not quite sure what the argument is here,

    I mean the algorithms are very well advanced. They just need a computer
    to switch from ready to usable.

    They do not. PQC algorithms run on ordinary computers.

    but “already researching” is severely behind the times. Multiple PQC
    algorithms are well past the research stage, with finalized standards
    published in August and a couple more on the way. Adaptation of
    higher-level standards (APIs, PKI, etc) and adoption leading to
    deployment is well underway.

    Yep. The algorithms. For the big computers, it's another story.

    I suspect you (and perhaps the previous poster) have misunderstood what post-quantum cryptography is.
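    For what it's worth, the point is easy to demonstrate: post-quantum schemes are ordinary classical software. Below is a toy Lamport one-time signature, a conceptual ancestor of the standardized hash-based PQC signatures. This is a sketch only, not a production scheme, but note that it is built from nothing more exotic than SHA-256:

```python
import hashlib
import secrets

def keygen():
    # Secret key: 256 pairs of random 32-byte strings (one pair per digest bit).
    sk = [[secrets.token_bytes(32), secrets.token_bytes(32)] for _ in range(256)]
    # Public key: the SHA-256 hash of every secret.
    pk = [[hashlib.sha256(s).digest() for s in pair] for pair in sk]
    return sk, pk

def sign(message, sk):
    # For each bit of the message digest, reveal the matching secret.
    digest = hashlib.sha256(message).digest()
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return [sk[i][b] for i, b in enumerate(bits)]

def verify(message, sig, pk):
    # Hash each revealed secret and compare against the published hash.
    digest = hashlib.sha256(message).digest()
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return all(hashlib.sha256(s).digest() == pk[i][b]
               for i, (s, b) in enumerate(zip(sig, bits)))
```

    Signing a second message with the same key leaks secrets, which is why real hash-based schemes layer many one-time keys; the point here is only that no quantum hardware appears anywhere in a post-quantum algorithm.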

    --
    https://www.greenend.org.uk/rjk/

  • From c186282@21:1/5 to All on Sun Apr 6 21:42:45 2025
    Ummm ... given option ... I'd rewrite SSA/IRS using
    one of the BSDs (maybe a commercial version) as the
    OS base ... not Linux. The BSDs are 'more stable'
    while Linux has a zillion distros/variants and now
    changes fundamental things too often, chasing the
    GUI 'customer base' too zealously.

    Hell, FreeBSD distros don't even COME with a GUI,
    you have to install it afterwards and tweak it in.

    Apple bought a commercial Unix and tweaked that into
    its current OS. RUMORS that M$ is on a similar path,
    albeit step-wise. Big Govt can/should do the same.

    Just sayin' ...

    Well, they could always use Plan-9 ... which ain't
    awful AND designed to link lots of servers and
    such together, kinda wasted on a single box. Saw
    a thing a couple years ago where the team was
    celebrating porting '9' over to those big black
    IBM super-server clusters. Those would run a
    large govt entity with room to spare.

  • From c186282@21:1/5 to Richard Kettlewell on Sun Apr 6 22:49:53 2025
    On 4/6/25 1:38 PM, Richard Kettlewell wrote:
    Stéphane CARPENTIER <sc@fiat-linux.fr> writes:
    On 06-04-2025, Farley Flud <ff@linux.rocks> wrote:
    On 06 Apr 2025 08:59:33 GMT, Stéphane CARPENTIER wrote:
    On 05-04-2025, Farley Flud <ff@linux.rocks> wrote:

    No, but we can move to quantum computing, which may become
    a reality before too long.

    I heard about that before I was born.

    In the US, the NIST is already researching algorithms for "post-quantum
    cryptography:"

    https://csrc.nist.gov/projects/post-quantum-cryptography

    Yes, the algorithms are farther away from the computers. Doesn't that
    ring a bell?

    Not quite sure what the argument is here, but “already researching” is severely behind the times. Multiple PQC algorithms are well past the
    research stage, with finalized standards published in August and a
    couple more on the way. Adaptation of higher-level standards (APIs, PKI,
    etc) and adoption leading to deployment is well underway.

    Quantum computing is definitely going to happen.

    Yes, I know. Soon. Very soon. It's almost there. I heard that before I
    was born.

    I’m not sure anyone thinks quantum computing is “almost there” in the sense of e.g. a quantum computer big enough to break RSA existing this
    year. However the risk is real enough to be worth actively mitigating,
    both because we may not know when the first one is deployed and due to
    the “harvest now, decrypt later” strategy.

    Ok ... WITHIN LIMITS ... quantum DOES work. It IS good
    for breaking some kinds of common encryption (there ARE
    now quantum-resistant crypto algos - but beware spook
    back-doors).

    Alas quantum is VERY iffy and errors are still a huge problem.
    That may NOT be totally solvable, given the indeterminacy
    issue in QM. Wonder if it's possible to leverage one
    kind of indeterminacy against another as compensation ?

    But, as a 'general solution' ... do NOT see quantum
    as a mainstream computing tech. Big corps will RENT
    it, for big $$$, to entities that CAN make good use
    of the technique. Is NOT gonna be running yer laptop
    or phone.

    SSA/IRS ... as said elsewhere, "Linux" is too all over
    the place now. A commercial Unix is the better OS base.
    That's what Apple did, and what Big G should do.

    The replacement CODE ... either Python or, horrors,
    a BASIC with a compiler. Easy, readable, fair base
    of experts. Buy one of those big black IBM mainframe
    clusters and PARTY ON.

    Note that Plan-9 was ported over to those ... and
    it's designed for big/distributed systems.

  • From Rich@21:1/5 to c186282@nnada.net on Mon Apr 7 03:33:43 2025
    c186282 <c186282@nnada.net> wrote:
    Ummm ... given option ... I'd rewrite SSA/IRS using
    one of the BSDs (maybe a commercial version) as the

    The problem that will be encountered in "rewriting" SSA or IRS is not
    the software.

    The problem area is the 'rule book' defining what the software is to
    do. The first problem is, there is no single "rule book" with which to
    refer. It is all spread over thousands of statutes that themselves
    have been patched plural (millions?) of times throughout the years both
    SSA and IRS have been around. If one could collect all the 'rules' of
    what should happen given specific inputs together into a single 'rule
    book' and print it out the result would likely be a 6 foot high stack
    of double sided US letter sheets of paper.

    And the rules will be things like (made up, but the actual rules are
    just as arcane):

    Person X receives 4.75% of their total SSA payments over their lifetime
    as pension, unless they are also a veteran, in which case they receive
    6.25%, but if they served in the Airborne rangers from 1975 to 1982
    they get an additional 1.27%, however if they also worked for the NSA
    from 1987 to 1993 they receive 1.87% less. However, for payments from
    1957 to 1962, they receive a 3.2% bonus, but for payments made from
    1967 to 1974 they take a 1.4% penalty. Further, if the payments were
    for self employment income from 1975 to 1986 they get a 3.2% bonus.
    Etc.
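    A sketch of what encoding just that made-up rule might look like (all names and rates are hypothetical, taken straight from the example above), to make the combinatorial flavor concrete:

```python
from dataclasses import dataclass, field

@dataclass
class Payment:
    year: int
    amount: float
    self_employment: bool = False

@dataclass
class Person:
    veteran: bool = False
    airborne_ranger_years: range = range(0)
    nsa_years: range = range(0)
    payments: list = field(default_factory=list)

def pension_rate(p: Person) -> float:
    # Base rate, bumped for veterans, then patched by service-specific riders.
    rate = 6.25 if p.veteran else 4.75
    if any(1975 <= y <= 1982 for y in p.airborne_ranger_years):
        rate += 1.27
    if any(1987 <= y <= 1993 for y in p.nsa_years):
        rate -= 1.87
    return rate

def payment_adjustment(pay: Payment) -> float:
    # Era-specific bonuses and penalties, exactly as arbitrary as the statutes.
    adj = 0.0
    if 1957 <= pay.year <= 1962:
        adj += 3.2
    if 1967 <= pay.year <= 1974:
        adj -= 1.4
    if pay.self_employment and 1975 <= pay.year <= 1986:
        adj += 3.2
    return adj

def pension(p: Person) -> float:
    return sum(pay.amount * (pension_rate(p) + payment_adjustment(pay)) / 100
               for pay in p.payments)
```

    Even this toy version needs per-person and per-payment conditions; the real statutes multiply clauses like these across decades of amendments.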

    Think about the arcane tax rules for what numbers to put where on the
    tax forms every year, the SSA rules are very much like the tax rules
    (because both have been created, piecemeal, over the course of decades,
    by different politicians getting patches to the statutes through
    congress).

    The problem that will be encountered is that the existing code base
    has been built up over the decades in concert with the politicians
    making changes, so both evolved in concert, and each change was
    incremental at the time. But trying to rewrite it all from the ground
    up is going to quickly hit the quagmire of exponential complexity just
    to understand all the rules about what to do when for some payment Y
    (or for some tax filing Z). The result will be something that either
    screws up royally at every result, or simply omits 95+% of all the
    arcane, interdependent things the congress folk have added to the
    statutes over the decades (and someone loses their SS payment the
    statutes say they should receive).

  • From c186282@21:1/5 to Rich on Mon Apr 7 01:04:50 2025
    On 4/6/25 11:33 PM, Rich wrote:
    c186282 <c186282@nnada.net> wrote:
    Ummm ... given option ... I'd rewrite SSA/IRS using
    one of the BSDs (maybe a commercial version) as the

    The problem that will be encountered in "rewriting" SSA or IRS is not
    the software.


    Ummmmm ... kinda. It HAS to work, kinda flawlessly
    from the start. The political fallout from errors
    is TOO much.


    The problem area is the 'rule book' defining what the software is to
    do. The first problem is, there is no single "rule book" with which to refer. It is all spread over thousands of statutes that themselves
    have been patched plural (millions?) of times throughout the years both
    SSA and IRS have been around. If one could collect all the 'rules' of
    what should happen given specific inputs together into a single 'rule
    book' and print it out the result would likely be a 6 foot high stack
    of double sided US letter sheets of paper.


    "Federal laws/regs" have been tweaked and re-tweaked
    by politicians since the inception. They respond to
    'pressure groups', the 'squeaky wheel', who want SOME
    special path/exemption. This is Politics-As-Usual.

    The PROB is when you get like a CENTURY of this heaped
    upon itself .......

    How DO you write software for such a MESS ???


    And the rules will be things like (made up, but the actual rules are
    just as arcane):

    Person X receives 4.75% of their total SSA payments over their lifetime
    as pension, unless they are also a veteran, in which case they receive
    6.25%, but if they served in the Airborne rangers from 1975 to 1982
    they get an additional 1.27%, however if they also worked for the NSA
    from 1987 to 1993 they receive 1.87% less. However, for payments from
    1957 to 1962, they receive a 3.2% bonus, but for payments made from
    1967 to 1974 they take a 1.4% penalty. Further, if the payments were
    for self employment income from 1975 to 1986 they get a 3.2% bonus.
    Etc.

    Think about the arcane tax rules for what numbers to put where on the
    tax forms every year, the SSA rules are very much like the tax rules
    (because both have been created, piecemeal, over the course of decades,
    by different politicians getting patches to the statutes through
    congress).

    The problem that will be encountered is that the existing code base
    has been built up over the decades in concert with the politicians
    making changes, so both evolved in concert, and each change was
    incremental at the time. But trying to rewrite it all from the ground
    up is going to quickly hit the quagmire of exponential complexity just
    to understand all the rules about what to do when for some payment Y
    (or for some tax filing Z). The result will be something that either
    screws up royally at every result, or simply omits 95+% of all the
    arcane, interdependent things the congress folk have added to the
    statutes over the decades (and someone loses their SS payment the
    statutes say they should receive).

    The "existing code base" is mostly 60s COBOL so far as
    I can tell.

    It's been tweaked and tweaked and tweaked until it's
    a DISASTER nobody REALLY knows how to deal with.

    Whatever replaces it not only has to accommodate all
    the (oft ridiculous/illogical) tweaks but easily deal
    with any NEW tweaks in a sensible organized fashion.

    NOT easy at all.

    Hey, this is GOVERNMENT stuff ... not NASA calx.
    SUBJECTIVE results are paramount.

    Just one of those big black IBM mainframe clusters
    could run entire govt agencies - indeed even tie them
    together in an organized fashion. BUT, gotta have
    sensible underlying software/systems built on
    comprehensible, expandable, code.

    IS this possible - or did we create a MONSTER
    in the 60s ?

    The never-ending political tweaks ... a rule base
    COULD be implemented. This WILL include from/to/
    situational dates of relevance and such. GOTTA
    be easy to add new/over-riding rules.
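    One hedged way to get that "easy to add new/over-riding rules" property is an ordered rule table where the most recent applicable rule wins; a toy sketch, with entirely made-up rules and rates:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    effective_year: int                 # when the amendment took effect
    applies: Callable[[dict], bool]     # who the rule covers
    rate: float                         # made-up benefit rate

# Appending a later entry overrides earlier ones for whoever it covers.
rules = [
    Rule(1957, lambda p: True, 4.75),                 # original statute
    Rule(1975, lambda p: bool(p.get("veteran")), 6.25),  # later amendment
]

def rate_for(person: dict, year: int) -> float:
    # Of all rules in force by `year` that cover this person,
    # the newest one wins.
    matching = [r for r in rules
                if r.effective_year <= year and r.applies(person)]
    return max(matching, key=lambda r: r.effective_year).rate
```

    New tweaks become new appended rows instead of edits scattered through the code, which is roughly what "sensible, organized fashion" demands.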

    IMHO - I'm still gonna rec a UNIX base OS, then
    Python, maybe even BASIC, as the code base. All
    easy, no BS, no mystery, kinda self-doc. This
    should be good for another 50 years.

    After that ... it's gonna be all 'AI' and humans
    won't "get it" at all. Magic .......

    Anyway, anyone who thinks this is all EASY is
    full of it.

    Oh, valid arg, do you try to re-implement the
    current code base - or instead emulate what
    it DOES ? After consideration I tend towards
    the latter solution. Re-writing all that old
    COBOL would likely NOT yield good results ...
    so instead observe what it DOES - and write
    new code to do the same thing by alt means.
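    That "emulate what it DOES" approach is what testers call characterization (or golden-master) testing: record the legacy system's outputs over real inputs, then hold the rewrite to those recordings. A minimal sketch, with legacy_calc as a hypothetical stand-in for querying the old COBOL system:

```python
def legacy_calc(wages):
    # Stand-in: imagine this is really a query against the live legacy system.
    return round(wages * 0.0475, 2)

def new_calc(wages):
    # The rewrite under test: it must reproduce recorded legacy behavior,
    # arcane quirks included.
    return round(wages * 0.0475, 2)

# Step 1: record the legacy outputs over a large corpus of real inputs.
cases = [10_000, 55_000, 123_456]
golden = {w: legacy_calc(w) for w in cases}

# Step 2: regression-check the rewrite against the recorded "golden" answers.
mismatches = [w for w in cases if new_calc(w) != golden[w]]
```

    The recordings capture the behavior without anyone having to read the old COBOL; every mismatch is either a rewrite bug or a legacy quirk that needs a deliberate decision.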

  • From The Natural Philosopher@21:1/5 to All on Mon Apr 7 11:56:10 2025
    On 07/04/2025 03:49, c186282 wrote:
    A commercial Unix is the better OS base.
      That's what Apple did, and what Big G should do.

    IIRC Apple OS X is based on FreeBSD


    --
    In todays liberal progressive conflict-free education system, everyone
    gets full Marx.

  • From c186282@21:1/5 to The Natural Philosopher on Mon Apr 7 07:08:30 2025
    On 4/7/25 6:56 AM, The Natural Philosopher wrote:
    On 07/04/2025 03:49, c186282 wrote:
    A commercial Unix is the better OS base.
       That's what Apple did, and what Big G should do.

    IIRC Apple OS X is based on FreeBSD

    But a 'commercial' variant - kinda like RHEL is
    a commercial Linux variant. Apple PAID for it.

  • From The Natural Philosopher@21:1/5 to All on Mon Apr 7 12:20:47 2025
    On 07/04/2025 12:08, c186282 wrote:
    On 4/7/25 6:56 AM, The Natural Philosopher wrote:
    On 07/04/2025 03:49, c186282 wrote:
    A commercial Unix is the better OS base.
       That's what Apple did, and what Big G should do.

    IIRC Apple OS X is based on FreeBSD

      But a 'commercial' variant - kinda like RHEL is
      a commercial Linux variant. Apple PAID for it.
    Evidence?

    --
    "If you don’t read the news paper, you are un-informed. If you read the
    news paper, you are mis-informed."

    Mark Twain

  • From Farley Flud@21:1/5 to All on Tue Apr 8 11:59:33 2025
    XPost: comp.os.linux.advocacy

    On Mon, 07 Apr 2025 17:59:45 -0400, c186282 wrote:


    Indeed ! However ... probably COULD be done, it's
    a bunch of shifting values - input to some accts,
    calx ops, shift to other accts ....... lots and
    lots of rheostats ........


    How is conditional branching (e.g. an if-then-else statement)
    to be implemented with analog circuits? It cannot be
    done.

    Analog computers are good for modelling systems that are
    described by differential equations. Adders, differentiators,
    and integrators can all be easily implemented with electronic
    circuits. But beyond differential equation systems analog
    computers are useless.

    The Norden bombsight of WWII was an electro-mechanical
    computer. Its job was to calculate the trajectory of
    a bomb released by an aircraft, and the trajectory is described
    by a differential equation.

    One of my professors told a story about a common "analog"
    practice among engineers of the past. To calculate an integral,
    which can be described as the area under a curve, they would plot
    the curve on well made paper and then cut out (with scissors)
    the plotted area and weigh it (on a lab balance). The ratio
    of the cut-out area with a unit area of paper would be the
    value of the integral. (Multi-dimensional integrals would
    require carving blocks of balsa wood or a similar material.)

    Of course it worked but today integration is easy to perform
    to unlimited accuracy using digital means.
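    The scissors-and-balance trick is quadrature by mass; the digital equivalent is routine numerical integration. A short sketch using composite Simpson's rule:

```python
import math

def simpson(f, a, b, n=1000):
    # Composite Simpson's rule over [a, b]; n must be even.
    h = (b - a) / n
    total = f(a) + f(b)
    total += 4 * sum(f(a + i * h) for i in range(1, n, 2))
    total += 2 * sum(f(a + i * h) for i in range(2, n, 2))
    return total * h / 3

# Area under sin(x) from 0 to pi is exactly 2.
area = simpson(math.sin, 0.0, math.pi)
```

    The cut-out-and-weigh ratio and the Simpson sum estimate the same area under the curve; the digital version simply delivers as many significant figures as you care to pay for in subdivisions.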


    --
    Hail Linux! Hail FOSS! Hail Stallman!
