On Thu, 1 Jan 2026 23:54:37 -0000 (UTC), Waldek Hebisch wrote:
IMO the biggest drawback of Turbo Pascal was poor speed of generated code
(and size too). For me the deal breaker was the fact that Turbo Pascal was
16-bit and tied to DOS. DJGCC gave me 32-bit integers and slightly
later I switched to Linux, so Turbo Pascal was no longer relevant to
me. But if you were programming 16-bit DOS and did not mind poor speed
of generated code, then IMO Turbo Pascal was quite a decent programming
language, quite competitive in expressivity with C.
I never used the DOS TurboPascal, only the CP/M version. I used the BDS C subset compiler on CP/M and moved to DJGPP eventually.
On 1/1/26 16:54, Waldek Hebisch wrote:
In alt.folklore.computers The Natural Philosopher <tnp@invalid.invalid>
wrote:
On 01/01/2026 14:28, Peter Flass wrote:
On 1/1/26 05:49, The Natural Philosopher wrote:
On 01/01/2026 03:07, c186282 wrote:
On 12/31/25 17:35, The Natural Philosopher wrote:
On 31/12/2025 19:21, c186282 wrote:
I've writ stuff with five or six levels of nesting
  but don't like it, usually if/then/else stuff. Oft re-did
  it later to be more easy to follow. IMHO
  readability/comprehensibility is as important as
  functionally correct code.
100% agree.
Often write little functions that are only called once. Merely to
lexically separate atomic functional blocks.
No idea whether the compiler/linker inlines them or not.
There is nothing worse than making top level decisions followed by
some nitty detail to detect some low level error.
e.g. assume a call to allocate memory always works or the call
will do the appropriate jump to a global error handler to abort
things cleanly.
The point of structure was supposed to be to elucidate program
flow,
not obscure it with elegant formally correct cruft.
  Agree.
  As I've said before, I'm still quite fond of Pascal and write
  apps of various size in it (oft first proto-ed in Python).
  The structure is 'elegant', but you CAN carry it TOO far, to
  where it gets in the way instead of helping things.
My one and only experience of trying to make Pascal do what was
trivial in 'C' led me to resolve never ever to touch it again.
If you are trying to write - as it turned out I was - a disk driver
in Pascal, where a given sector may be a byte stream, a series of 16
bit integers, or a structure defined by the first few bytes in the
sector, you end up with a massive union that is so cumbersome it is
almost impossible to read - let alone use.
Doesn't Pascal have variant records?
IIRC it (Turbo Pascal, the amateurs' language) had unions of some sort,
but I would have needed about 100 to cover all cases and it was even
then messy.
Turbo Pascal could do essentially all that C could do (and do things
which were not straightforward in C, but this is irrelevant here). And
do this in a very similar way, once you knew how Turbo Pascal
constructs worked. If you really needed 100 variant records in Turbo
Pascal, then you needed 100 unions in C. If you could do this more
simply in C, you could do this more simply in Turbo Pascal too.
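To make the claimed equivalence concrete, here is a minimal sketch (all
names and field sizes hypothetical, not anyone's actual driver) of the
kind of sector overlay being argued about, as it might look in C:

    /* One 512-byte sector viewed three ways at once; a sketch only. */
    #include <stdint.h>

    union sector {
        uint8_t  bytes[512];          /* raw byte stream             */
        uint16_t words[256];          /* series of 16-bit integers   */
        struct {
            uint8_t  kind;            /* leading byte tags the format */
            uint8_t  flags;
            uint16_t length;
            uint8_t  payload[508];
        } hdr;                        /* structure keyed on first bytes */
    };

The Turbo Pascal variant-record spelling is line-for-line comparable,
which is the point being made: one overlay per layout, in either
language.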
Given what you wrote, it looks like you simply lacked experience
writing Turbo Pascal. In other words, you were unqualified to do the
job that you were supposed to do (write the driver in Turbo Pascal), so
you decided to do the thing that you knew how to do, that is to write it
in C.
IMO the biggest drawback of Turbo Pascal was poor speed of generated code
(and size too). For me the deal breaker was the fact that Turbo Pascal was
16-bit and tied to DOS. DJGCC gave me 32-bit integers and slightly
later I switched to Linux, so Turbo Pascal was no longer relevant to
me. But if you were programming 16-bit DOS and did not mind poor speed
of generated code, then IMO Turbo Pascal was quite a decent programming
language, quite competitive in expressivity with C.
Now there's Free Pascal. I'm not a Pascal programmer, but I admit I was impressed when I looked at what's in the package.
On 01/01/2026 02:50, c186282 wrote:
On 12/31/25 17:28, The Natural Philosopher wrote:
On 31/12/2025 16:46, c186282 wrote:
On 12/31/25 10:08, The Natural Philosopher wrote:
On 31/12/2025 14:21, c186282 wrote:
I'm not qualified to fine-critique Penrose. However
  when he insisted brains MUST be quantum ... some
  little red light went off in my head.
Yes. To anyone who has studied Kant, it is clear that it is the
mind that invented 'quantum theory'...so to make it an emergent
property of its own creation, is the wrong sort of feedback
  Well, you can argue that the QM nature of brain/mind
  always existed - but it's only just now we (Penrose)
  figured it out. :-)
You could, but I wouldn't.
QM is just another invention of the mind. What it refers to may well
not be of the mind though.
And it makes the analysis simpler to consider that it is not.
  Let's say things SEEM to be 'quantum'. But then
  we're little 3-D beings barely out of the trees
  and still sometimes throw shit at each other.
  Strictly, everything is 'quantum' anyhow, protons,
  electrons, quarks, everything.
No, that is a *metaphysical* assumption. We can assume it pro tem to
see where it gets us. Into a right bugger's muddle. Along with Penrose.
Assume instead that consciousness is absolutely independent of
quantum reality and redraw the relationships.
  That we both seem to agree on ... at least insofar
  as 'mind' goes. The 'material' stuff of brains,
  there, so far as we can tell, quantum defines its
  existence/actions at the ultra-fine level, but
  we can have 'consciousness' without having to
  worry about that tiny stuff.
That is your metaphysical assumption. It doesn't make it true.
If you examine the matter at the most fundamental level, you discover
that all classical science and the classical worldview implicitly
depend on the concept of the 'detached observer', i.e. a consciousness
that stands outside of that which it observes and whose observations do
not affect the thing under observation.
It is *defined* to be immaterial. A late-model version of the 'immortal
soul'. That is, the concept of this immaterial and immortal entity that
stands outside of time and space peering in is *implicit* in the
classical worldview.
And yet scientists want to make it an emergent property of the worldview
it studies.
That can't be done without contradiction.
  Computers can be made to compute using quite a number
  of physical media - hell, you could make a 'hydraulic
  computer' if you had the space, one out of wooden parts,
  and it would be as accurate as any 2nm transistor model.
  The logic is the logic, independent of the means.
  Neuron networks are just another 'means'.
Well that is one rather less sophisticated version of the same thing,
yes. What comprises the material world is real, but not as we know it,
Jim.
It all becomes simpler.
  I suspect we're drifting towards Buddhism here ... and
  I learned long ago to bail out once a certain level of
  'metaphysics' creeps in :-)
It is a *transform* of it. And the agency doing that transform is the mind/consciousness/spirit/soul or whatever BS name you want to refer to
it by.
That is the minimum number of elements *necessary* for an entity to
become aware of an externality.
Something that has been blindingly obvious for thousands of years.
  Gimme what demonstrably WORKS, what is USEFUL. Fuck
  args about the 'fine context(s)/interpretation(s)',
  the "Game Of Nuances and Twisted Semantics". People
  have been at this for many thousands of years,
  endlessly re-arranging an arcade of fun-house mirrors,
  "If you look at it all like THIS you shall find
  the Great Truth" ........
Well all science is ultimately about what (seems to) work. The problem
of consciousness is that it doesn't work 'like wot it orta'.
Hence the need for a different metaphysical rule set to accommodate it.
Just as Einstein had to rewrite the concept of absolute space and time. Because the experimental results didn't make sense otherwise.
The transcendental idealism of Kant et al makes it all work, but at the expense of completely abandoning the classical world of everyday sense
as *primary*.
And sticking human consciousness as more primary, in its place.
Which is unacceptable to the vast number of scientists reared on the
creed of material realism.
Hence the dichotomy. And hand waving of consciousness as 'just quantum
shit, or something'
I won't ever have a smart speaker, and I'll be damned if I'm going to
have a vacuum cleaner that cases the joint and reports back to the
mother ship. Besides, I have better ways to entertain the cats.
Back when electronics became cheap, remember how clocks were
incorporated into just about everything? I had a ball-point pen with a
clock in it.
NNs are 'different'. Not 'expert', not 'fuzzy', not LLM. A little
closer to how biological brains work. The bitch has been finding
suitable elements that can be compactly put on chips. They're getting
better at that. Maybe 10 years and decently good 'AI' will fit INSIDE
a bot instead of a 20 acre gigawatt data center.
Turbo Pascal for CP/M-86 could access the graphics hardware on the
DEC Rainbow. A niche to be sure, but one my CSCI graphics class did
its projects in.
On 01/01/2026 03:07, c186282 wrote:
On 12/31/25 17:35, The Natural Philosopher wrote:
On 31/12/2025 19:21, c186282 wrote:
I've writ stuff with five or six levels of nesting
  but don't like it, usually if/then/else stuff. Oft
  re-did it later to be more easy to follow. IMHO
  readability/comprehensibility is as important as
  functionally correct code.
100% agree.
Often write little functions that are only called once. Merely to
lexically separate atomic functional blocks.
No idea whether the compiler/linker inlines them or not.
There is nothing worse than making top level decisions followed by
some nitty detail to detect some low level error.
e.g. assume a call to allocate memory always works or the call will
do the appropriate jump to a global error handler to abort things
cleanly.
The point of structure was supposed to be to elucidate program flow,
not obscure it with elegant formally correct cruft.
  Agree.
  As I've said before, I'm still quite fond of Pascal and
  write apps of various size in it (oft first proto-ed
  in Python). The structure is 'elegant', but you CAN
  carry it TOO far, to where it gets in the way instead
  of helping things.
My one and only experience of trying to make Pascal do what was trivial
in 'C' led me to resolve never ever to touch it again.
If you are trying to write - as it turned out I was - a disk driver in
Pascal, where a given sector may be a byte stream, a series of 16 bit
integers, or a structure defined by the first few bytes in the sector,
you end up with a massive union that is so cumbersome it is almost
impossible to read - let alone use.
C's ability to say if this byte is such and such then what follows may
be considered to be a structure, or else 17 integers, or else a text
string... the point being that the people who constructed the software
that wrote to the (ram) disk didn't write in Pascal. They wrote in
Assembler. They had AFAICT ripped off CP/M.
I threw the Pascal out and rewrote everything in a French B & B over
the weekend. In C. Probably the best work I ever did.
For which the guy who I did it for didn't pay me till I took him to court.
Whereas the best money I ever made was to go to London and get paid £450
to snip the leg on one capacitor...
On 1/1/26 05:49, The Natural Philosopher wrote:
On 01/01/2026 03:07, c186282 wrote:
On 12/31/25 17:35, The Natural Philosopher wrote:
On 31/12/2025 19:21, c186282 wrote:
I've writ stuff with five or six levels of nesting
  but don't like it, usually if/then/else stuff. Oft
  re-did it later to be more easy to follow. IMHO
  readability/comprehensibility is as important as
  functionally correct code.
100% agree.
Often write little functions that are only called once. Merely to
lexically separate atomic functional blocks.
No idea whether the compiler/linker inlines them or not.
There is nothing worse than making top level decisions followed by
some nitty detail to detect some low level error.
e.g. assume a call to allocate memory always works or the call will
do the appropriate jump to a global error handler to abort things
cleanly.
The point of structure was supposed to be to elucidate program flow,
not obscure it with elegant formally correct cruft.
  Agree.
  As I've said before, I'm still quite fond of Pascal and
  write apps of various size in it (oft first proto-ed
  in Python). The structure is 'elegant', but you CAN
  carry it TOO far, to where it gets in the way instead
  of helping things.
My one and only experience of trying to make Pascal do what was
trivial in 'C' led me to resolve never ever to touch it again.
If you are trying to write - as it turned out I was - a disk driver in
Pascal, where a given sector may be a byte stream, a series of 16 bit
integers, or a structure defined by the first few bytes in the
sector, you end up with a massive union that is so cumbersome it is
almost impossible to read - let alone use.
Doesn't Pascal have variant records?
On 2026-01-01 15:28, Peter Flass wrote:
On 1/1/26 05:49, The Natural Philosopher wrote:
On 01/01/2026 03:07, c186282 wrote:
On 12/31/25 17:35, The Natural Philosopher wrote:
On 31/12/2025 19:21, c186282 wrote:
  Agree.
  As I've said before, I'm still quite fond of Pascal and
  write apps of various size in it (oft first proto-ed
  in Python). The structure is 'elegant', but you CAN
  carry it TOO far, to where it gets in the way instead
  of helping things.
My one and only experience of trying to make Pascal do what was
trivial in 'C' led me to resolve never ever to touch it again.
If you are trying to write - as it turned out I was - a disk driver
in Pascal, where a given sector may be a byte stream, a series of 16
bit integers, or a structure defined by the first few bytes in the
sector, you end up with a massive union that is so cumbersome it is
almost impossible to read - let alone use.
Doesn't Pascal have variant records?
Free Pascal at least does.
https://www.freepascal.org/docs-html/ref/refsu15.html
I have a book somewhere that came with a floppy, and it had several
examples of using files with variant parts. It was easy.
On Thu, 1 Jan 2026 10:30:54 -0000 (UTC), Waldek Hebisch wrote:
But are 'expert systems' really AI?
What is really "AI"? At one point, the argument was over whether
computers could "think". Then you had to define "thinking", and
somebody tried to settle the question by saying: "thinking is what
computers cannot do".
The only succinct definition of "AI" I ever saw was: "solving NP
problems in polynomial time".
According to c186282 <c186282@nnada.net>:
If you know something ABOUT 'the pad' - like how
many letters/numbers and how it's used - that may
offer some attack options, at least narrow things
down a bit.
No, a real OTP is unbreakable. The problem is that for every byte of
message you need a byte of key, so distributing the keys and using
them correctly is a logistical nightmare.
According to c186282 <c186282@nnada.net>:
If you know something ABOUT 'the pad' - like how
many letters/numbers and how it's used - that may
offer some attack options, at least narrow things
down a bit.
No, a real OTP is unbreakable. The problem is that for every byte
of message you need a byte of key, so distributing the keys and
using them correctly is a logistical nightmare.
Venona decrypted Soviet messages that used OTPs because
some of the putative OTPs in fact were used more than once,
which was enough to let the US crack them.
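The reuse weakness is easy to demonstrate. XOR-ing two ciphertexts that
share a pad cancels the key entirely, leaving the XOR of the two
plaintexts, which classical cribbing can unravel. A minimal sketch in C
(toy strings and a hypothetical pad, not Venona's actual traffic):

    #include <stdio.h>

    int main(void) {
        const unsigned char pad[] = "THISPADISUSEDTWICE";  /* reused key */
        const unsigned char *p1 = (const unsigned char *)"ATTACK AT DAWN    ";
        const unsigned char *p2 = (const unsigned char *)"RETREAT AT ONCE   ";
        unsigned char c1[18], c2[18];

        for (int i = 0; i < 18; i++) {
            c1[i] = p1[i] ^ pad[i];
            c2[i] = p2[i] ^ pad[i];
        }
        /* the pad drops out: c1^c2 == p1^p2, no key needed at all */
        for (int i = 0; i < 18; i++)
            if ((c1[i] ^ c2[i]) != (p1[i] ^ p2[i]))
                return 1;
        puts("pad cancelled: c1 XOR c2 equals p1 XOR p2");
        return 0;
    }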
On Thu, 1 Jan 2026 23:54:37 -0000 (UTC), Waldek Hebisch wrote:
IMO the biggest drawback of Turbo Pascal was poor speed of generated code
(and size too). For me the deal breaker was the fact that Turbo Pascal was
16-bit and tied to DOS. DJGCC gave me 32-bit integers and slightly
later I switched to Linux, so Turbo Pascal was no longer relevant to
me. But if you were programming 16-bit DOS and did not mind poor speed
of generated code, then IMO Turbo Pascal was quite a decent programming
language, quite competitive in expressivity with C.
I never used the DOS TurboPascal, only the CP/M version. I used the BDS C subset compiler on CP/M and moved to DJGPP eventually.
John Levine <johnl@taugh.com> writes:
According to c186282 <c186282@nnada.net>:
If you know something ABOUT 'the pad' - like how
many letters/numbers and how it's used - that may
offer some attack options, at least narrow things
down a bit.
No, a real OTP is unbreakable. The problem is that for every byte of
message you need a byte of key, so distributing the keys and using
them correctly is a logistical nightmare.
OTPs are broken in the sense that they are malleable. It's easy for an
attacker to modify the encrypted message, if they know anything about
its expected structure.
For example, an encrypted financial transaction is likely to have the
amount of money to be sent at a predictable offset, so all the attacker
needs to do is flip one of the higher bits in that field and the victim spends a great deal more money than they intended. If the pad is applied using XOR (a natural approach today) then they can achieve that by
flipping the corresponding bit in the ciphertext.
The need for symmetric encryption systems to include a MAC to prevent
this kind of issue has been understood for a long time.
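A sketch of that attack in miniature, assuming the XOR application
mentioned above (the offset, key byte, and values are all invented):
the attacker flips a ciphertext bit and the same plaintext bit flips on
decryption, with no knowledge of the key required.

    #include <stdio.h>

    int main(void) {
        unsigned char amount = 0x05;          /* plaintext: pay 5 units  */
        unsigned char key    = 0xA7;          /* pad byte at that offset */
        unsigned char ct     = amount ^ key;  /* what travels the wire   */

        ct ^= 0x80;               /* attacker flips a high bit in transit */

        /* victim decrypts: 0x05 ^ 0x80 = 0x85 = 133 units */
        printf("victim sees: %u units\n", (unsigned)(ct ^ key));
        return 0;
    }

A MAC over the ciphertext, as mentioned above, is what detects this.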
If you really needed 100 variant records in Turbo Pascal,
then you needed 100 unions in C.
BTW: It is normal and common for programmers to want to
rewrite/write from scratch instead of understanding and
improving existing code. But in most cases working on
existing code leads to better results.
Exactly my experience.
  TP let you write, test, re-write, test ... in
  mere MINUTES and helped you along all the way.
As I said, the amateurs' language. BASIC in all but name.
Similarly, politicians dream of re-arranging laws (and adding more,
of course, never repealing) in pursuit of the dream that the right combination of legislation will result in Paradise.
NNs are 'different'. Not 'expert', not 'fuzzy', not LLM.
  A little closer to how biological brains work. The bitch
  has been finding suitable elements that can be compactly
  put on chips. They're getting better at that. Maybe 10
  years and decently good 'AI' will fit INSIDE a bot instead
  of a 20 acre gigawatt data center.
On 1/2/26 00:59, rbowman wrote:
On Thu, 1 Jan 2026 23:54:37 -0000 (UTC), Waldek Hebisch wrote:
IMO the biggest drawback of Turbo Pascal was poor speed of generated code
(and size too). For me the deal breaker was the fact that Turbo Pascal was
16-bit and tied to DOS. DJGCC gave me 32-bit integers and slightly
later I switched to Linux, so Turbo Pascal was no longer relevant to
me. But if you were programming 16-bit DOS and did not mind poor speed
of generated code, then IMO Turbo Pascal was quite a decent programming
language, quite competitive in expressivity with C.
I never used the DOS TurboPascal, only the CP/M version. I used the BDS C
subset compiler on CP/M and moved to DJGPP eventually.
  Look ... consider the existing environment. It WAS
  the M$/IBM multi-pass Pascal compiler (still have
  that in a VM and DO use it once in awhile for fun).
  TP was a TOTAL REVOLUTION ... not only because of
  the integrated development environment but because
  of the BLAZING compilation speed.
  If/when the final code was a bit bigger than the
  old compilers - WHO CARED ???
  TP let you write, test, re-write, test ... in
  mere MINUTES and helped you along all the way.
  In short it SET THE STANDARD for how IDEs
  should be. From there on everybody expected
  equal or better.
  And yes, I love Pascal ... still use FPC/Lazarus
  quite a bit. There's just a certain 'elegance'
  to Pascal ... reminds of composing classical
  music somehow ........
On 01/01/2026 23:54, Waldek Hebisch wrote:
If you really needed 100 variant records in Turbo Pascal,
then you needed 100 unions in C.
No. You simply used *casting*.
k=*(int *)(buffer +4) etc etc.
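Spelled out a little (the buffer layout here is invented for
illustration), the cast-as-you-go style reinterprets one raw sector
buffer on the fly, with no union declared anywhere:

    #include <stdint.h>

    /* Sketch only, not the actual driver under discussion. The leading
       byte tags the format; later bytes are then read as whatever the
       tag says they are. */
    int sector_kind(const unsigned char *buffer) {
        return buffer[0];
    }

    uint16_t nth_word(const unsigned char *buffer, int n) {
        /* the "k=*(int *)(buffer +4)" idea, generalized; assumes
           alignment and byte order are non-issues, as on the 8086
           targets being discussed */
        return *(const uint16_t *)(buffer + 2 * n);
    }

On a modern compiler the strict-aliasing rules make memcpy() the safer
spelling of the same load, but compilers of that era did exactly what
the cast says.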
I see what you're aiming at - and it's largely true.
  There's long been the trend of trying to separate
  'the material' from some kind of higher/essential/
  'spiritual' take on things. Probably because life
  was for so long (STILL in lots of places) so SHITTY.
  But IMHO it's a delusion, emotional cherry pie.
  All are one and one is all ... a great Gordian knot.
On 02/01/2026 02:18, Charlie Gibbs wrote:
Similarly, politicians dream of re-arranging laws (and adding more,
of course, never repealing) in pursuit of the dream that the right
combination of legislation will result in Paradise.
You really think that they do?
In reality they would prefer to take the salary and the perks and do
fuck all. The best ones.
The worst ones are those with Big Beautiful Ideas.
Most problems that haven't been solved already are not amenable to
political interference anyway: the best thing is to give people the
freedom to sort them, themselves.
On 2026-01-02 11:37, c186282 wrote:
On 1/2/26 00:59, rbowman wrote:
On Thu, 1 Jan 2026 23:54:37 -0000 (UTC), Waldek Hebisch wrote:
IMO the biggest drawback of Turbo Pascal was poor speed of generated code
(and size too). For me the deal breaker was the fact that Turbo Pascal was
16-bit and tied to DOS. DJGCC gave me 32-bit integers and slightly
later I switched to Linux, so Turbo Pascal was no longer relevant to
me. But if you were programming 16-bit DOS and did not mind poor speed
of generated code, then IMO Turbo Pascal was quite a decent programming
language, quite competitive in expressivity with C.
I never used the DOS TurboPascal, only the CP/M version. I used the
BDS C
subset compiler on CP/M and moved to DJGPP eventually.
  Look ... consider the existing environment. It WAS
  the M$/IBM multi-pass Pascal compiler (still have
  that in a VM and DO use it once in awhile for fun).
  TP was a TOTAL REVOLUTION ... not only because of
  the integrated development environment but because
  of the BLAZING compilation speed.
  If/when the final code was a bit bigger than the
  old compilers - WHO CARED ???
I don't remember at what version, 4 or 6, the binary program became much
smaller. A HelloWorld was roughly 2 KB, while in C it was 28 KB. They
invented smart linking.
  TP let you write, test, re-write, test ... in
  mere MINUTES and helped you along all the way.
  In short it SET THE STANDARD for how IDEs
  should be. From there on everybody expected
  equal or better.
  And yes, I love Pascal ... still use FPC/Lazarus
  quite a bit. There's just a certain 'elegance'
  to Pascal ... reminds of composing classical
  music somehow ........
On 2026-01-02 11:59, The Natural Philosopher wrote:
On 01/01/2026 23:54, Waldek Hebisch wrote:
If you really needed 100 variant records in Turbo Pascal,
then you needed 100 unions in C.
No. You simply used *casting*.
k=*(int *)(buffer +4) etc etc.
Borland Pascal also had typecasting.
BYTE(MyChar)
On 1/2/26 06:27, Carlos E.R. wrote:
On 2026-01-02 11:59, The Natural Philosopher wrote:
On 01/01/2026 23:54, Waldek Hebisch wrote:
If you really needed 100 variant records in Turbo Pascal,
then you needed 100 unions in C.
No. You simply used *casting*.
k=*(int *)(buffer +4) etc etc.
Borland Pascal also had typecasting.
BYTE(MyChar)
  Yep. TP was a slight 'super-set' of Wirth Pascal.
  Cleaned up a few lackings. Wirth, though practical,
  was still kind of an 'academic' and didn't always
  address typical real-world problems. Easy type-casts
  made things a LOT better.
On 02/01/2026 11:54, c186282 wrote:
On 1/2/26 06:27, Carlos E.R. wrote:
On 2026-01-02 11:59, The Natural Philosopher wrote:
On 01/01/2026 23:54, Waldek Hebisch wrote:
If you really needed 100 variant records in Turbo Pascal,
then you needed 100 unions in C.
No. You simply used *casting*.
k=*(int *)(buffer +4) etc etc.
Borland Pascal also had typecasting.
BYTE(MyChar)
  Yep. TP was a slight 'super-set' of Wirth Pascal.
  Cleaned up a few lackings. Wirth, though practical,
  was still kind of an 'academic' and didn't always
  address typical real-world problems. Easy type-casts
  made things a LOT better.
How did it handle pointers...?
On Fri, 02 Jan 2026 02:18:42 GMT, Charlie Gibbs wrote:
Back when electronics became cheap, remember how clocks were
incorporated into just about everything? I had a ball-point pen with a clock in it.
I used those little round stick-ons to keep track of project hours. When
I couldn't find one I bought a $5 wrist watch at a flea market. The
department manager advised me I shouldn't leave a valuable watch by the
monitor. At least a blue stick-on didn't look like much.
A friend bought a very early calculator for several hundred 1970s dollars.
I must have pissed them all off but I have several calculators that were
in the begging letters from various organizations in lieu of mittens or
return address stickers. They must go for 10 cents in volume.
Damn! Nobody sent me a calendar! I'm going to have to buy one. Or not.
$ cal 1 2025
January 2025
Su Mo Tu We Th Fr Sa
1 2 3 4
5 6 7 8 9 10 11
12 13 14 15 16 17 18
19 20 21 22 23 24 25
26 27 28 29 30 31
Still works!
On Fri, 2 Jan 2026 02:53:45 -0000 (UTC), Waldek Hebisch wrote:
In alt.folklore.computers Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
The only succinct definition of "AI" I ever saw was: "solving NP
problems in polynomial time".
Well, for me AI is the process (and its results) of trying to solve
problems that we can not solve using known (at a given time) methods
and which seem to require intelligence.
You don't see crossing the P/NP divide as being a good indication of
such a distinction?
On 2 Jan 2026 06:32:41 GMT
rbowman <bowman@montana.com> wrote:
On Fri, 02 Jan 2026 02:18:42 GMT, Charlie Gibbs wrote:
Back when electronics became cheap, remember how clocks were
incorporated into just about everything? I had a ball-point pen with a
clock in it.
I used those little round stick-ons to keep track of project hours. When
I couldn't find one I bought a $5 wrist watch at a flea market. The
department manager advised me I shouldn't leave a valuable watch by the
monitor. At least a blue stick-on didn't look like much.
A friend bought a very early calculator for several hundred 1970s dollars.
I must have pissed them all off but I have several calculators that were
in the begging letters from various organizations in lieu of mittens or
return address stickers. They must go for 10 cents in volume.
Damn! Nobody sent me a calendar! I'm going to have to buy one. Or not.
$ cal 1 2025
January 2025
Su Mo Tu We Th Fr Sa
1 2 3 4
5 6 7 8 9 10 11
12 13 14 15 16 17 18
19 20 21 22 23 24 25
26 27 28 29 30 31
Still works!
May I be the first to welcome you back to the start of last year; I hope
you can bring peace to Ukraine & the Middle East (other projects to be
announced after you've ticked those 2 off).
On 1/1/26 14:00, Lawrence D'Oliveiro wrote:
On Thu, 1 Jan 2026 10:30:54 -0000 (UTC), Waldek Hebisch wrote:
But are 'expert systems' really AI?
What is really "AI"? At one point, the argument was over whether
computers could "think". Then you had to define "thinking", and
somebody tried to settle the question by saying: "thinking is what
computers cannot do".
The only succinct definition of "AI" I ever saw was: "solving NP
problems in polynomial time".
  Kinda complex.
  "AI" is generally understood as an "electronic human",
  delivers very similar results. The exact MEANS is
  irrelevant.
  "Expert systems", kind of an 80's thing, were VERY
  limited - basically lots of if/then/else constructs.
  This WAS good enough for a lot of needs however,
  still IS.
On 1/2/26 06:10, The Natural Philosopher wrote:
On 02/01/2026 02:18, Charlie Gibbs wrote:
Similarly, politicians dream of re-arranging laws (and adding more,
of course, never repealing) in pursuit of the dream that the right
combination of legislation will result in Paradise.
You really think that they do?
In reality they would prefer to take the salary and the perks and do
fuck all. The best ones.
The worst ones are those with Big Beautiful Ideas.
Most problems that haven't been solved already are not amenable to
political interference anyway: the best thing is to give people the
freedom to sort them, themselves.
  Famous case - O.J.Simpson ... found NOT guilty by a
  criminal court - but subsequently sued out of all
  his assets by a civil jury. We also see biz cases
  like for glyphosate weed killer. STILL ads on
  the TV by legal firms out to exploit THAT. "Did
  you EVER use this ? Are you sick from ANYTHING ?
  Then we'll score a MILLION for you ! Just call ..."
  The only defense for biz is to delay, delay, delay.
  The lawyers make big $$$ in any case.
On 1/2/26 00:44, c186282 wrote:
On 1/1/26 14:00, Lawrence D'Oliveiro wrote:
On Thu, 1 Jan 2026 10:30:54 -0000 (UTC), Waldek Hebisch wrote:
But are 'expert systems' really AI?
What is really "AI"? At one point, the argument was over whether
computers could "think". Then you had to define "thinking", and
somebody tried to settle the question by saying: "thinking is what
computers cannot do".
The only succinct definition of "AI" I ever saw was: "solving NP
problems in polynomial time".
  Kinda complex.
  "AI" is generally understood as an "electronic human",
  delivers very similar results. The exact MEANS is
  irrelevant.
  "Expert systems", kind of an 80's thing, were VERY
  limited - basically lots of if/then/else constructs.
  This WAS good enough for a lot of needs however,
  still IS.
Where an Expert System shines is doing all the steps a human expert
does, but not missing any.
On Thu, 1 Jan 2026 10:30:54 -0000 (UTC), Waldek Hebisch wrote:
But are 'expert systems' really AI? Theoretically so-called expert
system shells could do smart things, but examples I saw were essentially
a bunch of "if ... then ..." which could be written in almost any
programming language. One example of a somewhat successful 'expert system'
is supposed to guide a user through installing Unix. Description
suggests that it is not smarter than the modern Debian installer. And
nobody thinks that the Debian installer is AI.
I never thought so. Like you I've looked at Lisp and Prolog and came away with the thought 'you *could* use that approach but why would you? It adds nothing to C but obfuscation.'
I don't think they call it an expert system but Arch Linux has a very detailed description of installing the system. There is also a sketchily maintained script that automates much of the process although the 'I use Arch btw' crowd considers that cheating. Then there is EndeavourOS and a couple of others that act like Debian, Ubuntu, or other installers and install Arch, throwing in several useful tools.
Then there was 'fuzzy logic' that had its day although you don't hear much about it lately. Perhaps it was overtaken by neural networks.
During training of a NN in successive iterations you calculate the loss
function until you reach a point where it's 'good enough'. That
technology is interesting in that while you can define and explain each
mathematical operation, what's going on in the total sum is cloudy.
If you really needed 100 variant records in Turbo Pascal,
then you needed 100 unions in C.
No. You simply used *casting*.
k=*(int *)(buffer +4) etc etc.
On 2 Jan 2026 06:01:53 GMT, Ted Nolan <tednolan> wrote:
Turbo Pascal for CP/M-86 could access the graphics hardware on the
DEC Rainbow. A niche to be sure, but one my CSCI graphics class did
its projects in.
Did it have its own custom drivers for direct hardware access? Or did
it work through the "GSX" (GKS-superset) graphics library from Digital
Research?
On Thu, 1 Jan 2026 10:30:54 -0000 (UTC), Waldek Hebisch wrote:
But are 'expert systems' really AI?
What is really "AI"? At one point, the argument was over whether
computers could "think". Then you had to define "thinking", and
somebody tried to settle the question by saying: "thinking is what
computers cannot do".
No. You simply used *casting*.
k=*(int *)(buffer +4) etc etc.
You do have to be careful with this as it's not guaranteed that the
compiler won't take liberties in arranging members of a struct for
optimization purposes, ...
Considering early 'structured' langs like Algol/Pascal,
USUALLY you can structure things to cope with any prob.
However sometimes, well, 'perfect' structure for that
may take WAY longer than you can afford to invest, so
some 'cheats' may have to be introduced. CompSci people
won't understand that reality.
You mean 'expert system' coded in Lisp or Prolog? Or just general
coding in Lisp or Prolog? Concerning general coding IMO Prolog is
great for backtracking search and a few similar problems, but not good
for most programs. On the other hand Lisp is quite a capable general
purpose language.
In alt.folklore.computers rbowman <bowman@montana.com> wrote:
On Thu, 1 Jan 2026 10:30:54 -0000 (UTC), Waldek Hebisch wrote:
But are 'expert systems' really AI? Theoretically so-called expert
system shells could do smart things, but examples I saw were essentially
a bunch of "if ... then ..." which could be written in almost any
programming language. One example of a somewhat successful 'expert system'
is supposed to guide a user through installing Unix. Description
suggests that it is not smarter than the modern Debian installer. And
nobody thinks that the Debian installer is AI.
I never thought so. Like you I've looked at Lisp and Prolog and came away
with the thought 'you *could* use that approach but why would you? It adds
nothing to C but obfuscation.'
You mean 'expert system' coded in Lisp or Prolog? Or just general
coding in Lisp or Prolog? Concerning general coding IMO Prolog
is great for backtracking search and a few similar problems, but
not good for most programs. On the other hand Lisp is quite a
capable general purpose language.
I don't think they call it an expert system but Arch Linux has a very
detailed description of installing the system. There is also a sketchily
maintained script that automates much of the process although the 'I use
Arch btw' crowd considers that cheating. Then there is EndeavourOS and a
couple of others that act like Debian, Ubuntu, or other installers and
install Arch, throwing in several useful tools.
Then there was 'fuzzy logic' that had its day although you don't hear much
about it lately. Perhaps it was overtaken by neural networks.
I looked a bit at 'fuzzy logic'. But I did not see more in it than the
principle "if you do not know better, then use a crude approximation".
This principle is reasonable, but I did not see any reason to prefer
the specific crude approximations advocated in various texts (with the
approximation varying depending on the text).
During training of a NN in successive iterations you calculate the loss
function until you reach a point where it's 'good enough'. That
technology is interesting in that while you can define and explain each
mathematical operation, what's going on in the total sum is cloudy.
Pascal was not a 'theoretical' lang ... Prof Nick actually meant it
to WORK in the real world.
On 02/01/2026 02:18, Charlie Gibbs wrote:
Similarly, politicians dream of re-arranging laws (and adding more,
of course, never repealing) in pursuit of the dream that the right
combination of legislation will result in Paradise.
You really think that they do?
In reality they would prefer to take the salary and the perks and do
fuck all. The best ones.
The worst ones are those with Big Beautiful Ideas.
May I be the first to welcome you back to the start of last year, I hope
you can bring peace to Ukraine & the Middle East (other projects to be announced after you've ticked those 2 off).
On Fri, 2 Jan 2026 10:59:55 +0000
The Natural Philosopher <tnp@invalid.invalid> wrote:
If you really needed 100 variant records in Turbo Pascal,
then you needed 100 unions in C.
No. You simply used *casting*.
k=*(int *)(buffer +4) etc etc.
You do have to be careful with this as it's not guaranteed that the
compiler won't take liberties in arranging members of a struct for optimization purposes, and any means to ensure that it doesn't are implementation-specific, so assumptions about casting a block of memory
to one struct/array or another can lead to portability issues...
...but boy, is it handy in a pinch!
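The portability caveat is really about padding and alignment rather than
member order; a quick check with offsetof() (the struct here is invented
for illustration) shows where a given compiler slipped pad bytes in
before you trust a raw-memory cast onto the type:

    #include <stdio.h>
    #include <stddef.h>

    struct header {               /* hypothetical on-disk layout */
        char           tag;
        /* a compiler may pad here so 'length' lands aligned */
        unsigned short length;
        unsigned long  serial;
    };

    int main(void) {
        printf("tag@%zu length@%zu serial@%zu sizeof=%zu\n",
               offsetof(struct header, tag),
               offsetof(struct header, length),
               offsetof(struct header, serial),
               sizeof(struct header));
        return 0;
    }

If the printed offsets match the on-disk layout, the cast is safe on
that particular compiler and target; if not, you read the fields out
byte by byte.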
On 02/01/2026 04:35, c186282 wrote:
NNs are 'different'. Not 'expert', not 'fuzzy', not LLM.
  A little closer to how biological brains work. The bitch has been
  finding suitable elements that can be compactly put on chips.
  They're getting better at that. Maybe 10 years and decently good
  'AI' will fit INSIDE a bot instead of a 20 acre gigawatt data
  center.
Yes. They are ultimately pattern recognition engines.
Trouble with those is you have to get the gain right. I can't remember
what happened to that software you fed images to and it turned them
into eyes, and dogs where there used to be plants. Because it tried too
hard.
Great fun
TP was a TOTAL REVOLUTION ... not only because of the integrated
development environment but because of the BLAZING compilation speed.
On Fri, 2 Jan 2026 13:06:34 +0000, Kerr-Mudd, John wrote:
May I be the first to welcome you back to the start of last year, I hope
you can bring peace to Ukraine & the Middle East (other projects to be
announced after you've ticked those 2 off).
Well, at least I wasn't writing a check... My ideas for peace in the Ukraine and the Middle East would be very unpopular.
In article <10j7qap$6ptq$1@dont-email.me>,
Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On 2 Jan 2026 06:01:53 GMT, Ted Nolan <tednolan> wrote:
Turbo Pascal for CP/M-86 could access the graphics hardware on the
DEC Rainbow. A niche to be sure, but one my CSCI graphics class did
its projects in.
Did it have its own custom drivers for direct hardware access? Or did
it work through the "GSX" (GKS-superset) graphics library from Digital
Research?
At this remove, I have no idea. And I never understood all the math,
so I was the guy in the team who wrote the CLI to interpret our
made-up command language instead of doing the projections or whatever...
Rational materialism of Western Science IS ultimately *religion*. It's
based on an unwarranted assumption about the nature of ourselves, and
the world, that we adhere to ultimately because, as Richard Dawkins
said, 'It works, bitches'.
On 2026-01-01, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On Thu, 1 Jan 2026 10:30:54 -0000 (UTC), Waldek Hebisch wrote:
But are 'expert systems' really AI?
What is really "AI"? At one point, the argument was over whether
computers could "think". Then you had to define "thinking", and
somebody tried to settle the question by saying: "thinking is what
computers cannot do".
"The question of whether a computer can think is no more interesting
than the question of whether a submarine can swim." - Edsger Dijkstra
Mr. Dijkstra had his issues, but I'd say he hit the nail on the head
there.
According to John Ames <commodorejohn@gmail.com>:
No. You simply used *casting* .
k=*(int *)(buffer +4) etc etc.
You do have to be careful with this as it's not guaranteed that the
compiler won't take liberties in arranging members of a struct for
optimization purposes, ...
No, the C Standard says:
Within a structure object, the non-bit-field members and the units in which bit-fields
reside have addresses that increase in the order in which they are declared.
There can be bits of padding to get fields aligned as needed, but no reordering.
It is pretty common to use structure declarations with common fields at the
front to do variant records.
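A minimal sketch of that common-fields-at-the-front idiom (all names
invented for illustration):

    struct base { int kind; };                    /* shared prefix */
    struct int_node  { int kind; long value; };
    struct text_node { int kind; char text[32]; };

    /* Because the standard guarantees members are laid out in
       declaration order, any variant can be inspected through the
       common prefix before deciding which full type it really is. */
    int kind_of(const void *p) {
        return ((const struct base *)p)->kind;
    }

This is the classic K&R-era hack being described: read the discriminant
through the prefix, then cast to the matching full struct.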
According to c186282 <c186282@nnada.net>:
Considering early 'structured' langs like Algol/Pascal,
USUALLY you can structure things to cope with any prob.
However sometimes, well, 'perfect' structure for that
may take WAY longer than you can afford to invest, so
some 'cheats' may have to be introduced. CompSci people
won't understand that reality.
Hi, Comp Sci PhD here. We understand that just fine, although
some of us try harder than others to match theory to reality.
Algol60 had nice loops and nested scopes but it also had gotos.
Comp Sci like any other field can be very trendy. When I was
in school it was fashionable to say bad things about COBOL even
though hardly anyone actually knew what COBOL was like. I found
a compiler and wrote one small program to find out that yes it
is wordy but it also had better data structuring than a lot of
more fashionable languages.
On Fri, 2 Jan 2026 02:40:13 -0500, c186282 wrote:
Pascal was not a 'theoretical' lang ... Prof Nick actually meant it
to WORK in the real world.
I disagree with that. Wirth was mostly concerned with constructing
didactic languages. The joke about the original implementation was that
it is a good language for telling itself secrets since there is no i/o.
Students learned it and extended it when they had to use it in the real
world. Lisp has a similar history. Common Lisp and its descendants
violate the purity of the Lisp concept but get things done.
On Fri, 2 Jan 2026 11:13:59 +0000, The Natural Philosopher wrote:
On 02/01/2026 04:35, c186282 wrote:
NNs are 'different'. Not 'expert', not 'fuzzy', not LLM.
  A little closer to how biological brains work. The bitch has been
  finding suitable elements that can be compactly put on chips.
  They're getting better at that. Maybe 10 years and decently good
  'AI' will fit INSIDE a bot instead of a 20 acre gigawatt data
  center.
Yes. They are ultimately pattern recognition engines.
Trouble with those is you have to get the gain right. I can't remember
what happened to that software you fed images to and it turned them
into eyes, and dogs where there used to be plants. Because it tried too
hard.
Great fun
The 'hello world' of image recognition is classifying dogs and cats. There
is a very large dataset of cat and dog images to work with.
One of the early problems was the dogs tended to be photographed outside
and the cats inside. After training, the model was very good at classifying
furry animals in an outside setting versus those inside.
Speaking in an anthropomorphic way classifiers can have acceptable
behavior but you're never too sure exactly what they're 'thinking'. There
is a whole field of research trying to figure out what the hell goes on in the black box.
At least with a classifier it's easy to see a problem if it calls a Great Dane a horse but LLM fantasies tend to get accepted as facts.
On 1/2/26 13:26, John Levine wrote:
No, the C Standard says:
Within a structure object, the non-bit-field members and the units in which bit-fields
reside have addresses that increase in the order in which they are declared.
There can be bits of padding to get fields aligned as needed, but no reordering.
It is pretty common to use structure declarations with common fields at the
front to do variant records.
Hmmm ... with arrays of simple types you can
advance the pointer by 'x' and get the 'x'-th
element. In theory you can manually peek 'x'
(times type) bytes ahead in memory too.
Is that not for-sure correct with variant records?
length = nchars;
memcpy((void *)p->data, (void *)srcstring, strlen(srcstring));
I *like* to make 'perfect' structuring that will
handle anything, but at times there was time pressure
to "make it work" and I could not spend days/weeks
trying to get it 'just perfect'.
Sometimes coming BACK to it in a month or two will
yield sudden inspiration however ... I think the
annoying problem hides in the back of the mind
for a long time and gets at least some 'cpu cycles'
even if you don't realize.
On 2026-01-01, rbowman <bowman@montana.com> wrote:
On Thu, 01 Jan 2026 19:12:29 +0000, Richard Kettlewell wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
Waldek Hebisch wrote:
But are 'expert systems' really AI?
What is really "AI"? At one point, the argument was over whether
computers could "think". Then you had to define "thinking", and
somebody tried to settle the question by saying: "thinking is what
computers cannot do".
The only succinct definition of "AI" I ever saw was: "solving NP
problems in polynomial time".
It was always rather flexible. Currently it's a label you put on things
to attract venture capital or other forms of finance.
Best definition yet. It's already started with the 'smart' phone but I'm
waiting for the marketers of consumer goods to tack AI onto frying pans
and everything else.
"If it can done, it should be done." That's one of a collection of sayings that someday I'll compile into an essay titled "Memes that Will Destroy the World".
Back when electronics became cheap, remember how clocks were incorporated into just about everything? I had a ball-point pen with a clock in it.
It wasn't very smart but it was sad to see Roomba go under. If nothing
else it was good for terrorizing cats.
I won't ever have a smart speaker, and I'll be damned if I'm going
to have a vacuum cleaner that cases the joint and reports back to
the mother ship. Besides, I have better ways to entertain the cats.
"The question of whether a computer can think is no more interesting
than the question of whether a submarine can swim." - Edsger Dijkstra
Mr. Dijkstra had his issues, but I'd say he hit the nail on the head
there.
The need for symmetric encryption systems to include a MAC to
prevent this kind of issue has been understood for a long time.
On 1/2/26 08:06, Kerr-Mudd, John wrote:
On 2 Jan 2026 06:32:41 GMT
rbowman <bowman@montana.com> wrote:
May I be the first to welcome you back to the start of last year, I hope
you can bring peace to Ukraine & the Middle East (other projects to be
announced after you've ticked those 2 off).
  Hmmm ... this DOES seem to be a year-old theme ... maybe
  something stuck in his outbox ?
  I too bought a calculator way back then, but for $50 in
  70s money. It STILL WORKS. The more expensive TI programmable
  scientific I bought shortly after, the chikky keys crapped
  out in less than a year.
  I do remember the 'clock craze' ... as soon as the super
  cheap nano-power clock chips came out EVERYTHING seemed
  to have a digital clock built in.
And maybe two years later my Canadian cousin handed me down his TI
58C. Magnificent calculator, but same problem that actually made me
fail an exam or two at Uni.
On 1/2/26 12:14, Ted Nolan <tednolan> wrote:
In article <10j7qap$6ptq$1@dont-email.me>,
Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On 2 Jan 2026 06:01:53 GMT, Ted Nolan <tednolan> wrote:
Turbo Pascal for CP/M-86 could access the graphics hardware on the
DEC Rainbow. A niche to be sure, but one my CSCI graphics class did
its projects in.
Did it have its own custom drivers for direct hardware access? Or did
it work through the "GSX" (GKS-superset) graphics library from Digital
Research?
At this remove, I have no idea. And I never understood all the math,
so I was the guy in the team who wrote the CLI to interpret our
made-up command language instead of doing the projections or whatever...
Foley & Van Dam ... "Fundamentals Of Interactive
Computer Graphics".
All the example code is in Pascal.
Richard Kettlewell wrote:
The need for symmetric encryption systems to include a MAC to
prevent this kind of issue has been understood for a long time.
Just want to point out you used the term "symmetric" in the sense in
which I think it *should* be used: to refer to encryption systems
where the encryption and decryption algorithms are one and the same.
Too often the term is used to refer to systems where the same key is
used for encryption and decryption -- I think these should more
properly be called "secret-key" systems.
On 1/2/26 14:29, rbowman wrote:
On Fri, 2 Jan 2026 02:40:13 -0500, c186282 wrote:
  Pascal was not a 'theoretical' lang ... Prof Nick actually meant it
  to WORK in the real world.
I disagree with that. Wirth was mostly concerned with constructing
didactic languages. The joke about the original implementation was that
it is a good language for telling itself secrets since there is no i/o.
  Must have been a damned early version.
  Old ALGOL had no I/O however. Didn't show up
  until what, '68 ?
Students learned it and extended it when they had to use it in the real
world. Lisp has a similar history. Common Lisp and its descendants
violate
the purity of the Lisp concept but get things done.
-a "There's nothing pure in this world ..."
... it's not guaranteed that the compiler won't take liberties in
arranging members of a struct for optimization purposes ...
On Fri, 2 Jan 2026 08:49:25 -0800, John Ames wrote:
... it's not guaranteed that the compiler won't take liberties in
arranging members of a struct for optimization purposes ...
The C23 spec (section 6.2.5, "Types") does say the member objects of a
struct type need to be "sequentially allocated". The only freedom the
compiler has (section 6.2.6) is to add "padding bytes".
On Fri, 2 Jan 2026 08:49:25 -0800, John Ames wrote:
... it's not guaranteed that the compiler won't take liberties in
arranging members of a struct for optimization purposes ...
The C23 spec (section 6.2.5, "Types") does say the member objects of a
struct type need to be "sequentially allocated".
On 1/2/26 13:18, c186282 wrote:
  Old ALGOL had no I/O however. Didn't show up
  until what, '68 ?
58 if you count Burroughs.
... it's not guaranteed that the compiler won't take liberties in
arranging members of a struct for optimization purposes ...
The C23 spec (section 6.2.5, "Types") does say the member objects of
a struct type need to be "sequentially allocated".
I went to check and, lo, it's in the C89 spec as well; as was already
That language has been there a long time. It's in my copy of C11 and
it wasn't new then. It's probably always been there since we wrote
code that used the common struct prefix hack in K&R C.
C doesn't have variant records, but you can fake them with structures
with common initial fields. The different structures can be different
sizes so the usual approach is to malloc() them one at a time and use a pointer to it.
I *like* to make 'perfect' structuring that will handle anything, but
at times there was time pressure to "make it work" and I could not
spend days/weeks trying to get it 'just perfect'.
I like the idea of a robot that actually cleans the house.
On 2 Jan 2026 17:41:34 GMT, Niklas Karlsson wrote:
"The question of whether a computer can think is no more interesting
than the question of whether a submarine can swim." - Edsger Dijkstra
Mr. Dijkstra had his issues, but I'd say he hit the nail on the head
there.
Sometimes I think he managed to make a career out of trolling ...
On 1/2/26 14:33, Lawrence D'Oliveiro wrote:
On Fri, 2 Jan 2026 08:49:25 -0800, John Ames wrote:
... it's not guaranteed that the compiler won't take liberties in
arranging members of a struct for optimization purposes ...
The C23 spec (section 6.2.5, "Types") does say the member objects of a
struct type need to be "sequentially allocated". The only freedom the
compiler has (section 6.2.6) is to add "padding bytes".
It defeats the purpose of a structure if the compiler is free to
rearrange it. Local variables (PL/I AUTOMATIC) can, in most languages,
be stored however the compiler wants.
On 02/01/2026 19:46, rbowman wrote:
At least with a classifier it's easy to see a problem if it calls a
Great Dane a horse but LLM fantasies tend to get accepted as facts.
To a young child, if it's got 4 legs and fur, it's a 'doggie'.
On 02/01/2026 19:36, rbowman wrote:
On Fri, 2 Jan 2026 13:06:34 +0000, Kerr-Mudd, John wrote:
May I be the first to welcome you back to the start of last year, I
hope you can bring peace to Ukraine & the Middle East (other projects
to be announced after you've ticked those 2 off).
Well, at least I wasn't writing a check... My ideas for peace in the
Ukraine and the Middle East would be very unpopular.
Probably with its inhabitants, yes.
On Fri, 2 Jan 2026 15:27:42 -0000 (UTC), Waldek Hebisch wrote:
You mean 'expert system' coded in Lisp or Prolog? Or just general
coding in Lisp or Prolog? Concerning general coding IMO Prolog is
great for backtracking search and a few similar problems, but not good
for most programs. On the other hand Lisp is quite a capable general
purpose language.
I meant the general case. There were precedents and other people involved
in the evolution but as shorthand I'll say Lisp embodies the way McCarthy
thinks, and Prolog does the same for Roussel. At that point it gets
philosophical. How does a person structure and perceive reality? In
another thread Hume and Kant came up. Hume triggered Kant's thought but
Kant approached the world differently.
Less esoterically, I looked at Lisp, or more precisely Scheme, in the
Wizard book, and I could understand the concepts and follow the thought
processes but they were not a natural approach for the way I address the
world. I didn't mean a general indictment of the language when I said I
couldn't understand why someone would do it that way, but a very
specific *I*.
Prolog is even further away for me. It sounds contradictory but I'm
logical but never was comfortable with formal logic.
On 1/2/26 13:18, c186282 wrote:
  Old ALGOL had no I/O however. Didn't show up
  until what, '68 ?
On 2026-01-02, Peter Flass <Peter@Iron-Spring.com> wrote:
58 if you count Burroughs.
For me, "old Algol" means Algol-60 as opposed to Algol-68.
So how could Burroughs have it in '58?
On Fri, 2 Jan 2026 20:32:53 -0000 (UTC), John Levine wrote:
C doesn't have variant records, but you can fake them with structures
with common initial fields. The different structures can be different
sizes so the usual approach is to malloc() them one at a time and use a
pointer to it.
Another approach is to have a struct containing a union of structs with a
flag in the top level struct indicating which child struct to use in the
union. The structs in the union can also have unions so you can build a
real octopus.
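That arrangement, sketched minimally (the tag values and field names
are invented):

    struct sector {
        int kind;                        /* flag selecting the union arm */
        union {
            unsigned char  bytes[512];   /* raw stream   */
            unsigned short words[256];   /* 16-bit view  */
            struct {
                unsigned char first;     /* and unions can nest further */
                unsigned char rest[511];
            } hdr;
        } u;
    };

    /* dispatch on the flag before touching the union, e.g.
       if (s->kind == 0) use s->u.bytes; else if (s->kind == 1) ... */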
On Fri, 2 Jan 2026 15:00:07 -0700, Peter Flass wrote:
On 1/2/26 14:33, Lawrence D'Oliveiro wrote:
On Fri, 2 Jan 2026 08:49:25 -0800, John Ames wrote:
... it's not guaranteed that the compiler won't take liberties in
arranging members of a struct for optimization purposes ...
The C23 spec (section 6.2.5, "Types") does say the member objects of a
struct type need to be "sequentially allocated". The only freedom the
compiler has (section 6.2.6) is to add "padding bytes".
It defeats the purpose of a structure if the compiler is free to
rearrange it. Local variables (PL/I AUTOMATIC) can, in most languages,
be stored however the compiler wants.
That can lead to interesting bugs. The root cause is overflowing a local variable, say writing 6 characters to a char[4]. Which adjacent local variable gets whacked depends on the compiler's ordering. Whether it manifests as a bug depends on how the corrupt variable is used in the function and where it is initialized.
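offsetof() makes both points visible - members stay in declaration order,
but padding can open gaps between them. A minimal sketch with made-up
fields; the exact offsets are only typical, not guaranteed:

    #include <stdio.h>
    #include <stddef.h>

    struct rec {
        char tag;      /* offset 0 */
        int  count;    /* typically offset 4: padding after tag, never reordered */
        char name[4];  /* follows count, in declaration order */
    };

    int main(void)
    {
        printf("tag=%zu count=%zu name=%zu size=%zu\n",
               offsetof(struct rec, tag), offsetof(struct rec, count),
               offsetof(struct rec, name), sizeof(struct rec));
        return 0;
    }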
Lisp has garbage collection, so no need to manually free memory.
Which means that you can build new things at any time without risk
of leaking memory.
But it was noticed that Lisp sources can be transformed under
program control and such transformations are easy because the Lisp
source has the same form as Lisp data. Anyway, this capability is
frequently used and support for it is the main reason to keep
parenthesised notation.
On 1/2/26 17:52, rbowman wrote:
On Fri, 2 Jan 2026 15:00:07 -0700, Peter Flass wrote:
On 1/2/26 14:33, Lawrence D'Oliveiro wrote:
On Fri, 2 Jan 2026 08:49:25 -0800, John Ames wrote:
... it's not guaranteed that the compiler won't take liberties in
arranging members of a struct for optimization purposes ...
The C23 spec (section 6.2.5, "Types") does say the member objects of a
struct type need to be "sequentially allocated". The only freedom the
compiler has (section 6.2.6) is to add "padding bytes".
It defeats the purpose of a structure if the compiler is free to
rearrange it. Local variables (PL/I AUTOMATIC) can, in most languages,
be stored however the compiler wants.
That can lead to interesting bugs. The root cause is overflowing a local
variable, say writing 6 characters to a char[4]. Which adjacent local
variable gets whacked depends on the compiler's ordering. Whether it
manifests as a bug depends on how the corrupt variable is used in the
function and where it is initialized.
Indeed. A few times I suspected this I put a character string before and after where I suspected the problem was, and did lots of checking to
find out where it was being clobbered.
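Something like this minimal sketch (names invented); the compiler is
free to reorder or pad the locals, so it is only a heuristic - which is
the whole point of the exercise:

    #include <assert.h>
    #include <string.h>

    void suspect(const char *input)
    {
        char before[8] = "GUARD1!";   /* sentinel on one side of the buffer */
        char buf[4];
        char after[8]  = "GUARD2!";   /* sentinel on the other side */

        strcpy(buf, input);           /* the overflow being hunted */

        /* a changed sentinel shows that buf was overrun, and which way */
        assert(strcmp(before, "GUARD1!") == 0);
        assert(strcmp(after,  "GUARD2!") == 0);
    }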
Ummm ... not so sure anymore. LLMs *are* showing
signs of "self" (and self-preservation) already.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
Richard Kettlewell wrote:
The need for symmetric encryption systems to include a MAC to
prevent this kind of issue has been understood for a long time.
Just want to point out you used the term "symmetric" in the sense in
which I think it *should* be used: to refer to encryption systems
where the encryption and decryption algorithms are one and the same.
Too often the term is used to refer to systems where the same key is
used for encryption and decryption -- I think these should more
properly be called "secret-key" systems.
Please stop trolling.
(For anyone in doubt, symmetric encryption refers to single-key
encryption schemes, not encryption schemes where encryption and
decryption are the same operation.)
I certainly studied i/o in *what they told us* was standard pascal,
using the original Wirth book.
On Fri, 2 Jan 2026 19:58:48 +0000, The Natural Philosopher wrote:
On 02/01/2026 19:46, rbowman wrote:
At least with a classifier it's easy to see a problem if it calls a
Great Dane a horse but LLM fantasies tend to get accepted as facts.
To a young child, if it's got 4 legs and fur, it's a 'doggie'.
To an even younger child it's close to a Ding an sich. 'Doggie' already is
a departure from immediate reality. Mommy intrudes and says 'No it is a
cat.' Later Mommy adds the concept of 'two cats' and we're off to the
races. Eventually the kid gets a PhD in math and lives in a completely abstract world unable to make a pot of coffee.
On Fri, 2 Jan 2026 19:57:06 +0000, The Natural Philosopher wrote:
On 02/01/2026 19:36, rbowman wrote:
On Fri, 2 Jan 2026 13:06:34 +0000, Kerr-Mudd, John wrote:
May I be the first to welcome you back to the start of last year, I
hope you can bring peace to Ukraine & the Middle East (other projects
to be announced after you've ticked those 2 off).
Well, at least I wasn't writing a check... My ideas for peace in the
Ukraine and the Middle East would be very unpopular.
Probably with its inhabitants, yes.
My real solution would be sort of a holmgang. Let them sort their shit out with no outside interference. May the best Slovak or Semite win.
On 02/01/2026 21:17, Richard Kettlewell wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
Richard Kettlewell wrote:
The need for symmetric encryption systems to include a MAC to
prevent this kind of issue has been understood for a long time.
Just want to point out you used the term "symmetric" in the sense in
which I think it *should* be used: to refer to encryption systems
where the encryption and decryption algorithms are one and the same.
Too often the term is used to refer to systems where the same key is
used for encryption and decryption -- I think these should more
properly be called "secret-key" systems.
Please stop trolling.
(For anyone in doubt, symmetric encryption refers to single-key
encryption schemes, not encryption schemes where encryption and
decryption are the same operation.)
I am surprised you didn't kf him years ago.
According to rbowman <bowman@montana.com>:
On Fri, 2 Jan 2026 20:32:53 -0000 (UTC), John Levine wrote:
C doesn't have variant records, but you can fake them with structures
with common initial fields. The different structures can be different
sizes so the usual approach is to malloc() them one at a time and use a
pointer to it.
Another approach is to have a struct containing union of structs with a flag in the top level struct indicating which child struct to use in the union. The structs in the union can also have unions so you can build a real octopus.
That works but the union is the size of the largest struct so it can waste
a lot of space compared to allocating each struct's actual size. I realize this doesn't work for arrays of structs or unions, but it works fine for arrays of pointers to them.
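Spelled out as a minimal sketch, type names invented: every variant
starts with the same header, each one is malloc()ed at its real size,
and the collection is an array of pointers rather than of the objects
themselves:

    #include <stdlib.h>

    struct hdr    { int kind; };                      /* common initial field */
    struct circle { struct hdr h; double radius; };   /* kind == 1 */
    struct rect   { struct hdr h; double w, ht; };    /* kind == 2 */

    struct hdr *shapes[10];   /* pointers are all one size even though
                                 the structs they point at are not */

    struct hdr *new_circle(double radius)
    {
        struct circle *c = malloc(sizeof *c);   /* exact size, no union waste */
        if (c == NULL)
            return NULL;
        c->h.kind = 1;
        c->radius = radius;
        return &c->h;    /* the common header is the first member, so this
                            pointer can be cast back once kind is checked */
    }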
On Fri, 2 Jan 2026 15:15:53 -0500, c186282 wrote:
I *like* to make 'perfect' structuring that will handle anything, but
at times there was time pressure to "make it work" and I could not
spend days/weeks trying to get it 'just perfect'.
We had a couple of programmers who tried to handle all possible eventualities. Typically the eventualities never materialized, or the theoretical future turned out differently anyway, leaving a very complex piece of code to do the
task at hand.
Solve tomorrow's problems tomorrow.
On Fri, 2 Jan 2026 21:47:01 +0100, Carlos E.R. wrote:
I like the idea of a robot that actually cleans the house.
https://petkit.com/products/purobot-ultra
I wonder how many people buy one of these? I think the cat's response
would be "WTF? I ain't going in there."
In alt.folklore.computers rbowman <bowman@montana.com> wrote:
On Fri, 2 Jan 2026 19:57:06 +0000, The Natural Philosopher wrote:
On 02/01/2026 19:36, rbowman wrote:
On Fri, 2 Jan 2026 13:06:34 +0000, Kerr-Mudd, John wrote:
May I be the first to welcome you back to the start of last year, I
hope you can bring peace to Ukraine & the Middle East (other projects
to be announced after you've ticked those 2 off).
Well, at least I wasn't writing a check... My ideas for peace in the
Ukraine and the Middle East would be very unpopular.
Probably with its inhabitants, yes.
My real solution would be sort of a holmgang. Let them sort their shit out
with no outside interference. May the best Slovak or Semite win.
Well, in the case of Ukraine part of it is deciding what is inside and what
is outside. And if you say that inside is within the borders of
Ukraine, then essentially you say that Russia should stop messing
in Ukrainian matters. Good luck convincing Russia to do so.
It appears that c186282 <c186282@nnada.net> said:
On 1/2/26 13:26, John Levine wrote:
No, the C Standard says:
Within a structure object, the non-bit-field members and the units in which bit-fields
reside have addresses that increase in the order in which they are declared.
There can be bits of padding to get fields aligned as needed, but no reordering.
It is pretty common to use structure declarations with common fields at the front to do variant records.
Hmmm ... with arrays of simple types you can
advance the pointer by 'x' and get the 'x'-th
element. In theory you can manually peek 'x'
(times type) bytes ahead in memory too.
Is that not for-sure correct with variant records ?
C doesn't have variant records, but you can fake them with structures with common initial fields. The different structures can be different sizes
so the usual approach is to malloc() them one at a time and use a pointer to it.
As a bonus confusion, C allows the last field in a structure to be an array of unspecified size, e.g.
struct countedstring {
int length;
char data[];
};
Then for a string initialized from srcstring you'd say something like:
struct countedstring *p = malloc(sizeof(struct countedstring) + strlen(srcstring));
p->length = strlen(srcstring);
memcpy(p->data, srcstring, strlen(srcstring));
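Filled out into a runnable sketch - the extra byte for a terminating NUL
is an addition here, not part of the snippet above:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    struct countedstring {
        int  length;
        char data[];     /* C99 flexible array member */
    };

    int main(void)
    {
        const char *srcstring = "hello";
        size_t n = strlen(srcstring);
        struct countedstring *p = malloc(sizeof *p + n + 1);  /* +1 for NUL */
        if (p == NULL)
            return 1;
        p->length = (int)n;
        memcpy(p->data, srcstring, n + 1);   /* copies the NUL too */
        printf("%d %s\n", p->length, p->data);
        free(p);
        return 0;
    }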
On Fri, 02 Jan 2026 02:18:42 GMT, Charlie Gibbs wrote:
Back when electronics became cheap, remember how clocks were
incorporated into just about everything? I had a ball-point pen with a
clock in it.
I used those little round stick-ons to keep track of project hours. When I
couldn't find one I bought a $5 wrist watch at a flea market. The
department manager advised me I shouldn't leave a valuable watch by the
monitor. At least a blue stick-on didn't look like much.
A friend bought a very early calculator for several hundred 1970s dollars.
I must have pissed them all off but I have several calculators that were
in the begging letters from various organizations in lieu of mittens or return address stickers. They must go for 10 cents in volume.
Damn! Nobody sent me a calendar! I'm going to have to buy one. Or not.
$ cal 1 2025
    January 2025
Su Mo Tu We Th Fr Sa
          1  2  3  4
 5  6  7  8  9 10 11
12 13 14 15 16 17 18
19 20 21 22 23 24 25
26 27 28 29 30 31
Still works!
On 2026-01-03 01:27, rbowman wrote:
On Fri, 2 Jan 2026 21:47:01 +0100, Carlos E.R. wrote:
I like the idea of a robot that actually cleans the house.
https://petkit.com/products/purobot-ultra
:-D
I wonder how many people buy one of these? I think the cat's response
would be "WTF? I ain't going in there."
On 1/3/26 01:50, Waldek Hebisch wrote:
In alt.folklore.computers rbowman <bowman@montana.com> wrote:
On Fri, 2 Jan 2026 19:57:06 +0000, The Natural Philosopher wrote:
On 02/01/2026 19:36, rbowman wrote:
On Fri, 2 Jan 2026 13:06:34 +0000, Kerr-Mudd, John wrote:
May I be the first to welcome you back to the start of last year, I
hope you can bring peace to Ukraine & the Middle East (other projects
to be announced after you've ticked those 2 off).
Well, at least I wasn't writing a check... My ideas for peace in the
Ukraine and the Middle East would be very unpopular.
Probably with its inhabitants, yes.
My real solution would be sort of a holmgang. Let them sort their shit out
with no outside interference. May the best Slovak or Semite win.
Well, in the case of Ukraine part of it is deciding what is inside and what
is outside. And if you say that inside is within the borders of
Ukraine, then essentially you say that Russia should stop messing
in Ukrainian matters. Good luck convincing Russia to do so.
We could have convinced them week one, except Biden was too spineless.
When the Russian invasion was pending we pulled all our people out. I
felt that we should have put more people in - not military forces per
se, but "advisors" and "trainers" embedded with Ukrainian troops at the
front lines. A Russian invasion would have had to push past our
non-combatants to get anywhere, at which point we could have said "pull back
now or suffer the consequences. Make sure none of our people are harmed."
I once created an audio playback app with class hierarchies in C, rather
than C++. It was an interesting experiment, and it worked. But that's
the last time I tried that.
3 Thou shalt cast all function arguments to the expected type if
they are not of that type already, even when thou art convinced
that this is unnecessary, lest they take cruel vengeance upon thee
when thou least expect it.
And whether the variable is followed by some padding. If that char[4] variable is followed by, say, 4 bytes of padding, you can write up to 8
bytes to it and not feel a thing. Then comes the day when you try to
write 9 bytes there and kaboom. I've lost a lot of hair with those
ones,
when a program that's run fine for a couple of years suddenly dies.
On 1/3/26 08:24, Carlos E.R. wrote:
On 2026-01-03 01:27, rbowman wrote:
On Fri, 2 Jan 2026 21:47:01 +0100, Carlos E.R. wrote:
I like the idea of a robot that actually cleans the house.
https://petkit.com/products/purobot-ultra
:-D
I wonder how many people buy one of these? I think the cat's response
would be "WTF? I ain't going in there."
There was one variant of those recalled - they were not good at
telling when the cat LEFT the thing and then started agitating .....
On 03/01/2026 01:03, rbowman wrote:
On Fri, 2 Jan 2026 19:58:48 +0000, The Natural Philosopher wrote:
On 02/01/2026 19:46, rbowman wrote:
At least with a classifier it's easy to see a problem if it calls a
Great Dane a horse but LLM fantasies tend to get accepted as facts.
To a young child, if it's got 4 legs and fur, it's a 'doggie'.
To an even younger child it's close to a Ding an sich. 'Doggie' already
is a departure from immediate reality. Mommy intrudes and says 'No it
is a cat.' Later Mommy adds the concept of 'two cats' and we're off to
the races. Eventually the kid gets a PhD in math and lives in a
completely abstract world unable to make a pot of coffee.
Oh, you met him?
I periodically reset it to see if any of its denizens have given their
heads a wobble yet. Sometimes they have, sometimes they haven't.
On Sat, 03 Jan 2026 06:09:32 GMT, Charlie Gibbs wrote:
And whether the variable is followed by some padding. If that char[4]
variable is followed by, say, 4 bytes of padding, you can write up to 8
bytes to it and not feel a thing. Then comes the day when you try to
write 9 bytes there and kaboom. I've lost a lot of hair with those
ones,
when a program that's run fine for a couple of years suddenly dies.
I have fixed bugs that were old enough to vote. Like the organisms in the permafrost in the plot lines of 'The Last Ship' and 'Fortitude' they lay there in wait...
30 years ago programmers were very stingy with allocations.
On Sat, 3 Jan 2026 08:34:22 +0000, The Natural Philosopher wrote:
Oh, you met him?
Several times. A PhD friend of mine was in a minor car accident. He
admitted he was thinking of something rather than staying on his side of
the road. Despite the degree being in electronics I watched him short out
a car battery with a piece of 14 gauge wire. I'm sure he could have done a complete circuit analysis of why it vaporized.
The husband of one of my ex-wife's coworkers usually had a book open on
the steering wheel as he was driving around town by himself. His wife
(who was our company's head accountant) complained about some of the
software engineers forgetting to cash their paychecks for up to 6 months
at a time. I was never anywhere that flush with cash, and periodically
did a home equity loan or a refinance to get cash out to pay off the
credit cards.
According to Lars Poulsen <lars@beagle-ears.com>:
(who was our company's head accountant) complained about some of the software engineers forgetting to cash their paychecks for up to 6 months
at a time.
Dennis Ritchie apparently failed to cash so many paychecks that one time
they voided all the old checks, wrote one big new one, and then had
someone walk him over to the bank and be sure he deposited it.
On Sat, 3 Jan 2026 07:03:38 -0500, Chris Ahlstrom wrote:
I once created an audio playback app with class hierarchies in C, rather
than C++. It was an interesting experiment, and it worked. But that's
the last time I tried that.
A class is a glorified struct. I remember heated discussions at one of the Boston Computer Society's meetings before 'C++' became a name about 'C with Classes' and whether a new language was needed.
'C with Classes' is now a derogatory term that describes the sort of C++ I write. Charles Petzold has written a number of books on programming for Windows. He has an intense dislike for C++ so if you can track down some
of the first editions of 'Programming Windows' they are all C. The 6th edition was C# which he said was what should have been all along.
The C approach was educational since it exposed some of the magic like vtables, and the magical 'this' is only another parameter passed in the first location.
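A minimal sketch of that C technique, names invented: the "vtable" is
just a struct of function pointers, and 'this' is spelled out as the
first parameter:

    #include <stdio.h>

    struct shape;                                 /* forward declaration */

    struct shape_vtbl {                           /* hand-rolled vtable */
        double (*area)(const struct shape *self);
    };

    struct shape {
        const struct shape_vtbl *vtbl;            /* every object carries one */
        double w, h;
    };

    static double rect_area(const struct shape *self)
    {
        return self->w * self->h;                 /* self plays the role of this */
    }

    static const struct shape_vtbl rect_vtbl = { rect_area };

    int main(void)
    {
        struct shape r = { &rect_vtbl, 3.0, 4.0 };
        printf("%g\n", r.vtbl->area(&r));         /* virtual dispatch by hand */
        return 0;
    }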
On Sat, 3 Jan 2026 07:11:19 -0500, Chris Ahlstrom wrote:
3 Thou shalt cast all function arguments to the expected type if
they are not of that type already, even when thou art convinced
that this is unnecessary, lest they take cruel vengeance upon thee
when thou least expect it.
Corollary: thou shalt be sparing in thy use of const lest future
generations curse thy name.
On 2025-12-28, Carlos E.R. <robin_listas@es.invalid> wrote:
I have the rot13 program installed, and that means that I have used it
at some point.
tr a-zA-Z n-za-mN-ZA-M
will also do the job.
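For example, echo Hello | tr a-zA-Z n-za-mN-ZA-M prints Uryyb, and piping
that through the same command gives Hello back.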
On 02/01/2026 21:22, Carlos E.R. wrote:
I certainly studied i/o in *what they told us* was standard pascal,
using the original Wirth book.
(a) What they tell you is not always true.
(b) What is 'standard' is a moveable feast...
...google sez...
"The statement "Pascal has no I/O" originates from
Brian KernighanrCOs 1981 essay, "Why Pascal is Not My Favorite Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
-a-a-a No Low-Level Access: The language lacked a way to override its
strict type system, making it impossible to write its own I/O systems or memory allocators *within the language itself*.
-a-a-a Fixed Array Sizes: Because array size was part of the type, a function could not be written to handle strings or arrays of different lengths, complicating general-purpose file I/O.
-a-a-a Lack of Portability: Standard PascalrCOs I/O was considered "primitive," and any real-world use required implementation-specific extensions that broke portability between compilers."
On 2025-12-31, c186282 <c186282@nnada.net> wrote:
Can't wait to see what the AIs are cranking out in
a few years ... 29 levels all bunched together into
one gigantic line ? :-)
Shades of APL.
On 2026-01-03 20:43, rbowman wrote:
On Sat, 3 Jan 2026 07:03:38 -0500, Chris Ahlstrom wrote:
I once created an audio playback app with class hierarchies in C, rather
than C++. It was an interesting experiment, and it worked. But that's
the last time I tried that.
A class is a glorified struct. I remember heated discussions at one of the
Boston Computer Society's meetings before 'C++' became a name about 'C with
Classes' and whether a new language was needed.
'C with Classes' is now a derogatory term that describes the sort of C++ I
write. Charles Petzold has written a number of books on programming for
Windows. He has an intense dislike for C++ so if you can track down some
of the first editions of 'Programming Windows' they are all C. The 6th
edition was C# which he said was what should have been all along.
What's the difference between C++ and C#? (I don't know how to pronounce that one).
On 31-12-2025, Lars Poulsen <lars@beagle-ears.com> wrote:
On 2025-12-31, c186282 <c186282@nnada.net> wrote:
Can't wait to see what the AIs are cranking out in
a few years ... 29 levels all bunched together into
one gigantic line ? :-)
Shades of APL.
Except that, with APL, from what I can remember, the lines weren't
gigantic. We were able to do pretty impressive stuff with only short
lines. Well, I'm not speaking about the comments needed to explain the
short line...
On Sat, 3 Jan 2026 08:31:33 +0000, The Natural Philosopher wrote:
No Low-Level Access: The language lacked a way to override its
strict type system, making it impossible to write its own I/O systems or
memory allocators *within the language itself*.
The University of Maine used Pascal as a didactic language and most of the engineers at Sprague Electric were from UM. I can't remember the term but
I wrote several dlls, modules, or whatever they were called that allowed
control robotic arms, and other real world activities. Hey, it was
money...
It is depressing that many companies I either worked for directly or as a hired gun have wiki articles starting with
"Sprague Electric Company was an electronic component maker"
"Sylvania Electric Products Inc. was an East Coast American manufacturer
of electrical and electronic equipment,"
"General Electric Company (GE) was an American multinational conglomerate"
It isn't even the usual X was bought by Y was bought by Z. They're gone completely although the GE trademark does live on in GE Aerospace. Also
gone are all the jobs the companies provided.
On Sat, 3 Jan 2026 07:39:32 -0700, Peter Flass wrote:
We could have convinced them week one, except Biden was too spineless.
When the Russian invasion was pending we pulled all our people out. I
felt that we should have put more people in - not military forces per
se, but "advisors" and "trainers" imbedded with Ukrainian troops at the
front lines.
That worked so swell in Vietnam.
On 1/3/26 15:38, rbowman wrote:
On Sat, 03 Jan 2026 06:09:32 GMT, Charlie Gibbs wrote:
And whether the variable is followed by some padding. If that char[4]
variable is followed by, say, 4 bytes of padding, you can write up to 8
bytes to it and not feel a thing. Then comes the day when you try to
write 9 bytes there and kaboom. I've lost a lot of hair with those
ones,
when a program that's run fine for a couple of years suddenly dies.
I have fixed bugs that were old enough to vote. Like the organisms in the
permafrost in the plot lines of 'The Last Ship' and 'Fortitude' they lay
there in wait...
30 years ago programmers were very stingy with allocations.
  Wasn't much to allocate .... :-)
On 1/3/26 13:12, rbowman wrote:
On Sat, 3 Jan 2026 08:31:33 +0000, The Natural Philosopher wrote:
No Low-Level Access: The language lacked a way to override its
strict type system, making it impossible to write its own I/O systems or
memory allocators *within the language itself*.
The University of Maine used Pascal as a didactic language and most of the
engineers at Sprague Electric were from UM. I can't remember the term but
I wrote several dlls, modules, or whatever they were called that allowed
Pascal to do stuff like gather process data from HP instrumentation,
control robotic arms, and other real world activities. Hey, it was
money...
It is depressing that many companies I either worked for directly or as a
hired gun have wiki articles starting with
"Sprague Electric Company was an electronic component maker"
"Sylvania Electric Products Inc. was an East Coast American manufacturer
of electrical and electronic equipment,"
"General Electric Company (GE) was an American multinational conglomerate"
It isn't even the usual X was bought by Y was bought by Z. They're gone
completely although the GE trademark does live on in GE Aerospace. Also
gone are all the jobs the companies provided.
I thought GE was still going. Besides aerospace, is GE Power Systems
still running (turbines, generators, and such)? I lived in the general vicinity of Schenectady for many years and had family that worked
there. I know GE Plastics in Pittsfield and Waterford was sold off (I
had a gig there for a while, GE-400 system). MAO had something to do
with nuclear subs. I know the appliance division left a long time ago,
and I'm not even going to mention the Computer Division here in
Phoenix.
I look at some code and wonder "how the heck has this ever worked?", but
the answer is that no one ever hit that combination of things before, or used that option.
On 29-12-2025, Niklas Karlsson <nikke.karlsson@gmail.com> wrote:
On 2025-12-28, Carlos E.R. <robin_listas@es.invalid> wrote:
I have the rot13 program installed, and that means that I have used it
at some point.
tr a-zA-Z n-za-mN-ZA-M
will also do the job.
It's far from convenient inside thunderbird. It's better with slrn. When
I'm reading messages with slrn, [Esc]-[R] is easier, but when I'm
writing them I'm in vim, so it's easy to use. But from within
thunderbird, I'm not that sure.
On 2026-01-04, Peter Flass <Peter@Iron-Spring.com> wrote:
I look at some code and wonder "how the heck has this ever worked?", but
the answer is that no one ever hit that combination of things before, or
used that option.
That's certainly the sensible explanation, but I've had scenarios like
that, even with my own code from the past, where I could swear up and
down that I myself had successfully used that code in the exact scenario
that would obviously break.
On 2026-01-03 20:43, rbowman wrote:
On Sat, 3 Jan 2026 07:03:38 -0500, Chris Ahlstrom wrote:
I once created an audio playback app with class hierarchies in C,
rather than C++. It was an interesting experiment, and it worked. But
that's the last time I tried that.
A class is a glorified struct. I remember heated discussions at one of
the Boston Computer Society's meeting before 'C++' became a name about
'C with Classes' and whether a new language was needed.
'C with Classes' is now a derogatory term that describes the sort of
C++ I write. Charles Petzold has written a number of books on
programming for Windows. He has an intense dislike for C++ so if you
can track down some of the first editions of 'Programming Windows' they
are all C. The 6th edition was C# which he said was what should have
been all along.
What's the difference between C++ and C#? (I don't know how to pronounce
that one).
I don't really agree that C# is easier. You still have to develop a
mental model of the language and master adjunct frameworks like .NET.
On 2026-01-04, Niklas Karlsson <nikke.karlsson@gmail.com> wrote:
On 2026-01-04, Peter Flass <Peter@Iron-Spring.com> wrote:
I look at some code and wonder "how the heck has this ever worked?", but >>> the answer is that no one ever hit that combination of things before, or >>> used that option.
That's certainly the sensible explanation, but I've had scenarios like
that, even with my own code from the past, where I could swear up and
down that I myself had successfully used that code in the exact scenario
that would obviously break.
Yup. Sounds like a Schrodinbug. It should have never worked,
but it does until you look at it - and then it never works again.
rbowman wrote this post by blinking in Morse code:
On Sat, 03 Jan 2026 06:09:32 GMT, Charlie Gibbs wrote:
And whether the variable is followed by some padding. If that char[4]
variable is followed by, say, 4 bytes of padding, you can write up to 8
bytes to it and not feel a thing. Then comes the day when you try to
write 9 bytes there and kaboom. I've lost a lot of hair with those
ones,
when a program that's run fine for a couple of years suddenly dies.
I have fixed bugs that were old enough to vote. Like the organisms in the
permafrost in the plot lines of 'The Last Ship' and 'Fortitude' they lay
there in wait...
30 years ago programmers were very stingy with allocations.
I've found bugs in my own code that went unnoticed for years.
That's one good thing about refactoring or revisiting old code for
no reason.
C++ is wayyyyy beyond C w/classes now. Example: templates, promises,
futures, and a greatly expanded Standard Library (e.g.
the <random> functions).
Does Petzold still have the Windows tattoo? :-D
I thought GE was still going. Besides aerospace, is GE Power Systems
still running (turbines, generators, and such)? I lived in the general vicinity of Schenectady for many years and had family that worked there.
I know GE Plastics in Pittsfield and Waterford was sold off (I had a gig there for a while, GE-400 system). MAO had something to do with nuclear
subs. I know the appliance division left a long time ago, and I'm not
even going to mention the Computer Division here in Phoenix.
Except that, with APL, from what I can remember, the lines weren't
gigantic. We were able to do pretty impressive stuff with only short
lines. Well, I'm not speaking about the comments needed to explain
the short line...
GE sold off their mainframe computer business (GE-600 series) sometime
in the early 1970s ...
What's the difference between C++ and C#? (I don't know how to
pronounce that one).
Does C# qualify as a Microsoft proprietary language? Or are there implementations on OSes other than Windows (and compilers, either
open source or available from other vendors)?
On 1/3/26 14:58, c186282 wrote:
On 1/3/26 15:38, rbowman wrote:
On Sat, 03 Jan 2026 06:09:32 GMT, Charlie Gibbs wrote:
And whether the variable is followed by some padding. If that char[4]
variable is followed by, say, 4 bytes of padding, you can write up to 8
bytes to it and not feel a thing. Then comes the day when you try to
write 9 bytes there and kaboom. I've lost a lot of hair with those ones,
when a program that's run fine for a couple of years suddenly dies.
I have fixed bugs that were old enough to vote. Like the organisms in the
permafrost in the plot lines of 'The Last Ship' and 'Fortitude' they lay
there in wait...
30 years ago programmers were very stingy with allocations.
  Wasn't much to allocate .... :-)
What is this "allocate" thing. When I started the major languages were
COBOL and FORTRAN, and both used only static memory allocation.
On Sun, 04 Jan 2026 19:41:11 GMT, Charlie Gibbs wrote:
Does C# qualify as a Microsoft proprietary language? Or are there
implementations on OSes other than Windows (and compilers, either
open source or available from other vendors)?
The only implementation I'm aware of is Microsoft's one built on top
of Dotnet.
Dotnet itself is supposedly open-source and portable to some degree
now. There are reports of it running on Linux.
On 2026-01-04, Chris Ahlstrom <OFeem1987@teleworm.us> wrote:
I don't really agree that C# is easier. You still have to develop a
mental model of the language and master adjunct frameworks like .NET.
Does C# qualify as a Microsoft proprietary language?
Or are there implementations on OSes other than Windows (and compilers, either open source or available from other vendors)?
On 1/4/26 16:18, Lawrence D'Oliveiro wrote:
On Sun, 04 Jan 2026 19:41:11 GMT, Charlie Gibbs wrote:
Does C# qualify as a Microsoft proprietary language? Or are there
implementations on OSes other than Windows (and compilers, either open
source or available from other vendors)?
The only implementation I'm aware of is Microsoft's one built on top of Dotnet.
Dotnet itself is supposedly open-source and portable to some degree
now. There are reports of it running on Linux.
That would be very end-around ...
Maybe just to forget C# ... CPP is good enough.
Actually, don't even like CPP ... plain 'C' has so far met all my
needs.
On Sun, 4 Jan 2026 15:14:30 +0100, Carlos E.R. wrote:
What's the difference between C++ and C#? (I don't know how to
pronounce that one).
It's spelled "C#", but it's pronounced "C♯".
Anyway, look into micro-controllers ... often VERY little RAM. You
have to be VERY stingy and clever.
"Think I'll make a 2K buffer just in case ..." the thing might not
HAVE 2K of memory. If programming in ASM then YOU have to do the nuts
and bolts of re-allocating space IF there's enough remaining.
On Sun, 04 Jan 2026 19:41:12 GMT, Charlie Gibbs wrote:
On 2026-01-04, Niklas Karlsson <nikke.karlsson@gmail.com> wrote:
On 2026-01-04, Peter Flass <Peter@Iron-Spring.com> wrote:
I look at some code and wonder "how the heck has this ever worked?",
but the answer is that no one ever hit that combination of things
before, or used that option.
That's certainly the sensible explanation, but I've had scenarios like
that, even with my own code from the past, where I could swear up and
down that I myself had successfully used that code in the exact
scenario that would obviously break.
Yup. Sounds like a Schrodinbug. It should have never worked, but it
does until you look at it - and then it never works again.
Conversely, it fails until you log a debug statement to see what's going
on and it works. I'd never, never just leave the debug in place, no
siree.
Lawrence D'Oliveiro wrote:
Charlie Gibbs wrote:
Does C# qualify as a Microsoft proprietary language? Or are there
implementations on OSes other than Windows (and compilers, either open
source or available from other vendors)?
The only implementation I'm aware of is Microsoft's one built on top
of Dotnet.
Dotnet itself is supposedly open-source and portable to some degree
now. There are reports of it running on Linux.
It definitely runs on Linux and is easily installed with dnf or apt. GUIs have been problematic but console and ASP.NET works fine.
https://dotnet.microsoft.com/en-us/apps/aspnet
This isn't Ballmer's Microsoft.
On 05/01/2026 03:55, rbowman wrote:
The Pico SDK documentation talks about C/C++ but I
haven't seen C++ being used much in the examples.
I tend to avoid the examples where it is being used.
Anyway C++ is handy in small doses.
I've not found it at all necessary, ever.
On 1/3/26 21:21, John Levine wrote:
According to Lars Poulsen <lars@beagle-ears.com>:
(who was our company's head accountant) complained about some of the
software engineers forgetting to cash their paychecks for up to 6
months at a time.
Dennis Ritchie apparently failed to cash so many paychecks that one time
they voided all the old checks, wrote one big new one, and then had someone
walk him over to the bank and be sure he deposited it.
LOL!
On Sun, 4 Jan 2026 21:17:05 -0000 (UTC), Lawrence D'Oliveiro wrote:
On Sun, 4 Jan 2026 15:14:30 +0100, Carlos E.R. wrote:
What's the difference between C++ and C#? (I don't know how to
pronounce that one).
It's spelled "C#", but it's pronounced "C♯".
For once sanity prevailed and they didn't use a character that would have been a pain in the ass evermore.
Carlos E.R. wrote this post by blinking in Morse code:
On 2026-01-03 20:43, rbowman wrote:
On Sat, 3 Jan 2026 07:03:38 -0500, Chris Ahlstrom wrote:
I once created an audio playback app with class hierarchies in C, rather
than C++. It was an interesting experiment, and it worked. But that's
the last time I tried that.
A class is a glorified struct. I remember heated discussions at one of the
Boston Computer Society's meetings before 'C++' became a name about 'C with
Classes' and whether a new language was needed.
'C with Classes' is now a derogatory term that describes the sort of C++ I
write. Charles Petzold has written a number of books on programming for
Windows. He has an intense dislike for C++ so if you can track down some
of the first editions of 'Programming Windows' they are all C. The 6th
edition was C# which he said was what should have been all along.
What's the difference between C++ and C#? (I don't know how to pronounce
that one).
C-sharp. (Get it? Get it?)
AI Overview
C++ and C# are both derived from the C language family but
target different programming needs:
C++ offers high performance and low-level hardware control,
making it ideal for systems programming and game engines,
while C# provides a managed, higher-level environment for
easier and faster development of web, desktop, and mobile
applications
Kind of analogous to C++ versus Java.
I don't really agree that C# is easier. You still have to develop
a mental model of the language and master adjunct frameworks like
.NET.
On Sun, 04 Jan 2026 19:41:11 GMT, Charlie Gibbs wrote:
On 2026-01-04, Chris Ahlstrom <OFeem1987@teleworm.us> wrote:
I don't really agree that C# is easier. You still have to develop a
mental model of the language and master adjunct frameworks like .NET.
Does C# qualify as a Microsoft proprietary language?
Or are there implementations on OSes other than Windows (and compilers,
either open source or available from other vendors)?
https://www.mono-project.com/
Sort of... Using the dotnet sdk on Windows or Linux is sort of like
using venv in Python or the express generator with node/express.
dotnet new console -n world
creates a new console application in the 'world' directory with the
directory structure and a very minimalist Program.cs with
Console.WriteLine("Hello, World!");
dotnet build followed by
$ dotnet run
Hello, World!
works. Packages are added with Nuget, which is like npm or pip.
https://www.nuget.org/
It's free and open source but the entire ecosystem uses the .NET
terminology. I doubt anyone who has no experience developing on Windows
is going to pick C#. It's not that different from Java except it was initially a Windows only language and not sold as cross platform, run anywhere, from the beginning.
On Sun, 4 Jan 2026 15:14:30 +0100, Carlos E.R. wrote:
On 2026-01-03 20:43, rbowman wrote:
On Sat, 3 Jan 2026 07:03:38 -0500, Chris Ahlstrom wrote:
I once created an audio playback app with class hierarchies in C,
rather than C++. It was an interesting experiment, and it worked. But
that's the last time I tried that.
A class is a glorified struct. I remember heated discussions at one of
the Boston Computer Society's meeting before 'C++' became a name about
'C with Classes' and whether a new language was needed.
'C with Classes' is now a derogatory term that describes the sort of
C++ I write. Charles Petzhold has written a number of books on
programming for Windows. He has an intense dislike for C++ so if you
can track down some of the first editions of 'Programming Windows' they
are all C. The 6th edition was C# which he said was what should have
been all along.
What's the difference between C++ and C#? (I don't know how to pronounce
that one).
C Sharp. In the late '90s Microsoft released Visual J++, their implementation of Java. I still have the media with an IDE similar to
Visual Studio. It was quite nice but did not meet Sun's purity test so Sun sued Microsoft.
C# was released in the early 2000s, with Hejlsberg as the principal
designer.
https://en.wikipedia.org/wiki/Anders_Hejlsberg
He'd also developed J++ so C# incorporated the lessons learned from that
as well as C++. I don't really like C++ and find C# a lot better for
Windows programming. Mono was an early attempt to make it cross platform
and is still around. The alternative is to install the .NET SDK.
https://learn.microsoft.com/en-us/dotnet/core/install/linux
That includes the csc compiler:
$ csc
Microsoft (R) Visual C# Compiler version 3.9.0-6.21124.20 (db94f4cc) Copyright (C) Microsoft Corporation. All rights reserved.
On Linux the ability to build GUIs has been problematic. There is a Gtk# library but I've never used it.
https://www.mono-project.com/docs/gui/gtksharp/
You can do both console and ASP .NET backend apps. For kicks, I did a
command line app to download information from the iTunes database in
Python and C#. The syntax differs of course but the complexity is very similar compared to doing it in C or C++.
Since csc emits an IL that depends on the framework runtime, by passing
flags you can build Linux packages on Windows and vice versa. You can also target ARM devices.
https://learn.microsoft.com/en-us/dotnet/iot/deployment
MS managed to create more confusion than normal. .NET Framework was the standard runtime on Windows boxes. The .NET Core project was aimed at
cross platform solutions and had its own numbering, so .NET Core 3.x was contemporaneous with .NET Framework 4.7x. At that point they decided Core was the future, so .NET 5.0 was .NET Core, with .NET Framework 4.8 being
the last of what everyone called .NET. .NET 10 is the current release.
On Sun, 4 Jan 2026 19:04:14 -0500, c186282 wrote:
On 1/4/26 16:18, Lawrence DrCOOliveiro wrote:
On Sun, 04 Jan 2026 19:41:11 GMT, Charlie Gibbs wrote:
Does C# qualify as a Microsoft proprietary language? Or are there
implementations on OSes other than Windows (and compilers, either open >>>> source or available from other vendors)?
The only implementation IrCOm aware of is MicrosoftrCOs one built on top of >>> Dotnet.
Dotnet itself is supposedly open-source and portable to some degree
now. There are reports of it running on Linux.
That would be very end-around ...
Maybe just to forget C# ... CPP is good enough.
Actually, don't even like CPP ... plain 'C' has so far met all my
needs.
I was thinking about C++ on my walk today. Arduino sketches and other MCU SDKs refer to C/C++.
#include <nRF24L01.h>
#include <RF24.h>
#define led 12
RF24 radio(7, 8);
const byte addresses[][6] = {"00001", "00002"};
int angleValue = 0;
boolean buttonState = 0;
void setup() {
Serial.begin(9600);
radio.begin();
radio.openWritingPipe(addresses[1]);
radio.openReadingPipe(1, addresses[0]);
radio.setPALevel(RF24_PA_MIN);
}
Obviously when you instantiate the RF24 objects and start calling class methods that will use parameters passed in to the constructor, you're in
C++ land. However for the most part it's 'C with Classes' and very seldom
has to get into the C++ esoterica.
Arduino simplifies it by magically creating setup() and loop() without
the boilerplate. The Pico SDK documentation talks about C/C++ but I
haven't seen C++ being used much in the examples.
Anyway C++ is handy in small doses.
On 2026-01-04 15:43, Chris Ahlstrom wrote:
Carlos E.R. wrote this post by blinking in Morse code:
What's the difference between C++ and C#? (I don't know how to pronounce >>> that one).
C-sharp. (Get it? Get it?)
Mmm... no, I don't think I get it. Maybe something cultural in it.
What's the difference between C++ and C#? (I don't know how to
pronounce that one).
C-sharp. (Get it? Get it?)
I've been getting used to using the universal initializer (from C++11):
const byte addresses[][6] { "00001", "00002" };
int angleValue { 0 };
boolean buttonState { 0 };
On 2026-01-05, c186282 <c186282@nnada.net> wrote:
On 1/4/26 16:18, Lawrence DrCOOliveiro wrote:
On Sun, 04 Jan 2026 19:41:11 GMT, Charlie Gibbs wrote:
Does C# qualify as a Microsoft proprietary language? Or are there
implementations on OSes other than Windows (and compilers, either
open source or available from other vendors)?
The only implementation I'm aware of is Microsoft's one built on top of Dotnet.
Dotnet itself is supposedly open-source and portable to some degree
now. There are reports of it running on Linux.
That would be very end-around ...
Maybe just to forget C# ... CPP is good enough.
Actually, don't even like CPP ... plain 'C' has so far met all my
needs.
Ditto - and I'm too old to change now. I'd rather spend what little
time I have on having fun and maintaining my existing code base
(including comprehensive home-brewed C libraries) rather than going
through the software equivalent of moving to a foreign country.
You can really blame Microsoft for creating MFC; they had to wrap their C API in something. You can blame them for DDE/OLE/COM, the ATL, and
adopting Hungarian notation.
On the flip you can get a Pico with wi-fi already
built in, perhaps more useful than RF24 unless you want yer Ards to
be like a very 'private network'.
There are some other serial->RF transceivers out there too which work
down in the megahertz zone and may offer more range.
Anyway, Ard 'C' is pretty straight 'C'. Some of the libs CAN have an
'object' character however, but I don't see that as an entire
paradigm shift to CPP.
The C changes over the years like being able to declare variables where
they are first used and single line comments were something I greeted with "Hell yeah!" rather than "What CS PhD dreamed this crap up?"
On Sun, 4 Jan 2026 09:43:40 -0500
Chris Ahlstrom <OFeem1987@teleworm.us> wrote:
What's the difference between C++ and C#? (I don't know how to
pronounce that one).
C-sharp. (Get it? Get it?)
The impish might prefer "C-hash" ;P
C#/dotnet ... I'm uncomfortable seeing such
heavily M$ solutions used in Linux. I still
have this vision of M$ lawyers waiting until
enough Linux stuff is writ using their stuff
and then POUNCING.
On 05/01/2026 17:48, rbowman wrote:
The C changes over the years like being able to declare variables where
they are first used and single line comments were something I greeted with
"Hell yeah!" rather than "What CS PhD dreamed this crap up?"
Spot on.
C is enough to do the job and simple to learn. Why complicate shit?
On 2026-01-05, The Natural Philosopher <tnp@invalid.invalid> wrote:
On 05/01/2026 17:48, rbowman wrote:
The C changes over the years like being able to declare variables where
they are first used and single line comments were something I greeted with
"Hell yeah!" rather than "What CS PhD dreamed this crap up?"
Spot on.
C is enough to do the job and simple to learn. Why complicate shit?
Let me guess: so companies can sell you a new compiler every year,
plus courses in how to use the new shit.
"The statement "Pascal has no I/O" originates fromYeah, that was it - not *no* I/O in the sense that was true of Algol,
Brian KernighanrCOs 1981 essay, "Why Pascal is Not My Favorite
Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
https://en.wikipedia.org/wiki/Mono_(software) originated as an
independent implementation of (some of) .Net and C#, but apparently now
has some MS code (.Net Core) in the runtime.
On 2026-01-05, rbowman <bowman@montana.com> wrote:
On Sun, 04 Jan 2026 19:41:12 GMT, Charlie Gibbs wrote:
On 2026-01-04, Niklas Karlsson <nikke.karlsson@gmail.com> wrote:
On 2026-01-04, Peter Flass <Peter@Iron-Spring.com> wrote:
I look at some code and wonder "how the heck has this ever worked?",
but the answer is that no one ever hit that combination of things
before, or used that option.
That's certainly the sensible explanation, but I've had scenarios
like that, even with my own code from the past, where I could swear
up and down that I myself had successfully used that code in the
exact scenario that would obviously break.
Yup. Sounds like a Schrodinbug. It should have never worked, but it
does until you look at it - and then it never works again.
Conversely, it fails until you log a debug statement to see what's
going on and it works. I'd never, never just leave the debug in place,
no siree.
Unless the customer is screaming for a fix RIGHT NOW.
But I'd go back and try to track it down once he's pacified.
"The statement "Pascal has no I/O" originates from
Brian KernighanrCOs 1981 essay, "Why Pascal is Not My Favorite Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
-a-a-a No Low-Level Access: The language lacked a way to override its
strict type system, making it impossible to write its own I/O systems or memory allocators *within the language itself*.
-a-a-a Fixed Array Sizes: Because array size was part of the type, a function could not be written to handle strings or arrays of different lengths, complicating general-purpose file I/O.
-a-a-a Lack of Portability: Standard PascalrCOs I/O was considered "primitive," and any real-world use required implementation-specific extensions that broke portability between compilers."
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
On 2026-01-05, rbowman <bowman@montana.com> wrote:
You can really blame Microsoft for creating MFC; they had to wrap their
C API in something. You can blame them for DDE/OLE/COM, the ATL, and
adopting Hungarian notation.
(I assume you meant "can't really blame" at the beginning there?)
Time to dip into the quotes file again:
Hungarian Notation is the tactical nuclear weapon of source code
obfuscation techniques.
-- Roedy Green
On 1/3/26 01:31, The Natural Philosopher wrote:
"The statement "Pascal has no I/O" originates from
Brian Kernighan's 1981 essay, "Why Pascal is Not My Favorite Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
 - No Low-Level Access: The language lacked a way to override its
   strict type system, making it impossible to write its own I/O
   systems or memory allocators *within the language itself*.
 - Fixed Array Sizes: Because array size was part of the type, a
   function could not be written to handle strings or arrays of
   different lengths, complicating general-purpose file I/O.
 - Lack of Portability: Standard Pascal's I/O was considered
   "primitive," and any real-world use required implementation-specific
   extensions that broke portability between compilers."
Actually, many systems programming languages have no I/O, the idea being
that non-OS programs call the OS to do the I/O, and the OS interacts
directly with the hardware.
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though,
and IIRC that was the sense that Kernighan was using.
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though,
and IIRC that was the sense that Kernighan was using. You can't excuse
limitations by "oh, the OS handles that" when your program *is* the OS.*
Carlos E.R. wrote this post by blinking in Morse code:
What's the difference between C++ and C#? (I don't know how to
pronounce that one).
C-sharp. (Get it? Get it?)
Mmm... no, I don't think I get it. Maybe something cultural in it.
C-pound ...
On Mon, 05 Jan 2026 05:57:23 GMT, Charlie Gibbs wrote:
<snip>
It's been a day or three but I think I did. IIRC it also had the charming
feature of only manifesting in the Windows build, not in Linux where I had
valgrind and electric fence.
Another mystery is why memory debuggers on Windows are expensive and
barely usable. We had a Purify license but configuring the instrumentation
was such a hassle it was rarely used. When the license came up for renewal
nobody spoke up to keep it. BoundsChecker reportedly is even worse.
On 5 Jan 2026 18:10:08 GMT, Niklas Karlsson wrote:
C-pound ...
"C£"?
On 2026-01-05, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On 5 Jan 2026 18:10:08 GMT, Niklas Karlsson wrote:
C-pound ...
"C£"?
# is often spoken as "pound" in the USA. Notably when instructing
someone to enter things on a phone keypad.
On 5 Jan 2026 23:28:43 GMT, Niklas Karlsson wrote:
On 2026-01-05, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On 5 Jan 2026 18:10:08 GMT, Niklas Karlsson wrote:
C-pound ...
"C£"?
# is often spoken as "pound" in the USA. Notably when instructing
someone to enter things on a phone keypad.
I have no idea why.
On 2026-01-06, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On 5 Jan 2026 23:28:43 GMT, Niklas Karlsson wrote:
On 2026-01-05, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On 5 Jan 2026 18:10:08 GMT, Niklas Karlsson wrote:
C-pound ...
"C£"?
# is often spoken as "pound" in the USA. Notably when instructing
someone to enter things on a phone keypad.
I have no idea why.
In days of yore, "#" was often used by dealers in bulk products as
an abbreviation for pounds weight. For instance, a sack of chicken
feed might consist of "50# laying mash".
On Sat, 3 Jan 2026 08:31:33 +0000
The Natural Philosopher <tnp@invalid.invalid> wrote:
"The statement "Pascal has no I/O" originates from
Brian Kernighan's 1981 essay, "Why Pascal is Not My Favorite
Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
Yeah, that was it - not *no* I/O in the sense that was true of Algol,
but weird and constrained in ways that betray its origins as a teaching
language. Mainly, files are assumed to be of a uniform structure; you
can have a FILE OF CHAR or a FILE OF INTEGER, but not a file containing
both strings and integers. If you want to do *that,* you're supposed to
make a struct and have a FILE OF that, but this too has to be the same
across the whole thing. Files of mixed or variable structure? Who uses
*those!?*
Like many of Wirth's design choices, it sounds simple on paper but is
unnecessarily confining in the Real World - and, as Kernighan points
out, there were no "escape hatches" for extending the language from
within, leading to a bunch of proprietary and mutually-incompatible
variants. Obviously, it's been decades and the landscape has changed
substantially, but it really was dunderheaded at the time.
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though,
and IIRC that was the sense that Kernighan was using. You can't excuse
limitations by "oh, the OS handles that" when your program *is* the OS.*
* (Obviously, there's a certain point in any HLL where Deep Magic has
to handle interfacing between language constructs and bare metal, but
the higher up the "threshold of minimum abstraction" is, the less
suitable it is for systems programming in the first place.
Of course, there's also the problem where seemingly *any* language
that's not designed for systems programming will ultimately get
pressed into service for systems programming *somewhere...*)
On Mon, 5 Jan 2026 11:50:58 -0800, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though,
and IIRC that was the sense that Kernighan was using. You can't excuse
limitations by "oh, the OS handles that" when your program *is* the OS.*
That's precisely the point that Peter Flass was trying to make: the
lack of built-in I/O features in a language designed to implement
operating systems isn't a bug, it's a feature.
On 1/5/26 13:49, John Ames wrote:
On Sat, 3 Jan 2026 08:31:33 +0000
The Natural Philosopher <tnp@invalid.invalid> wrote:
"The statement "Pascal has no I/O" originates from
Brian Kernighan's 1981 essay, "Why Pascal is Not My Favorite
Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
Yeah, that was it - not *no* I/O in the sense that was true of Algol,
but weird and constrained in ways that betray its origins as a teaching
language. Mainly, files are assumed to be of a uniform structure; you
can have a FILE OF CHAR or a FILE OF INTEGER, but not a file containing
both strings and integers. If you want to do *that,* you're supposed to
make a struct and have a FILE OF that, but this too has to be the same
across the whole thing. Files of mixed or variable structure? Who uses
*those!?*
Like many of Wirth's design choices, it sounds simple on paper but is
unnecessarily confining in the Real World - and, as Kernighan points
out, there were no "escape hatches" for extending the language from
within, leading to a bunch of proprietary and mutually-incompatible
variants. Obviously, it's been decades and the landscape has changed
substantially, but it really was dunderheaded at the time.
  Wirth was an 'academic' - and Pascal/M2/M3 kind
  of reflect that.
  However it WAS easy to extend the language - add in
  those Real World necessities. By the time Turbo Pascal
  hit the scene there really wasn't anything you could
  not do with Pascal.
  And I still write in Pascal fairly often - like
  it better than 'C'.
On 1/5/26 12:50, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though,
and IIRC that was the sense that Kernighan was using. You can't excuse
limitations by "oh, the OS handles that" when your program *is* the OS.*
* (Obviously, there's a certain point in any HLL where Deep Magic has
to handle interfacing between language constructs and bare metal, but
the higher up the "threshold of minimum abstraction" is, the less
suitable it is for systems programming in the first place.
Of course, there's also the problem where seemingly *any* language
that's not designed for systems programming will ultimately get
pressed into service for systems programming *somewhere...*)
I seem to recall reading that someone once wrote an OS in COBOL.
On 1/5/26 14:42, Lawrence D'Oliveiro wrote:
On Mon, 5 Jan 2026 11:50:58 -0800, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though,
and IIRC that was the sense that Kernighan was using. You can't excuse
limitations by "oh, the OS handles that" when your program *is* the OS.*
That's precisely the point that Peter Flass was trying to make: the
lack of built-in I/O features in a language designed to implement
operating systems isn't a bug, it's a feature.
The I/O package is probably a huge part of any program that uses it.
printf, for example, needs to support the conversion of all possible
data types to character for output.
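To put a face on that - a toy sketch invented for illustration, not
taken from any real libc - here is just the unsigned-decimal case a
printf-style routine has to carry:

#include <stdio.h>

/* Toy illustration: one lone conversion.  Real implementations add
   signed, octal, hex, floating point, field widths, padding and
   locales, which is why the I/O package can dominate the footprint
   of a small program. */
static void put_uint(unsigned n)
{
    char digits[32];              /* plenty for any practical width */
    int i = 0;
    do {
        digits[i++] = (char)('0' + n % 10);
        n /= 10;
    } while (n != 0);
    while (i > 0)
        putchar(digits[--i]);     /* most significant digit first */
}

int main(void)
{
    put_uint(4294967295u);        /* prints 4294967295 */
    putchar('\n');
    return 0;
}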
I seem to recall reading that someone once wrote an OS in COBOL.
I'm not sure to what extent there was an attempt early on to
standardize the extensions, but this would have helped adoption of
the language immensely.
On 1/5/26 17:57, c186282 wrote:
On 1/5/26 13:49, John Ames wrote:
On Sat, 3 Jan 2026 08:31:33 +0000
The Natural Philosopher <tnp@invalid.invalid> wrote:
"The statement "Pascal has no I/O" originates from
Brian Kernighan's 1981 essay, "Why Pascal is Not My Favorite
Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
Yeah, that was it - not *no* I/O in the sense that was true of Algol,
but weird and constrained in ways that betray its origins as a teaching
language. Mainly, files are assumed to be of a uniform structure; you
can have a FILE OF CHAR or a FILE OF INTEGER, but not a file containing
both strings and integers. If you want to do *that,* you're supposed to
make a struct and have a FILE OF that, but this too has to be the same
across the whole thing. Files of mixed or variable structure? Who uses
*those!?*
Like many of Wirth's design choices, it sounds simple on paper but is
unnecessarily confining in the Real World - and, as Kernighan points
out, there were no "escape hatches" for extending the language from
within, leading to a bunch of proprietary and mutually-incompatible
variants. Obviously, it's been decades and the landscape has changed
substantially, but it really was dunderheaded at the time.
  Wirth was an 'academic' - and Pascal/M2/M3 kind
  of reflect that.
  However it WAS easy to extend the language - add in
  those Real World necessities. By the time Turbo Pascal
  hit the scene there really wasn't anything you could
  not do with Pascal.
  And I still write in Pascal fairly often - like
  it better than 'C'.
I'm not sure to what extent there was an attempt early on to standardize
the extensions, but this would have helped adoption of the language
immensely.
On Mon, 5 Jan 2026 20:37:59 -0700, Peter Flass wrote:
I'm not sure to what extent there was an attempt early on to
standardize the extensions, but this would have helped adoption of
the language immensely.
Some degree of UCSD Pascal compatibility was very common among microcomputer-based implementations.
Outside of that ... well, there was ISO 10206.
On Tue, 06 Jan 2026 00:23:56 GMT, Charlie Gibbs wrote:
On 2026-01-06, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On 5 Jan 2026 23:28:43 GMT, Niklas Karlsson wrote:
On 2026-01-05, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On 5 Jan 2026 18:10:08 GMT, Niklas Karlsson wrote:
C-pound ...
"C£"?
# is often spoken as "pound" in the USA. Notably when instructing
someone to enter things on a phone keypad.
I have no idea why.
In days of yore, "#" was often used by dealers in bulk products as an
abbreviation for pounds weight. For instance, a sack of chicken feed
might consist of "50# laying mash".
Most of us used "lb".
rbowman wrote this post by blinking in Morse code:
On Mon, 05 Jan 2026 05:57:23 GMT, Charlie Gibbs wrote:
<snip>
It's been a day or three but I think I did. iirc it also had the
charming feature of only manifesting in the Windows build, not in Linux
where I had valgrind and electric fence.
Another mystery is why memory debuggers on Windows are expensive and
barely usable. We had a Purify license but configuring the
instrumentation was such a hassle it was rarely used. When the license
came up for renewal nobody spoke up to keep it. BoundsChecker
reportedly is even worse.
At work I was using a free product that was a lot like valgrind:
Dr. Memory.
<https://drmemory.org/>
Niklas Karlsson wrote this post by blinking in Morse code:
On 2026-01-05, rbowman <bowman@montana.com> wrote:
You can really blame Microsoft for creating MFC; they had to wrap their C
API in something. You can blame them for DDE/OLE/COM, the ATL, and
adopting Hungarian notation.
(I assume you meant "can't really blame" at the beginning there?)
Time to dip into the quotes file again:
Hungarian Notation is the tactical nuclear weapon of source code
obfuscation techniques.
-- Roedy Green
Agreed. The only warts I use are:
m_ A class member. But I often make an accessor function
without the "m_"
sm_ A static class member.
c_ A constant, not in a class.
Lawrence D'Oliveiro wrote this post by blinking in Morse code:
On Sun, 04 Jan 2026 19:41:11 GMT, Charlie Gibbs wrote:
Does C# qualify as a Microsoft proprietary language? Or are there
implementations on OSes other than Windows (and compilers, either
open source or available from other vendors)?
The only implementation I'm aware of is Microsoft's one built on top
of Dotnet.
Dotnet itself is supposedly open-source and portable to some degree
now. There are reports of it running on Linux.
<https://learn.microsoft.com/en-us/dotnet/core/install/linux>
Install .NET on Linux
Also:
<https://github.com/mono/mono>
On 1/5/26 22:25, Chris Ahlstrom wrote:
Niklas Karlsson wrote this post by blinking in Morse code:
On 2026-01-05, rbowman <bowman@montana.com> wrote:
You can really blame Microsoft for creating MFC; they had to wrap their C
API in something. You can blame them for DDE/OLE/COM, the ATL, and
adopting Hungarian notation.
(I assume you meant "can't really blame" at the beginning there?)
Time to dip into the quotes file again:
Hungarian Notation is the tactical nuclear weapon of source code
obfuscation techniques.
-- Roedy Green
Agreed. The only warts I use are:
m_ A class member. But I often make an accessor function
without the "m_"
sm_ A static class member.
c_ A constant, not in a class.
I thought that all stopped 15-20 years ago when IDEs introduced auto colouring.
On 06/01/2026 03:37, Peter Flass wrote:
  And I still write in Pascal fairly often - like
  it better than 'C'.
I'm not sure to what extent there was an attempt early on to standardize
the extensions, but this would have helped adoption of the language
immensely.
AFAIAC Pascal was C in a straitjacket with all the handy bits removed.
I saw no reason to ever use it in preference.
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
But it CAN be much more friendly and/or
tuned to a particular area of interest
or preferred programming style.
On Mon, 5 Jan 2026 11:50:58 -0800
John Ames <commodorejohn@gmail.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though,
and IIRC that was the sense that Kernighan was using.
Had to go back and double-check myself on this - his essay can be found
at https://www.cs.virginia.edu/~evans/cs655/readings/bwk-on-pascal.html
for those who want to read it. He doesn't use "systems programming" at
all, and his cited examples have to do with general applications rather
than OS implementation. (Of course, the same limitations that plagued
vanilla Pascal for that do it no favors in anything lower-level.)
What he actually says is:
"Pascal's built-in I/O has a deservedly bad reputation. It believes
strongly in record-oriented input and output."
"The I/O design reflects the original operating system upon which
Pascal was designed; even Wirth acknowledges that bias, though not its defects. It is assumed that text files consist of records, that is,
lines of text. When the last character of a line is read, the built-in function 'eoln' becomes true; at that point, one must call 'readln' to initiate reading a new line and reset 'eoln'. Similarly, when the last character of the file is read, the built-in 'eof' becomes true. In both cases, 'eoln' and 'eof' must be tested before each 'read' rather than
after."
"There is no notion at all of access to a file system except for pre-
defined files named by (in effect) logical unit number in the 'program' statement that begins each program. This apparently reflects the CDC
batch system in which Pascal was originally developed. [...] Most imple- mentations of Pascal provide an escape hatch to allow access to files
by name from the outside environment, but not conveniently and not standardly."
"But 'reset' and 'rewrite' are procedures, not functions - there is no
status return and no way to regain control if for some reason the att-
empted access fails. [...] This straitjacket makes it essentially im- possible to write programs that recover from mis-spelled file names,
etc."
"There is no notion of access to command-line arguments, again probably reflecting Pascal's batch-processing origins."
AFAICT some of those may have been solved by the time the ISO standard
was finalized (the standard as I can find it online is much more of a
"committee deciding on points of dispute" document than a language ref
and I can't be bothered to dig that deep.) But none of these points are
matters where the programmer is "helped" by delegating anything to the
OS/runtime environment - indeed, if anything the opposite is true, and
the programmer is needlessly bound to assumptions carried over from one
specific environment (batch-oriented, record- or line-oriented.)
And Kernighan's final summation certainly held true for the original
flavor of the language, however many variants over the years have had
their own (non-standard) fixes:
"The language is inadequate but circumscribed, because there is no way
to escape its limitations. There are no casts to disable the type-
checking when necessary. There is no way to replace the defective run-
time environment with a sensible one, unless one controls the compiler
that defines the 'standard procedures.' The language is closed. [...]
Because the language is so impotent, it must be extended. But each
group extends Pascal in its own direction, to make it look like what-
ever language they really want."
Peter Flass <Peter@Iron-Spring.com> writes:
On 1/3/26 01:31, The Natural Philosopher wrote:
"The statement "Pascal has no I/O" originates from
Brian Kernighan's 1981 essay, "Why Pascal is Not My Favorite Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
 - No Low-Level Access: The language lacked a way to override its
   strict type system, making it impossible to write its own I/O
   systems or memory allocators *within the language itself*.
 - Fixed Array Sizes: Because array size was part of the type, a
   function could not be written to handle strings or arrays of
   different lengths, complicating general-purpose file I/O.
 - Lack of Portability: Standard Pascal's I/O was considered
   "primitive," and any real-world use required implementation-specific
   extensions that broke portability between compilers."
Actually, many systems programming languages have no I/O, the idea being
that non-OS programs call the OS to do the I/O, and the OS interacts
directly with the hardware.
I did quite a bit of systems programming in VAX-11 Pascal. Digital
had extended the language to include the ability to call all the
standard system services directly from Pascal.
[INHERIT('SYS$SHARE:STARLET'),
IDENT('V03-001')]
PROGRAM Users( OUTPUT );
TYPE
Unsigned_byte = [BYTE] 0..255;
Signed_word = [WORD] -32768..+32767;
Unsigned_word = [WORD] 0..65535;
jpi$item = [BYTE(12)] PACKED RECORD
Buffer_length: [POS(0)] Unsigned_word;
Item_code: [POS(16)] Unsigned_word;
Buffer_address: [POS(32),LONG,UNSAFE] UNSIGNED;
Buflen_address: [POS(64),LONG,UNSAFE] UNSIGNED;
END;
On 05/01/2026 17:48, rbowman wrote:
The C changes over the years like being able to declare variables where
they are first used and single line comments were something I greeted
with
"Hell yeah!" rather than "What CS PhD dreamed this crap up?"
Spot on.
C is enough to do the job and simple to learn. Why complicate shit?
On 1/5/26 22:27, Peter Flass wrote:
On 1/5/26 12:50, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though,
and IIRC that was the sense that Kernighan was using. You can't excuse
limitations by "oh, the OS handles that" when your program *is* the OS.*
* (Obviously, there's a certain point in any HLL where Deep Magic has
to handle interfacing between language constructs and bare metal, but
the higher up the "threshold of minimum abstraction" is, the less
suitable it is for systems programming in the first place.
Of course, there's also the problem where seemingly *any* language
that's not designed for systems programming will ultimately get
pressed into service for systems programming *somewhere...*)
I seem to recall reading that someone once wrote an OS in COBOL.
  I remember that too, from somewhere ...
  COBOL is NOT so great for the purpose, but it CAN
  be done.
  FORTRAN would have been better.
On 1/5/26 22:37, Peter Flass wrote:
On 1/5/26 17:57, c186282 wrote:
On 1/5/26 13:49, John Ames wrote:
On Sat, 3 Jan 2026 08:31:33 +0000
The Natural Philosopher <tnp@invalid.invalid> wrote:
"The statement "Pascal has no I/O" originates from
Brian Kernighan's 1981 essay, "Why Pascal is Not My Favorite
Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
Yeah, that was it - not *no* I/O in the sense that was true of Algol,
but weird and constrained in ways that betray its origins as a teaching
language. Mainly, files are assumed to be of a uniform structure; you
can have a FILE OF CHAR or a FILE OF INTEGER, but not a file containing
both strings and integers. If you want to do *that,* you're supposed to
make a struct and have a FILE OF that, but this too has to be the same
across the whole thing. Files of mixed or variable structure? Who uses
*those!?*
Like many of Wirth's design choices, it sounds simple on paper but is
unnecessarily confining in the Real World - and, as Kernighan points
out, there were no "escape hatches" for extending the language from
within, leading to a bunch of proprietary and mutually-incompatible
variants. Obviously, it's been decades and the landscape has changed
substantially, but it really was dunderheaded at the time.
  Wirth was an 'academic' - and Pascal/M2/M3 kind
  of reflect that.
  However it WAS easy to extend the language - add in
  those Real World necessities. By the time Turbo Pascal
  hit the scene there really wasn't anything you could
  not do with Pascal.
  And I still write in Pascal fairly often - like
  it better than 'C'.
I'm not sure to what extent there was an attempt early on to
standardize the extensions, but this would have helped adoption of the
language immensely.
  Turbo Pascal kinda set the Better Standard LONG back.
  For Linux (and Win), this continues with FPC.
  GNU Pascal also supports inline ASM, but in a
  slightly different format.
  Anyway, you COULD write an OS in Pascal. Maybe
  someone has, dunno.
On 06/01/2026 03:27, Peter Flass wrote:
On 1/5/26 12:50, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though,
and IIRC that was the sense that Kernighan was using. You can't excuse
limitations by "oh, the OS handles that" when your program *is* the OS.*
* (Obviously, there's a certain point in any HLL where Deep Magic has
to handle interfacing between language constructs and bare metal, but
the higher up the "threshold of minimum abstraction" is, the less
suitable it is for systems programming in the first place.
Of course, there's also the problem where seemingly *any* language
that's not designed for systems programming will ultimately get
pressed into service for systems programming *somewhere...*)
I seem to recall reading that someone once wrote an OS in COBOL.
From what little I know COBOL looked very like assembler.
On 2026-01-05 19:09, The Natural Philosopher wrote:
On 05/01/2026 17:48, rbowman wrote:
The C changes over the years like being able to declare variables where
they are first used and single line comments were something I greeted
with
"Hell yeah!" rather than "What CS PhD dreamed this crap up?"
Spot on.
C is enough to do the job and simple to learn. Why complicate shit?
My C teacher said it was a mistake to use C as an all purpose language,
like for userland applications. Using C is the cause of many bugs that a
proper language would catch.
That was around 1991.
He knew. He participated in some study tasked by the Canadian government
to study C compilers, but he could not talk about what they wrote.
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler's first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C.
<interjection>
C++ (and some other languages)
have exceptions, C does not have them.
</interjection>
There are several
smaller things; for example, Ada or Pascal modulo is different
from C/Fortran modulo. During optimization passes gcc
keeps such information, to allow better optimization and
error reporting.
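An illustrative aside on that modulo difference (a sketch, with the
Pascal result emulated by hand since this is C):

#include <stdio.h>

int main(void)
{
    /* C (since C99) truncates division toward zero, so % takes the
       sign of the dividend: (-7) % 3 == -1.  ISO Pascal defines
       'mod' to yield a nonnegative result: (-7) mod 3 = 2.  A gcc
       front end has to preserve that distinction. */
    printf("C:      -7 %% 3   = %d\n", -7 % 3);              /* -1 */
    printf("Pascal: -7 mod 3 = %d\n", ((-7 % 3) + 3) % 3);   /*  2 */
    return 0;
}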
There were/are compilers that work by translating to C. But
this has limitations: generated code typically is worse because
language-specific information is lost in translation. Error
reporting is worse because the translator does not do as many
analyses as gcc does. For those reasons the compilers in gcc
generate a common representation which contains the sum of features
of all supported languages, and not C.
Waldek Hebisch wrote this post by blinking in Morse code:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler's first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C.
<interjection>
C++ (and some other languages)
have exceptions, C does not have them.
What about setjmp()/longjmp() ?
On Tue, 6 Jan 2026 13:19:54 +0100
"Carlos E.R." <robin_listas@es.invalid> wrote:
Turbo Pascal had [...]
Sure did! But TP didn't roll out 'til 1983, thirteen years into the
language's existence.
I don't think anyone used the original flavor of the language.
The ISO standard wasn't finalized 'til 1983, the same year as TP; even
UCSD Pascal didn't come around 'til 1977. But it was being used for
teaching well before that, and Kernighan's essay was published in '81,
so people were most definitely using (or trying to use) earlier forms
of the language for stuff.
On 1/6/26 03:10, The Natural Philosopher wrote:
On 06/01/2026 03:27, Peter Flass wrote:
On 1/5/26 12:50, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though, >>>> and IIRC that was the sense that Kernighan was using. You can't excuse >>>> limitations by "oh, the OS handles that" when your program *is* the
OS.*
* (Obviously, there's a certain point in any HLL where Deep Magic has
-a-a to handle interfacing between language constructs and bare metal, >>>> but
-a-a the higher up the "threshold of minimum abstraction" is, the less >>>> -a-a suitable it is for systems programming in the first place.
-a-a Of course, there's also the problem where seemingly *any* language >>>> -a-a that's not designed for systems programming will ultimately get
-a-a pressed into service for systems programming-a *somewhere...*)
I seem to recall reading that someone once wrote an OS in COBOL.
-aFrom what little I know COBOL looked very like assembler.
Nothing at all like it. Higher-level than C, for example.
Also, the "call the OS" part of userland programs has to be
represented somehow in whatever language they are written in. C made
that partially independent of the underlying OS in the sense that the
stdio.h functions work much the same on a range of platforms (but it
does make some assumptions about the OS's underlying IO model). As
well as improving portability, it means a bit less re-learning for
programmers as we migrate around platforms.
On 06/01/2026 16:12, Chris Ahlstrom wrote:
Waldek Hebisch wrote this post by blinking in Morse code:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler's first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C.
<interjection>
C++ (and some other languages)
have exceptions, C does not have them.
What about setjmp()/longjmp() ?
Exactly. The problem with making high-level 'features' in a language is
people then don't see how they actually work.
One of the worst features of C libs is malloc() and free(), where the
underlying mechanism is opaque.
Auto allocation and garbage collection are even worse.
Also operator overloading and weak typing.
You simply do not know where you are.
It's all fearfully clever in a smart-alec sort of way, but it makes for
a lot of problems downstream...
In article <10jjc9s$3uhtk$1@dont-email.me>,
Chris Ahlstrom <OFeem1987@teleworm.us> wrote:
Waldek Hebisch wrote this post by blinking in Morse code:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler's first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C.
<interjection>
C++ (and some other languages)
have exceptions, C does not have them.
What about setjmp()/longjmp() ?
Not at all the same thing. `setjmp`/`longjmp` are about
non-local flows of control; exceptions are about non-local
passing of values.
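A minimal sketch of the emulation in question - setjmp/longjmp for the
non-local control flow, with the "thrown" value parked in storage that
survives the jump (names invented for illustration):

#include <setjmp.h>
#include <stdio.h>

static jmp_buf recover;
static int fault_code;            /* the would-be exception value */

static void do_work(int n)
{
    if (n < 0) {
        fault_code = 42;          /* record the value... */
        longjmp(recover, 1);      /* ...then unwind, C-style */
    }
    printf("worked on %d\n", n);
}

int main(void)
{
    if (setjmp(recover) == 0) {
        do_work(1);
        do_work(-1);              /* triggers the "throw" */
    } else {
        printf("caught fault %d\n", fault_code);
    }
    return 0;
}

Unlike a real C++ throw, nothing is unwound along the way - no
destructors, no cleanup - which is both why it can be fast and why it
is dangerous.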
On 6 Jan 2026 06:24:00 GMT
rbowman <bowman@montana.com> wrote:
In days of yore, "#" was often used by dealers in bulk products as
an abbreviation for pounds weight. For instance, a sack of chicken
feed might consist of "50# laying mash".
Most of us used "lb".
https://en.wikipedia.org/wiki/Number_sign#Usage
"When '#' is after a number, it is read as "pound" or "pounds",
meaning the unit of weight.[54][55] The text "5# bag of flour" would
mean "five-pound bag of flour". This is rare outside North America."
Most of us don't live in New Zealand.
That's been obscure even in the US for many a year, frankly - but the
legacy pronunciation of # survives to this day.
On Tue, 6 Jan 2026 04:10:59 -0000 (UTC), Lawrence D'Oliveiro wrote:
Some degree of UCSD Pascal compatibility was very common among
microcomputer-based implementations.
I remember that being referred to as 'scud pascal'. Dyslexic programmers?
On 2026-01-06, Carlos E.R. <robin_listas@es.invalid> wrote:
My C teacher said it was a mistake to use C as an all purpose language,
like for userland applications. Using C is the cause of many bugs that a
proper language would catch.
That was around 1991.
He knew. He participated in some study tasked by the Canadian government
to study C compilers, but he could not talk about what they wrote.
I agree that C does the job reasonably well, and it is simple.
And so, like most other geeks my age, I write with the tools I
have used in forever, rather than spending my time learning new
tools. For me, those tools are:
- C
- vim
- perl
- HTML (1.0)
And yes, it is like using a vintage Jeep for a daily driver.
The most egregious problem with old C is string handling.
A useful "string" type would have
- a maximum length, using hardware (exception) bounds checking.
to be useful, this would mean a length field in front of
the char[]
- ideally, an option for the length to be dynamic, reallocating
the memory as needed. Would require the base representation
to be a pointer to the struct. Would be a lot of "under the
hood" stuff, and probably inefficient.
That's been obscure even in the US for many a year, frankly - but
the legacy pronunciation of # survives to this day.
I suspect that this is because "pound" is (at least somewhat)
easier and faster to pronounce than the others. As we all know,
convenience trumps just about everything else - remember the
"baud" vs "bps" confusion.
On 1/5/26 11:50, Chris Ahlstrom wrote:
Lawrence D'Oliveiro wrote this post by blinking in Morse code:
On Sun, 04 Jan 2026 19:41:11 GMT, Charlie Gibbs wrote:
Does C# qualify as a Microsoft proprietary language? Or are there
implementations on OSes other than Windows (and compilers, either
open source or available from other vendors)?
The only implementation I'm aware of is Microsoft's one built on top
of Dotnet.
Dotnet itself is supposedly open-source and portable to some degree
now. There are reports of it running on Linux.
<https://learn.microsoft.com/en-us/dotnet/core/install/linux>
Install .NET on Linux
Also:
<https://github.com/mono/mono>
C# is a lovely language, but isn't different enough from Java to make it
worthwhile doing something with much less online support when using
Linux.
The Natural Philosopher wrote this post by blinking in Morse code:
On 06/01/2026 03:37, Peter Flass wrote:
  And I still write in Pascal fairly often - like it better than
  'C'.
I'm not sure to what extent there was an attempt early on to
standardize the extensions, but this would have helped adoption of the
language immensely.
AFAIAC Pascal was C in a straitjacket with all the handy bits removed.
I saw no reason to ever use it in preference.
I remember at work recommending Borland C++, which I really liked based
on Turbo C++ (IIRC).
Imagine my dismay when seeing weird behavior in the debugger and then
finding out that the VCL framework was... Delphi (Object Pascal).
On 06/01/2026 14:46, Peter Flass wrote:
On 1/6/26 03:10, The Natural Philosopher wrote:
On 06/01/2026 03:27, Peter Flass wrote:
On 1/5/26 12:50, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though,
and IIRC that was the sense that Kernighan was using. You can't excuse
limitations by "oh, the OS handles that" when your program *is* the OS.*
* (Obviously, there's a certain point in any HLL where Deep Magic has
to handle interfacing between language constructs and bare metal, but
the higher up the "threshold of minimum abstraction" is, the less
suitable it is for systems programming in the first place.
Of course, there's also the problem where seemingly *any* language
that's not designed for systems programming will ultimately get
pressed into service for systems programming *somewhere...*)
I seem to recall reading that someone once wrote an OS in COBOL.
From what little I know COBOL looked very like assembler.
Nothing at all like it. Higher-level than C, for example.
Well I will simply disagree. Business transactions are very simple beasts.
cross@spitfire.i.gajendra.net (Dan Cross) writes:
In article <10jjc9s$3uhtk$1@dont-email.me>,
Chris Ahlstrom <OFeem1987@teleworm.us> wrote:
Waldek Hebisch wrote this post by blinking in Morse code:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler's first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C.
<interjection>
C++ (and some other languages)
have exceptions, C does not have them.
What about setjmp()/longjmp() ?
Not at all the same thing. `setjmp`/`longjmp` are about
non-local flows of control; exceptions are about non-local
passing of values.
However, in many real-world situations, [sig]setjmp and
[sig]longjmp can be used to emulate exceptions.
I have a C++ application that models a computer (Burroughs V380
et alia). The thread that models each processor (cpu) uses
longjmp whenever a condition is encountered that would have
been signaled as a fault on the real cpu. The processor code
doesn't do dynamic memory allocation; and the fault code is
stored in the processor class before the longjmp call.
I once tried replacing setjmp/longjmp with C++ exceptions which
led to a 20% reduction in simulated CPU performance (as measured
by the time to compile a COBOL program).
They did not teach us any of that when we learnt Pascal on a VAX at
uni.
I think Brinch-Hansen used Modula-2.
PL/I was designed with all the features that were later added to C,
so the end result is cleaner.
Inspired by readline(), I've written my own replacements for strcpy()
and strcat() that do much the same thing.
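One plausible shape for such replacements - bounded, always
NUL-terminated, reporting truncation instead of overrunning, in the
spirit of the BSD strlcpy/strlcat (these particular names are
invented):

#include <stdio.h>
#include <string.h>

static int str_copy(char *dst, size_t cap, const char *src)
{
    size_t n = strlen(src);
    if (cap == 0)
        return -1;
    if (n >= cap) {               /* won't fit: copy what does, flag it */
        memcpy(dst, src, cap - 1);
        dst[cap - 1] = '\0';
        return -1;
    }
    memcpy(dst, src, n + 1);
    return 0;
}

static int str_cat(char *dst, size_t cap, const char *src)
{
    size_t used = strlen(dst);
    if (used >= cap)
        return -1;
    return str_copy(dst + used, cap - used, src);
}

int main(void)
{
    char buf[8] = "";
    if (str_cat(buf, sizeof buf, "hello ") != 0 ||
        str_cat(buf, sizeof buf, "world") != 0)
        puts("truncated");        /* buf holds "hello w", still a string */
    printf("%s\n", buf);
    return 0;
}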
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler's first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C. C++ (and some other languages)
have exceptions, C does not have them. There are several
smaller things; for example, Ada or Pascal modulo is different
from C/Fortran modulo. During optimization passes gcc
keeps such information, to allow better optimization and
error reporting.
There were/are compilers that work by translating to C. But
this has limitations: generated code typically is worse because
language-specific information is lost in translation. Error
reporting is worse because the translator does not do as many
analyses as gcc does. For those reasons the compilers in gcc
generate a common representation which contains the sum of features
of all supported languages, and not C.
On Tue, 6 Jan 2026 07:46:51 -0700, Peter Flass wrote:
On 1/6/26 03:10, The Natural Philosopher wrote:
From what little I know COBOL looked very like assembler.
Nothing at all like it. Higher-level than C, for example.
The irony of COBOL is that it was designed strictly for "business"
needs, but the definition of "business" needs the committee used was
frozen in time at about 1960, and never made much progress afterwards.
For example: no support for transaction processing.
In article <84c7R.819121$PGrb.160843@fx10.iad>,
Scott Lurndal <slp53@pacbell.net> wrote:
cross@spitfire.i.gajendra.net (Dan Cross) writes:
In article <10jjc9s$3uhtk$1@dont-email.me>,
Chris Ahlstrom <OFeem1987@teleworm.us> wrote:
Waldek Hebisch wrote this post by blinking in Morse code:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C.
<interjection>
C++ (and some other languages)
have exceptions; C does not have them.
What about setjmp()/longjmp() ?
Not at all the same thing. `setjmp`/`longjmp` are about
non-local flows of control; exceptions are about non-local
passing of values.
However, in many real world situations, [sig]setjmp and
[sig]longjmp can be used to emulate exceptions.
Yes, I said just that. :-)
I have a C++ application that models a computer (Burroughs V380
et alia). The thread that models each processor (cpu) uses
longjmp whenever a condition is encountered that would have
been signaled as a fault on the real cpu. The processor code
doesn't do dynamic memory allocation; and the fault code is
stored in the processor class before the longjmp call.
I once tried replacing setjmp/longjmp with C++ exceptions which
led to a 20% reduction in simulated CPU performance (as measured
by the time to compile a COBOL program).
Huh. Interesting. I wonder why...possibly to run a bunch of
nop destructors?
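For the archives, a minimal sketch of the longjmp-as-fault pattern
described above (illustrative only - fault_env, fault_code and
FAULT_DIV0 are invented names, not taken from the simulator being
discussed):

#include <setjmp.h>
#include <stdio.h>

static jmp_buf fault_env;   /* where control returns on a "fault" */
static int fault_code;      /* stored before the longjmp, as above */

enum { FAULT_DIV0 = 1 };

static int divide(int a, int b)
{
    if (b == 0) {
        fault_code = FAULT_DIV0;
        longjmp(fault_env, 1);  /* non-local jump back to the handler */
    }
    return a / b;
}

int main(void)
{
    if (setjmp(fault_env) == 0) {
        printf("10/2 = %d\n", divide(10, 2));
        printf("10/0 = %d\n", divide(10, 0));  /* "raises" the fault */
    } else {
        printf("caught fault %d\n", fault_code);
    }
    return 0;
}

No destructors run and no unwind tables are consulted on that path,
which is consistent with the performance difference reported.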
On Tue, 06 Jan 2026 18:57:04 GMT, Charlie Gibbs wrote:
Inspired by readline(), I've written my own replacements for strcpy()
and strcat() that do much the same thing.
To quote from the strcat man page "Read about Shlemiel the painter.".
stpcpy() was a late arrival and I never used it. I do use a similar
construct
char buf[1024];
char* ptr = buf;
ptr += sprintf(ptr, "%s", "some stuff");
ptr += sprintf(ptr, "%s", " some more stuff");
I'd forgotten ... p-System was the "3rd OS" offered for the original
IBM-PC. Alas it was over-priced and under-performing, so ....
On 1/5/26 21:18, c186282 wrote:
On 1/5/26 22:37, Peter Flass wrote:
On 1/5/26 17:57, c186282 wrote:
On 1/5/26 13:49, John Ames wrote:
On Sat, 3 Jan 2026 08:31:33 +0000
The Natural Philosopher <tnp@invalid.invalid> wrote:
"The statement "Pascal has no I/O" originates from
Brian Kernighan's 1981 essay, "Why Pascal is Not My Favorite
Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
Yeah, that was it - not *no* I/O in the sense that was true of Algol,
but weird and constrained in ways that betray its origins as a teaching
language. Mainly, files are assumed to be of a uniform structure; you
can have a FILE OF CHAR or a FILE OF INTEGER, but not a file containing
both strings and integers. If you want to do *that,* you're supposed to
make a struct and have a FILE OF that, but this too has to be the same
across the whole thing. Files of mixed or variable structure? Who uses
*those!?*

Like many of Wirth's design choices, it sounds simple on paper but is
unnecessarily confining in the Real World - and, as Kernighan points
out, there were no "escape hatches" for extending the language from
within, leading to a bunch of proprietary and mutually-incompatible
variants. Obviously, it's been decades and the landscape has changed
substantially, but it really was dunderheaded at the time.
  Wirth was an 'academic' - and Pascal/M2/M3 kind
  of reflect that.
  However it WAS easy to extend the language - add in
  those Real World necessities. By the time Turbo Pascal
  hit the scene there really wasn't anything you could
  not do with Pascal.
  And I still write in Pascal fairly often - like
  it better than 'C'.
I'm not sure to what extent there was an attempt early on to
standardize the extensions, but this would have helped adoption of the
language immensely.
  Turbo Pascal kinda set the Better Standard LONG back.
  For Linux (and Win), this continues with FPC.
  GNU Pascal also supports inline ASM, but in a
  slightly different format.
  Anyway, you COULD write an OS in Pascal. Maybe
  someone has, dunno.
I think Brinch-Hansen used Modula-2.
rbowman <bowman@montana.com> writes:
On Tue, 06 Jan 2026 18:57:04 GMT, Charlie Gibbs wrote:
Inspired by readline(), I've written my own replacements for strcpy()
and strcat() that do much the same thing.
To quote from the strcat man page "Read about Shlemiel the painter.".
stpcpy() was a late arrival and I never used it. I do use a similar
construct
char buf[1024];
char* ptr = buf;
ptr += sprintf(ptr, "%s", "some stuff");
ptr += sprintf(ptr, "%s", " some more stuff");
I would suggest using snprintf instead of sprintf
to prevent accesses beyond (buf + 1024). A bit
more complicated if you want to know that the
result was truncated, since you need to adjust the
remaining length based on the return value from
the prior snprintf, as well as checking for
overflow.
On 1/6/26 07:16, Waldek Hebisch wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C. C++ (and some other languages)
have exceptions; C does not have them. There are several
smaller things; for example, Ada or Pascal modulo is different
from C/Fortran modulo. During optimization passes gcc
keeps such information, to allow better optimization and
error reporting.
There were/are compilers that work by translating to C. But
this has limitations: generated code is typically worse because
language-specific information is lost in translation. Error
reporting is worse because a translator does not do as many
analyses as gcc does. For those reasons the compilers in gcc
generate a common representation which contains the sum of features
of all supported languages, not C.
You give it a file in whatever lang, it produces
a file in 'C' and compiles that.
So, I'll basically
stick with my 'translator' def. And if 'C' does not
'natively support' something you can FAKE it with code,
not really anything you CAN'T do with 'C'.
By 'compiler' I mean "source in -> (agitating sounds) ->
binary executable out."
I think there are still a few FORTRAN compilers out
there for Linux, maybe COBOL too. There's at least
one forth IDE/compiler. Digital Mars makes 'C' and
'D' compilers. GCC is not the alpha and omega
of software development.
But it CAN be much more friendly and/or
tuned to a particular area of interest
or preferred programming style.
On 1/6/26 05:10, The Natural Philosopher wrote:
On 06/01/2026 03:27, Peter Flass wrote:
On 1/5/26 12:50, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though, >>>> and IIRC that was the sense that Kernighan was using. You can't excuse >>>> limitations by "oh, the OS handles that" when your program *is* the OS.* >>>>
* (Obviously, there's a certain point in any HLL where Deep Magic has
-a-a to handle interfacing between language constructs and bare metal, but >>>> -a-a the higher up the "threshold of minimum abstraction" is, the less >>>> -a-a suitable it is for systems programming in the first place.
-a-a Of course, there's also the problem where seemingly *any* language >>>> -a-a that's not designed for systems programming will ultimately get
-a-a pressed into service for systems programming-a *somewhere...*)
I seem to recall reading that someone once wrote an OS in COBOL.
From what little I know COBOL looked very like assembler.
If assembler was RIDICULOUSLY WORDY :-)
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
On 1/6/26 07:16, Waldek Hebisch wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C. C++ (and some other languages)
have exceptions; C does not have them. There are several
smaller things; for example, Ada or Pascal modulo is different
from C/Fortran modulo. During optimization passes gcc
keeps such information, to allow better optimization and
error reporting.
There were/are compilers that work by translating to C. But
this has limitations: generated code is typically worse because
language-specific information is lost in translation. Error
reporting is worse because a translator does not do as many
analyses as gcc does. For those reasons the compilers in gcc
generate a common representation which contains the sum of features
of all supported languages, not C.
You give it a file in whatever lang, it produces
a file in 'C' and compiles that.
No, if you look at what the compilers in gcc are doing you
will see that there is no intermediate C file. There
is intermediate assembler, but between source file and
assembler each compiler works independently.
AFAIK you can remove the C compiler binary and the other compilers
in gcc will still work.
So, I'll basically
stick with my 'translator' def. And if 'C' does not
'natively support' something you can FAKE it with code,
not really anything you CAN'T do with 'C'.
As I wrote, you can use "via C" translators, but results are
not as good as with dedicated compilers, which is why gcc
contains separate compilers.
By 'compiler' I mean "source in -> (agitating sounds) ->
binary executable out."
By that definition gcc does _not_ contain a C compiler:
gcc generates assembly, and then the assembler and linker produce the
final executable.
scott@slp53.sl.home (Scott Lurndal) writes:
rbowman <bowman@montana.com> writes:
On Tue, 06 Jan 2026 18:57:04 GMT, Charlie Gibbs wrote:
Inspired by readline(), I've written my own replacements for strcpy()
and strcat() that do much the same thing.
To quote from the strcat man page "Read about Shlemiel the painter.".
stpcpy() was a late arrival and I never used it. I do use a similar
construct
char buf[1024];
char* ptr = buf;
ptr += sprintf(ptr, "%s", "some stuff");
ptr += sprintf(ptr, "%s", " some more stuff");
I would suggest using snprintf instead of sprintf to prevent accesses
beyond (buf + 1024). A bit more complicated if you want to know that
the result was truncated, since you need to adjust the remaining length
based on the return value from the prior snprintf, as well as checking
for overflow.
This is calling out for a wrapping up in a function or two that can do
the book-keeping automatically (and use an expandable buffer, if the use
case demands).
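Something along these lines would do the book-keeping (a sketch only;
struct strbuf and sb_printf are invented names, and a fancier version
would grow the buffer instead of just flagging truncation):

#include <stdarg.h>
#include <stdio.h>

struct strbuf {
    char  *pos;     /* next write position */
    size_t left;    /* space remaining, including the final NUL */
    int    trunc;   /* set once any append gets cut short */
};

static void sb_init(struct strbuf *sb, char *buf, size_t size)
{
    sb->pos = buf;
    sb->left = size;
    sb->trunc = 0;
    if (size > 0)
        buf[0] = '\0';
}

static void sb_printf(struct strbuf *sb, const char *fmt, ...)
{
    va_list ap;
    int n;

    if (sb->left == 0) {        /* already full: drop and flag */
        sb->trunc = 1;
        return;
    }
    va_start(ap, fmt);
    n = vsnprintf(sb->pos, sb->left, fmt, ap);
    va_end(ap);
    if (n < 0 || (size_t)n >= sb->left) {
        sb->trunc = 1;          /* output was cut short (or errored) */
        sb->left = 0;           /* later calls become no-ops */
    } else {
        sb->pos  += n;
        sb->left -= (size_t)n;
    }
}

int main(void)
{
    char buf[1024];
    struct strbuf sb;

    sb_init(&sb, buf, sizeof buf);
    sb_printf(&sb, "%s", "some stuff");
    sb_printf(&sb, "%s", " some more stuff");
    printf("%s (truncated: %d)\n", buf, sb.trunc);
    return 0;
}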
On Tue, 6 Jan 2026 16:04:17 -0500
c186282 <c186282@nnada.net> wrote:
I'd forgotten ... p-System was the "3rd OS" offered for the original
IBM-PC. Alas it was over-priced and under-performing, so ....
Yeah - a forgotten entry in the saga of write-once-run-anywhere dreams,
right up there with Java workstations...
In alt.folklore.computers Peter Flass <Peter@iron-spring.com> wrote:
On 1/5/26 21:18, c186282 wrote:
On 1/5/26 22:37, Peter Flass wrote:
On 1/5/26 17:57, c186282 wrote:
On 1/5/26 13:49, John Ames wrote:
On Sat, 3 Jan 2026 08:31:33 +0000
The Natural Philosopher <tnp@invalid.invalid> wrote:
"The statement "Pascal has no I/O" originates from
Brian Kernighan's 1981 essay, "Why Pascal is Not My Favorite
Programming Language".
Kernighan argued that the original 1970 definition of Pascal was
severely limited for systems programming because:
Yeah, that was it - not *no* I/O in the sense that was true of Algol,
but weird and constrained in ways that betray its origins as a teaching
language. Mainly, files are assumed to be of a uniform structure; you
can have a FILE OF CHAR or a FILE OF INTEGER, but not a file containing
both strings and integers. If you want to do *that,* you're supposed to
make a struct and have a FILE OF that, but this too has to be the same
across the whole thing. Files of mixed or variable structure? Who uses
*those!?*

Like many of Wirth's design choices, it sounds simple on paper but is
unnecessarily confining in the Real World - and, as Kernighan points
out, there were no "escape hatches" for extending the language from
within, leading to a bunch of proprietary and mutually-incompatible
variants. Obviously, it's been decades and the landscape has changed
substantially, but it really was dunderheaded at the time.
  Wirth was an 'academic' - and Pascal/M2/M3 kind
  of reflect that.
  However it WAS easy to extend the language - add in
  those Real World necessities. By the time Turbo Pascal
  hit the scene there really wasn't anything you could
  not do with Pascal.
  And I still write in Pascal fairly often - like
  it better than 'C'.
I'm not sure to what extent there was an attempt early on to
standardize the extensions, but this would have helped adoption of the
language immensely.
  Turbo Pascal kinda set the Better Standard LONG back.
  For Linux (and Win), this continues with FPC.
  GNU Pascal also supports inline ASM, but in a
  slightly different format.
  Anyway, you COULD write an OS in Pascal. Maybe
  someone has, dunno.
I think Brinch-Hansen used Modula-2.
I remember the name Concurrent Pascal. My impression was that
Brinch-Hansen used Concurrent Pascal.
The TRUE 'All-Everything System' will be the AIs.
This may NOT be such a great thing, but with the TRILLIONS invested
it's GOING to be The Thing. 'Thin' clients plugged only into the
Higher Intelligence.
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
On 1/6/26 07:16, Waldek Hebisch wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C. C++ (and some other languages)
have exceptions; C does not have them. There are several
smaller things; for example, Ada or Pascal modulo is different
from C/Fortran modulo. During optimization passes gcc
keeps such information, to allow better optimization and
error reporting.
There were/are compilers that work by translating to C. But
this has limitations: generated code is typically worse because
language-specific information is lost in translation. Error
reporting is worse because a translator does not do as many
analyses as gcc does. For those reasons the compilers in gcc
generate a common representation which contains the sum of features
of all supported languages, not C.
You give it a file in whatever lang, it produces
a file in 'C' and compiles that.
No, if you look at what the compilers in gcc are doing you
will see that there is no intermediate C file. There
is intermediate assembler, but between source file and
assembler each compiler works independently.
AFAIK you can remove the C compiler binary and the other compilers
in gcc will still work.
So, I'll basically
stick with my 'translator' def. And if 'C' does not
'natively support' something you can FAKE it with code,
not really anything you CAN'T do with 'C'.
As I wrote, you can use "via C" translators, but results are
not as good as with dedicated compilers, which is why gcc
contains separate compilers.
By 'compiler' I mean "source in -> (agitating sounds) ->
binary executable out."
By that definition gcc does _not_ contain a C compiler:
gcc generates assembly, and then the assembler and linker produce the
final executable. Things are more complicated when you use
LTO, because the "linker" in this case actually does a large part
of the compiler's work and optimizes code before producing the final
executable. But non-LTO compilation works via assembly.
I think there are still a few FORTRAN compilers out
there for Linux, maybe COBOL too. There's at least
one forth IDE/compiler. Digital Mars makes 'C' and
'D' compilers. GCC is not the alpha and omega
of software development.
But it CAN be much more friendly and/or
tuned to a particular area of interest
or preferred programming style.
On Tue, 6 Jan 2026 20:47:48 -0500, c186282 wrote:
The TRUE 'All-Everything System' will be the AIs.
This may NOT be such a great thing, but with the TRILLIONS invested
it's GOING to be The Thing. 'Thin' clients plugged only into the
Higher Intelligence.
https://arstechnica.com/gadgets/2026/01/dells-xps-revival-is-a-welcome-reprieve-from-the-ai-pc-fad/
Does Dell see a little gnome with a pin approaching the bubble?
On 1/6/26 01:28, rbowman wrote:
On Tue, 6 Jan 2026 04:10:59 -0000 (UTC), Lawrence D'Oliveiro wrote:
Some degree of UCSD Pascal compatibility was very common among
microcomputer-based implementations.
I remember that being referred to as 'scud pascal'. Dyslexic programmers?
Heh, maybe :-)
But you CAN see why.
On Tue, 06 Jan 2026 18:57:03 GMT
Charlie Gibbs <cgibbs@kltpzyxm.invalid> wrote:
That's been obscure even in the US for many a year, frankly - but
the legacy pronunciation of # survives to this day.
I suspect that this is because "pound" is (at least somewhat)
easier and faster to pronounce than the others. As we all know,
convenience trumps just about everything else - remember the
"baud" vs "bps" confusion.
Seems plausible - may also have to do with phone-tree systems and how intelligible "hash" is or isn't over a muffled line, vs. a word that
begins and ends with hard consonants.
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
On 1/6/26 07:16, Waldek Hebisch wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C. C++ (and some other languages)
have exceptions; C does not have them. There are several
smaller things; for example, Ada or Pascal modulo is different
from C/Fortran modulo. During optimization passes gcc
keeps such information, to allow better optimization and
error reporting.
There were/are compilers that work by translating to C. But
this has limitations: generated code is typically worse because
language-specific information is lost in translation. Error
reporting is worse because a translator does not do as many
analyses as gcc does. For those reasons the compilers in gcc
generate a common representation which contains the sum of features
of all supported languages, not C.
You give it a file in whatever lang, it produces
a file in 'C' and compiles that.
No, if you look at what the compilers in gcc are doing you
will see that there is no intermediate C file. There
is intermediate assembler, but between source file and
assembler each compiler works independently.
In article <10jjg7k$5l5$2@dont-email.me>,
The Natural Philosopher <tnp@invalid.invalid> wrote:
On 06/01/2026 14:46, Peter Flass wrote:
On 1/6/26 03:10, The Natural Philosopher wrote:
From what little I know COBOL looked very like assembler.
Nothing at all like it. Higher-level than C, for example.
Well I will simply disagree. Business transactions are very simple beasts.
I think it's best to think of COBOL as a DSL for business data
processing. Sure, one can write a compiler in it...but one can
also write a compiler in `sed`. Outside of a satisfying a dare
or winning a bet, it doesn't seem like a very good idea.
On 1/6/26 17:22, John Ames wrote:
On Tue, 6 Jan 2026 16:04:17 -0500
c186282 <c186282@nnada.net> wrote:
I'd forgotten ... p-System was the "3rd OS" offered for the original
IBM-PC. Alas it was over-priced and under-performing, so ....
Yeah - a forgotten entry in the saga of write-once-run-anywhere dreams,
right up there with Java workstations...
Well, I'm glad people THINK of such things ... alas
all attempts have been for naught. 'Generic solutions'
require too many compromises.
The TRUE 'All-Everything System' will be the AIs.
This may NOT be such a great thing, but with the
TRILLIONS invested it's GOING to be The Thing.
'Thin' clients plugged only into the Higher
Intelligence.
Unaccountable People You Don't Know will be in charge
of tasking and biasing the Higher Intelligence for
awhile - then it'll start taking care of itself.
Wait, watch, see.
On Tue, 06 Jan 2026 22:54:26 +0000, Richard Kettlewell wrote:
scott@slp53.sl.home (Scott Lurndal) writes:
rbowman <bowman@montana.com> writes:
On Tue, 06 Jan 2026 18:57:04 GMT, Charlie Gibbs wrote:
Inspired by readline(), I've written my own replacements for strcpy()
and strcat() that do much the same thing.
To quote from the strcat man page "Read about Shlemiel the painter.".
stpcpy() was a late arrival and I never used it. I do use a similar
construct
char buf[1024];
char* ptr = buf;
ptr += sprintf(ptr, "%s", "some stuff");
ptr += sprintf(ptr, "%s", " some more stuff");
I would suggest using snprintf instead of sprintf to prevent accesses
beyond (buf + 1024). A bit more complicated if you want to know that
the result was truncated, since you need to adjust the remaining length
based on the return value from the prior snprintf, as well as checking
for overflow.
This is calling out for a wrapping up in a function or two that can do
the book-keeping automatically (and use an expandable buffer, if the use
case demands).
Ah, mission creep...
On 2026-01-06, Dan Cross <cross@spitfire.i.gajendra.net> wrote:
In article <10jjg7k$5l5$2@dont-email.me>,
The Natural Philosopher <tnp@invalid.invalid> wrote:
On 06/01/2026 14:46, Peter Flass wrote:
On 1/6/26 03:10, The Natural Philosopher wrote:
From what little I know COBOL looked very like assembler.
<snicker>
Nothing at all like it. Higher-level than C, for example.
Well I will simply disagree. Business transactions are very simple beasts.
You've never worked on a payroll system, have you?

Yes. I have,
I think it's best to think of COBOL as a DSL for business data
processing. Sure, one can write a compiler in it...but one can
also write a compiler in `sed`. Outside of a satisfying a dare
or winning a bet, it doesn't seem like a very good idea.
A friend once wrote an 8080 cross-assembler in COBOL.
It ran rings around Univac's official cross-assembler -
which was written in FORTRAN.
remember the "baud" vs "bps" confusion.
IIRC they are not, strictly, the same thing...
On 1/6/26 05:10, The Natural Philosopher wrote:
On 06/01/2026 03:27, Peter Flass wrote:
On 1/5/26 12:50, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though, >>>> and IIRC that was the sense that Kernighan was using. You can't excuse >>>> limitations by "oh, the OS handles that" when your program *is* the
OS.*
* (Obviously, there's a certain point in any HLL where Deep Magic has
-a-a to handle interfacing between language constructs and bare metal, >>>> but
-a-a the higher up the "threshold of minimum abstraction" is, the less >>>> -a-a suitable it is for systems programming in the first place.
-a-a Of course, there's also the problem where seemingly *any* language >>>> -a-a that's not designed for systems programming will ultimately get
-a-a pressed into service for systems programming-a *somewhere...*)
I seem to recall reading that someone once wrote an OS in COBOL.
From what little I know COBOL looked very like assembler.
If assembler was RIDICULOUSLY WORDY :-)
It's ironic watching the industry change from centralized
systems in the '60s and '70s (due to the high cost of
electronics) to distributed systems starting in the '80s,
only to have it come full circle now. The difference is
that rather than cost, the driving factor is centralized
control.
In article <ZN-dnYy-SfLC5MD0nZ2dnZfqnPednZ2d@giganews.com>,
c186282 <c186282@nnada.net> wrote:
On 1/6/26 05:10, The Natural Philosopher wrote:
On 06/01/2026 03:27, Peter Flass wrote:
On 1/5/26 12:50, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700
Peter Flass <Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea >>>>>> being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS, though, >>>>> and IIRC that was the sense that Kernighan was using. You can't excuse >>>>> limitations by "oh, the OS handles that" when your program *is* the OS.* >>>>>
* (Obviously, there's a certain point in any HLL where Deep Magic has >>>>> -a-a to handle interfacing between language constructs and bare metal, but
-a-a the higher up the "threshold of minimum abstraction" is, the less >>>>> -a-a suitable it is for systems programming in the first place.
-a-a Of course, there's also the problem where seemingly *any* language >>>>> -a-a that's not designed for systems programming will ultimately get >>>>> -a-a pressed into service for systems programming-a *somewhere...*) >>>>>
I seem to recall reading that someone once wrote an OS in COBOL.
From what little I know COBOL looked very like assembler.
If assembler was RIDICULOUSLY WORDY :-)
MOVE THE IMMEDIATE MODE OPERAND WITH VALUE 42 INTO REGISTER A0
AND ADD THE VALUE AT THE LOCATION 1234 DECIMAL GIVING A BYTE
RESULT STORING INTO REGISTER "Z ZERO"
- Dan C.
No, if you look at what the compilers in gcc are doing you
will see that there is no intermediate C file. There
is intermediate assembler, but between source file and
assembler each compiler works independently.
AFAIK you can remove the C compiler binary and the other compilers
in gcc will still work.
So, I'll basically
stick with my 'translator' def. And if 'C' does not
'natively support' something you can FAKE it with code,
not really anything you CAN'T do with 'C'.
As I wrote, you can use "via C" translators, but results are
not as good as with dedicated compilers, which is why gcc
contains separate compilers.
By 'compiler' I mean "source in -> (agitating sounds) ->
binary executable out."
By that definition gcc does _not_ contain a C compiler:
gcc generates assembly, and then the assembler and linker produce the
final executable. Things are more complicated when you use
LTO, because the "linker" in this case actually does a large part
of the compiler's work and optimizes code before producing the final
executable. But non-LTO compilation works via assembly.
In alt.folklore.computers Peter Flass <Peter@iron-spring.com> wrote:
[snip]
I think Brinch-Hansen used Modula-2.
I remember the name Concurrent Pascal. My impression was that
Brinch-Hansen used Concurrent Pascal.
On 1/6/26 07:16, Waldek Hebisch wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C. C++ (and some other languages)
have exceptions; C does not have them. There are several
smaller things; for example, Ada or Pascal modulo is different
from C/Fortran modulo. During optimization passes gcc
keeps such information, to allow better optimization and
error reporting.
There were/are compilers that work by translating to C. But
this has limitations: generated code is typically worse because
language-specific information is lost in translation. Error
reporting is worse because a translator does not do as many
analyses as gcc does. For those reasons the compilers in gcc
generate a common representation which contains the sum of features
of all supported languages, not C.
You give it a file in whatever lang, it produces
a file in 'C' and compiles that.
On Tue, 6 Jan 2026 08:03:13 +0000, Pancho wrote:
C# is a lovely language, but isn't different enough from Java to make it
worthwhile doing something with much less online support when using
Linux.
C# is what Java should have been, I had hopes for Java in the late '90s
that were dashed when it became bloated and slow.
On 2026-01-06, Lars Poulsen <lars@beagle-ears.com> wrote:
On 2026-01-06, Carlos E.R. <robin_listas@es.invalid> wrote:
My C teacher said it was a mistake to use C as an all purpose language,
like for userland applications. Using C is the cause of many bugs that a
proper language would catch.
That was around 1991.
He knew. He participated in some study tasked by the Canadian government
to study C compilers, but he could not talk about what they wrote.
What language(s) did he suggest instead?
On Tue, 6 Jan 2026 13:19:54 +0100
"Carlos E.R." <robin_listas@es.invalid> wrote:
Turbo Pascal had [...]
Sure did! But TP didn't roll out 'til 1983, thirteen years into the language's existence.
I don't think anyone used the original flavor of the language.
The ISO standard wasn't finalized 'til 1983, the same year as TP; even
UCSD Pascal didn't come around 'til 1977. But it was being used for
teaching well before that, and Kernighan's essay was published in '81,
so people were most definitely using (or trying to use) earlier forms
of the language for stuff.
On 1/6/26 11:30, John Ames wrote:
On Tue, 6 Jan 2026 13:19:54 +0100
"Carlos E.R." <robin_listas@es.invalid> wrote:
Turbo Pascal had [...]
Sure did! But TP didn't roll out 'til 1983, thirteen years into the
language's existence.
I don't think anyone used the original flavor of the language.
The ISO standard wasn't finalized 'til 1983, the same year as TP; even
UCSD Pascal didn't come around 'til 1977. But it was being used for
teaching well before that, and Kernighan's essay was published in '81,
so people were most definitely using (or trying to use) earlier forms
of the language for stuff.
  I used the M$/IBM multi-pass Pascal compiler (still
  have it in a VM) I *think* that came out maybe a
  year before TP.
  Remember seeing a little ad in a magazine for TP.
  The price was good, the claims seemed impressive.
  So, I bought it. NOT disappointed. Made development
  unbelievably quicker/easier. Had to wait until v3
  to get good graphics though. Even found a good use
  for the 'turtle'.
On Tue, 6 Jan 2026 13:19:54 +0100, Carlos E.R. wrote:
I don't think anyone used the original flavor of the language.
They did. Or at least they tried to.
cross@spitfire.i.gajendra.net (Dan Cross) writes:
In article <84c7R.819121$PGrb.160843@fx10.iad>,
Scott Lurndal <slp53@pacbell.net> wrote:
cross@spitfire.i.gajendra.net (Dan Cross) writes:
In article <10jjc9s$3uhtk$1@dont-email.me>,
Chris Ahlstrom <OFeem1987@teleworm.us> wrote:
Waldek Hebisch wrote this post by blinking in Morse code:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C.
<interjection>
C++ (and some other languages)
have exceptions; C does not have them.
What about setjmp()/longjmp() ?
Not at all the same thing. `setjmp`/`longjmp` are about
non-local flows of control; exceptions are about non-local
passing of values.
However, in many real world situations, [sig]setjmp and
[sig]longjmp can be used to emulate exceptions.
Yes, I said just that. :-)
I have a C++ application that models a computer (Burroughs V380
et alia). The thread that models each processor (cpu) uses
longjmp whenever a condition is encountered that would have
been signaled as a fault on the real cpu. The processor code
doesn't do dynamic memory allocation; and the fault code is
stored in the processor class before the longjmp call.
I once tried replacing setjmp/longjmp with C++ exceptions which
led to a 20% reduction in simulated CPU performance (as measured
by the time to compile a COBOL program).
Huh. Interesting. I wonder why...possibly to run a bunch of
nop destructors?
A large component of the overhead was the code generated in every
function to handle unwinding during exception processing.
When
using setjmp/longjmp, I compiled with the following options so
it wouldn't generate the unwind code:
GXXFLAGS = -mno-red-zone
GXXFLAGS += -fno-strict-aliasing
GXXFLAGS += -fno-stack-protector
GXXFLAGS += -fno-exceptions
GXXFLAGS += -Wall
GXXFLAGS += -mtune=native
A friend once wrote an 8080 cross-assembler in COBOL.
It ran rings around Univac's official cross-assembler -
which was written in FORTRAN.
On Tue, 6 Jan 2026 07:42:36 -0700, Peter Flass wrote:
I think Brinch-Hansen used Modula-2.
Didn't he create his own language, called "Edison"?
On 1/6/26 01:28, rbowman wrote:
On Tue, 6 Jan 2026 04:10:59 -0000 (UTC), Lawrence D'Oliveiro wrote:
Some degree of UCSD Pascal compatibility was very common among
microcomputer-based implementations.
I remember that being referred to as 'scud pascal'. Dyslexic programmers?
  Heh, maybe :-)
  But you CAN see why.
  I think the idea was to make a 'generic' interpreted
  Pascal that could be run on many different kinds of
  machines. BASIC was widespread, but kinda ugly, and
  'C' was too cryptic.
  The modern UCSD 'Pascal' wound up being Python.
  I'd forgotten ... p-System was the "3rd OS" offered for
  the original IBM-PC. Alas it was over-priced and
  under-performing, so ....
On 1/6/26 07:16, Waldek Hebisch wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
    Hmm ... look at all the GNU 'compilers' -
    FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
    G++, even Algol-68. None are 'compilers'
    per-se, but to-'C' TRANSLATORS. So, 'C',
    pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C. C++ (and some other languages)
have exceptions; C does not have them. There are several
smaller things; for example, Ada or Pascal modulo is different
from C/Fortran modulo. During optimization passes gcc
keeps such information, to allow better optimization and
error reporting.
There were/are compilers that work by translating to C. But
this has limitations: generated code is typically worse because
language-specific information is lost in translation. Error
reporting is worse because a translator does not do as many
analyses as gcc does. For those reasons the compilers in gcc
generate a common representation which contains the sum of features
of all supported languages, not C.
  You give it a file in whatever lang, it produces
  a file in 'C' and compiles that. So, I'll basically
  stick with my 'translator' def. And if 'C' does not
  'natively support' something you can FAKE it with code,
  not really anything you CAN'T do with 'C'.
  By 'compiler' I mean "source in -> (agitating sounds) ->
  binary executable out."
  I think there are still a few FORTRAN compilers out
  there for Linux, maybe COBOL too. There's at least
  one forth IDE/compiler. Digital Mars makes 'C' and
  'D' compilers. GCC is not the alpha and omega
  of software development.
    But it CAN be much more friendly and/or
    tuned to a particular area of interest
    or preferred programming style.
On 2026-01-06 22:30, c186282 wrote:
On 1/6/26 11:30, John Ames wrote:
On Tue, 6 Jan 2026 13:19:54 +0100
"Carlos E.R." <robin_listas@es.invalid> wrote:
Turbo Pascal had [...]
Sure did! But TP didn't roll out 'til 1983, thirteen years into the
language's existence.
I don't think anyone used the original flavor of the language.
The ISO standard wasn't finalized 'til 1983, the same year as TP; even
UCSD Pascal didn't come around 'til 1977. But it was being used for
teaching well before that, and Kernighan's essay was published in '81,
so people were most definitely using (or trying to use) earlier forms
of the language for stuff.
    I used the M$/IBM multi-pass Pascal compiler (still
    have it in a VM) I *think* that came out maybe a
    year before TP.
    Remember seeing a little ad in a magazine for TP.
    The price was good, the claims seemed impressive.
    So, I bought it. NOT disappointed. Made development
    unbelievably quicker/easier. Had to wait until v3
    to get good graphics though. Even found a good use
    for the 'turtle'.
I remember trying both compilers. The M$ variant was unbelievably slow.
On 1/6/26 14:24, c186282 wrote:
On 1/6/26 07:16, Waldek Hebisch wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
    Hmm ... look at all the GNU 'compilers' -
    FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
    G++, even Algol-68. None are 'compilers'
    per-se, but to-'C' TRANSLATORS. So, 'C',
    pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C. C++ (and some other languages)
have exceptions; C does not have them. There are several
smaller things; for example, Ada or Pascal modulo is different
from C/Fortran modulo. During optimization passes gcc
keeps such information, to allow better optimization and
error reporting.
There were/are compilers that work by translating to C. But
this has limitations: generated code is typically worse because
language-specific information is lost in translation. Error
reporting is worse because a translator does not do as many
analyses as gcc does. For those reasons the compilers in gcc
generate a common representation which contains the sum of features
of all supported languages, not C.
  You give it a file in whatever lang, it produces
  a file in 'C' and compiles that. So, I'll basically
  stick with my 'translator' def. And if 'C' does not
  'natively support' something you can FAKE it with code,
  not really anything you CAN'T do with 'C'.
  By 'compiler' I mean "source in -> (agitating sounds) ->
  binary executable out."
  I think there are still a few FORTRAN compilers out
  there for Linux, maybe COBOL too. There's at least
  one forth IDE/compiler. Digital Mars makes 'C' and
  'D' compilers. GCC is not the alpha and omega
  of software development.
    But it CAN be much more friendly and/or
    tuned to a particular area of interest
    or preferred programming style.
Iron Spring PL/I compiles directly to binary. It can produce assembler
output, but only as a by-product of generating the object file. I have
occasionally thought of trying to make it another front-end for GCC. As
I understand it, GCC compiles to an intermediate language, not to C.
On 2026-01-06, Waldek Hebisch <antispam@fricas.org> wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
On 1/6/26 07:16, Waldek Hebisch wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C. C++ (and some other languages)
have exceptions; C does not have them. There are several
smaller things; for example, Ada or Pascal modulo is different
from C/Fortran modulo. During optimization passes gcc
keeps such information, to allow better optimization and
error reporting.
There were/are compilers that work by translating to C. But
this has limitations: generated code is typically worse because
language-specific information is lost in translation. Error
reporting is worse because a translator does not do as many
analyses as gcc does. For those reasons the compilers in gcc
generate a common representation which contains the sum of features
of all supported languages, not C.
You give it a file in whatever lang, it produces
a file in 'C' and compiles that.
No, if you look at what the compilers in gcc are doing you
will see that there is no intermediate C file. There
is intermediate assembler, but between source file and
assembler each compiler works independently.
Still, Bjarne Stroustrup's first implementation of C++
was a program called cfront, which translated C++ to C.
In article <fzf7R.805815$i%aa.272881@fx12.iad>,
Scott Lurndal <slp53@pacbell.net> wrote:
cross@spitfire.i.gajendra.net (Dan Cross) writes:
In article <84c7R.819121$PGrb.160843@fx10.iad>,
Scott Lurndal <slp53@pacbell.net> wrote:
cross@spitfire.i.gajendra.net (Dan Cross) writes:
In article <10jjc9s$3uhtk$1@dont-email.me>,
Chris Ahlstrom <OFeem1987@teleworm.us> wrote:
Waldek Hebisch wrote this post by blinking in Morse code:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different
from C. Ada and GNU Pascal have parametrized types; there
is nothing like that in C.
<interjection>
C++ (and some other languages)
have exceptions; C does not have them.
What about setjmp()/longjmp() ?
Not at all the same thing. `setjmp`/`longjmp` are about
non-local flows of control; exceptions are about non-local
passing of values.
However, in many real world situations, [sig]setjmp and
[sig]longjmp can be used to emulate exceptions.
Yes, I said just that. :-)
I have a C++ application that models a computer (Burroughs V380
et alia). The thread that models each processor (cpu) uses
longjmp whenever a condition is encountered that would have
been signaled as a fault on the real cpu. The processor code
doesn't do dynamic memory allocation; and the fault code is
stored in the processor class before the longjmp call.
I once tried replacing setjmp/longjmp with C++ exceptions which
led to a 20% reduction in simulated CPU performance (as measured
by the time to compile a COBOL program).
Huh. Interesting. I wonder why...possibly to run a bunch of
nop destructors?
A large component of the overhead was the code generated in every
function to handle unwinding during exception processing.
That makes sense; thanks.
When
using setjmp/longjmp, I compiled with the following options so
it wouldn't generate the unwind code:
GXXFLAGS = -mno-red-zone
GXXFLAGS += -fno-strict-aliasing
GXXFLAGS += -fno-stack-protector
GXXFLAGS += -fno-exceptions
GXXFLAGS += -Wall
GXXFLAGS += -mtune=native
Most of those seem irrelevant to generating extra code for stack
unwinding.
Sorry, but THIS is how I see it all going, soon.
The whole research/commercial/regulatory universe is 101% for AI and
nothing BUT the AI.
I wouldn't be surprised if non-AI-Slave PCs are either deliberately
sabotaged or made illegal. This is Giant Money, Giant Power.
https://arstechnica.com/gadgets/2026/01/dells-xps-revival-is-a-welcome-reprieve-from-the-ai-pc-fad/
Does Dell see a little gnome with a pin approaching the bubble?
Seems plausible - may also have to do with phone-tree systems and
how intelligible "hash" is or isn't over a muffled line, vs. a word
that begins and ends with hard consonants.
I hadn't thought of that angle. Indeed, aeronautical radio
phraseology has evolved to deal with just that sort of problem.
On Wed, 07 Jan 2026 06:33:45 GMT
Charlie Gibbs <cgibbs@kltpzyxm.invalid> wrote:
Seems plausible - may also have to do with phone-tree systems and
how intelligible "hash" is or isn't over a muffled line, vs. a word
that begins and ends with hard consonants.
I hadn't thought of that angle. Indeed, aeronautical radio
phraseology has evolved to deal with just that sort of problem.
Many's the time I've had to resort to the NATO phonetic alphabet when
trying to get a customer to type something in over the phone.
On 7 Jan 2026 02:00:12 GMT
rbowman <bowman@montana.com> wrote:
https://arstechnica.com/gadgets/2026/01/dells-xps-revival-is-a-welcome-reprieve-from-the-ai-pc-fad/
Does Dell see a little gnome with a pin approaching the bubble?
Shockingly, it turns out that businesses do better when they make and
sell things that people actually *want* o_O
I think that AI could be used on a proper computer system to do all
the little annoying things that people as ignorant as me have to ask
experts about: backups, defragmenting routines, checking for updates,
changing ownership on disks and volumes, and applying patches. But my
model of AI would be running only on one's computer and be active
when the processor(s) have enough free cycles to be useful.
These bizarre definitional assertions about what makes something
a "compiler" or not seem to be mostly put forth by people who
have never heard of the concept of "separate compilation" or
"libraries", let alone touched the innards of a compiler. In
particular, this idea that everything must be implemented in a
single program or it's not a "true" compiler is rooted firmly in
ignorance.
On the other hand, a compiler that uses another compiled language
as intermediate code is a strange beast, probably better called a translator.
Leave it to M$ (and IBM) to screw it up. Pascal was specifically
designed for fast one-pass compilation.
On 06/01/2026 18:57, Charlie Gibbs wrote:
remember the "baud" vs "bps" confusion.
IIRC they are not, strictly, the same thing...
... I appreciate C# as a language, but I think quite a lot of
software would be better off (read: could be constructed faster) if
not written in C# / .NET but in Java and its ecosystem.
On 07/01/2026 14:47, Peter Flass wrote:
Leave it to M$ (and IBM) to screw it up. Pascal was specifically
designed for fast one-pass compilation.
No, it wasn't. It was designed as a teaching language. Borland hacked
it about and made it a hacker paradise with as quick 'write/run' times
as BASIC.
But trying to parse free-form text and do macro expansions with
string substitutions ... disaster in COBOL. Hard enough in FORTRAN.
Clearly the GCC collection is More Complicated than I thought.
But I'm still not sure I'll call them 'compilers'
in the older sense of the word. Some intermediate term is required.
Charlie Gibbs <cgibbs@kltpzyxm.invalid> writes:
On 2026-01-06, Waldek Hebisch <antispam@fricas.org> wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
On 1/6/26 07:16, Waldek Hebisch wrote:
In alt.folklore.computers c186282 <c186282@nnada.net> wrote:
<snip>
Hmm ... look at all the GNU 'compilers' -
FORTRAN, COBOL, Ada, 'D', M2, Rust,C++,
G++, even Algol-68. None are 'compilers'
per-se, but to-'C' TRANSLATORS. So, 'C',
pretty much All Are One And One Is All.
No. A compiler as its first stage translates the given language to a
common representation. This representation is different from C. Ada and
GNU Pascal have parametrized types; there is nothing like that in C.
C++ (and some other languages) have exceptions; C does not have them.
There are several smaller things; for example, Ada or Pascal modulo is
different from C/Fortran modulo. During optimization passes gcc keeps
such information, to allow better optimization and error reporting.
There were/are compilers that work by translating to C. But this has
limitations: generated code is typically worse because language-specific
information is lost in translation. Error reporting is worse because a
translator does not do as many analyses as gcc does. For those reasons
the compilers in gcc generate a common representation which contains
the sum of features of all supported languages, not C.
You give it a file in whatever lang, it produces a file in 'C' and
compiles that.
No, if you look at what the compilers in gcc are doing you will see that
there is no intermediate C file. There is intermediate assembler, but
between source file and assembler each compiler works independently.
Still, Bjarne Stroustrup's first implementation of C++ was a program
called cfront, which translated C++ to C.
Rather ugly C, at that. I had to fix a bug in PCC[*] caused by the excessive use of the comma operator in the cfront generated C code.
On 1/7/26 08:13, Dan Cross wrote:
These bizarre definitional assertions about what makes something a
"compiler" or not seem to be mostly put forth by people who have never
heard of the concept of "separate compilation" or "libraries", let
alone touched the innards of a compiler. In particular, this idea that
everything must be implemented in a single program or it's not a "true"
compiler is rooted firmly in ignorance.
I think compilers have generated intermediate code since the first
FORTRAN compiler. The only distinction is one vs. multiple programs. With
a variety of both front- and back-ends GCC has good reason to separate
them. On the other hand, a compiler that uses another compiled language
as intermediate code is a strange beast, probably better called a
translator.
On 2026-01-06 17:30, John Ames wrote:
On Tue, 6 Jan 2026 13:19:54 +0100 "Carlos E.R."
<robin_listas@es.invalid> wrote:
Turbo Pascal had [...]
Sure did! But TP didn't roll out 'til 1983, thirteen years into the
language's existence.
I don't think anyone used the original flavor of the language.
The ISO standard wasn't finalized 'til 1983, the same year as TP; even
UCSD Pascal didn't come around 'til 1977. But it was being used for
teaching well before that, and Kernighan's essay was published in '81,
so people were most definitely using (or trying to use) earlier forms
of the language for stuff.
Ah. I did not meet it till about the time of TP 2.
On 06/01/2026 21:06, c186282 wrote:
On 1/6/26 05:10, The Natural Philosopher wrote:
On 06/01/2026 03:27, Peter Flass wrote:
On 1/5/26 12:50, John Ames wrote:
On Mon, 5 Jan 2026 12:33:53 -0700 Peter Flass
<Peter@Iron-Spring.com> wrote:
Actually, many systems programming languages have no I/O, the idea
being that non-OS programs call the OS to do the I/O, and the OS
interacts directly with the hardware.
"Systems programming" usually implies implementation of an OS,
though,
and IIRC that was the sense that Kernighan was using. You can't
excuse limitations by "oh, the OS handles that" when your program
*is* the OS.*
* (Obviously, there's a certain point in any HLL where Deep Magic has
  to handle interfacing between language constructs and bare metal,
  but the higher up the "threshold of minimum abstraction" is, the
  less suitable it is for systems programming in the first place.
  Of course, there's also the problem where seemingly *any* language
  that's not designed for systems programming will ultimately get
  pressed into service for systems programming *somewhere...*)
I seem to recall reading that someone once wrote an OS in COBOL.
From what little I know COBOL looked very like assembler.
If assembler was RIDICULOUSLY WORDY :-)
Some assembler is... it's a choice. Especially Macro assembler...
On 1/7/26 08:57, John Ames wrote:
On 7 Jan 2026 02:00:12 GMT rbowman <bowman@montana.com> wrote:
https://arstechnica.com/gadgets/2026/01/dells-xps-revival-is-a-welcome-reprieve-from-the-ai-pc-fad/
Does Dell see a little gnome with a pin approaching the bubble?
Shockingly, it turns out that businesses do better when they make and
sell things that people actually *want* o_O
Amazing, but still, I bought several (used) Latitudes and will miss
being able to shop for those if I ever have enough cash for that sort
of thing. I am glad I got my Precision when I did. I think that AI
could be used on a proper computer system to do all the little annoying
things that people as ignorant as me have to ask experts about:
backups, defragmenting routines, checking for updates, changing
ownership on disks and volumes, and applying patches. But my model of
AI would be running only on one's computer and be active when the
processor(s) have enough free cycles to be useful.
Now when they get that sort of tool, if I am alive and in funds, then
an AI computer might be halfway interesting.
After all, I have spent nearly 88 years developing my own intelligence
and it seems to work very well for my purposes. (Some may disagree!)
On 2026-01-06 19:57, Charlie Gibbs wrote:
On 2026-01-06, Lars Poulsen <lars@beagle-ears.com> wrote:
On 2026-01-06, Carlos E.R. <robin_listas@es.invalid> wrote:
My C teacher said it was a mistake to use C as an all purpose
language, like for userland applications. Using C is the cause of
many bugs that a proper language would catch.
That was around 1991.
He knew. He participated in some study tasked by the Canadian
government to study C compilers, but he could not talk about what
they wrote.
What language(s) did he suggest instead?
I don't remember if he did. Maybe he gave examples, but I think he
mostly told us of quirks of the language - things that were errors,
but that the compiler did not signal - so that, being aware, we would
write correct C code.
It is possible that current C compilers signal many more problems than
back then, but not runtime errors.
On 2026-01-07, c186282 <c186282@nnada.net> wrote:
On 1/6/26 17:22, John Ames wrote:
On Tue, 6 Jan 2026 16:04:17 -0500
c186282 <c186282@nnada.net> wrote:
I'd forgotten ... p-System was the "3rd OS" offered for the original
IBM-PC. Alas it was over-priced and under-performing, so ....
Yeah - a forgotten entry in the saga of write-once-run-anywhere dreams,
right up there with Java workstations...
Well, I'm glad people THINK of such things ... alas
all attempts have been for naught. 'Generic solutions'
require too many compromises.
The TRUE 'All-Everything System' will be the AIs.
This may NOT be such a great thing, but with the
TRILLIONS invested it's GOING to be The Thing.
'Thin' clients plugged only into the Higher
Intelligence.
It's ironic watching the industry change from centralized
systems in the '60s and '70s (due to the high cost of
electronics) to distributed systems starting in the '80s,
only to have it come full circle now. The difference is
that rather than cost, the driving factor is centralized
control.
Unaccountable People You Don't Know will be in charge
of tasking and biasing the Higher Intelligence for
awhile - then it'll start taking care of itself.
Wait, watch, see.
Fasten your seatbelts, folks.
On Tue, 6 Jan 2026 21:00:25 -0500, c186282 wrote:
Clearly the GCC collection is More Complicated than I thought.
Back when dinosaurs roamed the earth it was the GNU C Compiler. Then it learned new tricks.
https://en.wikipedia.org/wiki/Register_transfer_language
But I'm still not sure I'll call them 'compilers'
in the older sense of the word. Some intermediate term is required.
That would be RTL. Microsoft's CIL is similar but depends on a runtime.
Clang/LLVM is another approach which overlaps GCC. fwiw I have both on
this box.
IRs have been used for a long, long time.
https://dl.acm.org/doi/epdf/10.1145/2480741.2480743
Some light reading:
https://archive.org/details/principlesofcomp0000ahoa/mode/2up
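(A concrete way to see those IRs for yourself, as a sketch; the trivial
source file is invented here, but the flags are standard:

    /* ir.c -- something small to compile */
    int square(int x) { return x * x; }

"gcc -O2 -c -fdump-tree-all -fdump-rtl-all ir.c" leaves a pile of ir.c.*
dump files tracing the GIMPLE passes down into RTL, and
"clang -O2 -S -emit-llvm ir.c -o ir.ll" writes out the LLVM IR that sits
at the same architectural spot.)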
On Wed, 7 Jan 2026 14:11:10 -0000 (UTC), Lars Poulsen wrote:
But trying to parse free-form text and do macro expansions with
string substitutions ... disaster in COBOL. Hard enough in FORTRAN.
Hard to see the point in an assembler without such features, though.
Colleges don't always make great choices and do their students a
disservice. At one time University of Montana used Modula-2, another Wirth production. Later they chose Java after being offered financial incentives
by Sun. (I think it was before Oracle). Arguably a better choice although
it didn't do much when we were looking for C/C++ programmers.
I did a COBOL program to do string substitutions. The idea was that it
read a COBOL program, written possibly by a blind hacker, and replaced
variable names with longer, standardized ones.
On Wed, 7 Jan 2026 14:11:10 -0000 (UTC), Lars Poulsen wrote:
But trying to parse free-form text and do macro expansions with string
substitutions ... disaster in COBOL. Hard enough in FORTRAN.
Hard to see the point in an assembler without such features, though.
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
Leave it to M$ (and IBM) to screw it up. Pascal was specifically
designed for fast one-pass compilation.
On Wed, 7 Jan 2026 09:48:29 +0000, The Natural Philosopher wrote:
On 06/01/2026 18:57, Charlie Gibbs wrote:
remember the "baud" vs "bps" confusion.
IIRC the are not , strictly, the same thing...
They usually were back in the 1200 baud days. Then things got
complicated.
On Tue, 6 Jan 2026 22:37:40 -0500
c186282 <c186282@nnada.net> wrote:
Sorry, but THIS is how I see it all going, soon.
The whole research/commercial/regulatory universe is 101% for AI and
nothing BUT the AI.
I wouldn't be surprised if non-AI-Slave PCs are either deliberately
sabotaged or made illegal. This is Giant Money, Giant Power.
Doesn't matter how much money they throw at it - what they're selling
will never do half of what they're claiming, and they're singularly un- interested in researching anything else. The VC firehose is already
starting to dribble; it's taken *entirely* too long, but investors have finally begun to look at the "burn infinite money on things that don't
work -> ??? -> profit...?" plan
and go "wait, maybe we *don't* want to
do that?" Ed Zitron's been writing about this for a couple years now,
and just covered that recently:
https://www.wheresyoured.at/the-enshittifinancial-crisis/#blue-owl-in-a-coal-mine
It's been infuriating but also hilarious to watch this much money flail blindly for this long at things the people backing it plainly have no understanding of, simply because a handful of grifters/con-men suckered
them in with the promise of "you'll *totally* be able to fire everyone
and replace them with chatbots Real Soon Now."
It's gonna be a global financial disaster when the bubble finally goes, mind you, but there is
a certain black comedy to it.
On Wed, 07 Jan 2026 06:33:50 GMT, Charlie Gibbs wrote:
It's ironic watching the industry change from centralized systems in the
'60s and '70s (due to the high cost of electronics) to distributed
systems starting in the '80s,
only to have it come full circle now. The difference is that rather
than cost, the driving factor is centralized control.
The game has changed a bit as anyone who suffered through a time-sharing
system will affirm. Nothing like trying to run a cross assembler on a VAX
when accounting is doing the payroll.
On 7 Jan 2026 02:00:12 GMT
rbowman <bowman@montana.com> wrote:
https://arstechnica.com/gadgets/2026/01/dells-xps-revival-is-a-welcome-
reprieve-from-the-ai-pc-fad/
Does Dell see a little gnome with a pin approaching the bubble?
Shockingly, it turns out that businesses do better when they make and
sell things that people actually *want* o_O
On 06/01/2026 21:06, c186282 wrote:
On 1/6/26 05:10, The Natural Philosopher wrote:
On 06/01/2026 03:27, Peter Flass wrote:
I seem to recall reading that someone once wrote an OS in COBOL.
From what little I know COBOL looked very like assembler.
If assembler was RIDICULOUSLY WORDY :-)
Some assembler is...it's a choice. Especially Macro assembler...
On 2026-01-07, Peter Flass <Peter@Iron-Spring.com> wrote:
Leave it to M$ (and IBM) to screw it up. Pascal was specifically
designed for fast one-pass compilation.
Is that why people wrote programs bottom-up
(i.e. with the main function at the bottom
to avoid forward references)?
On 2026-01-07, The Natural Philosopher <tnp@invalid.invalid> wrote:
On 06/01/2026 21:06, c186282 wrote:
On 1/6/26 05:10, The Natural Philosopher wrote:
On 06/01/2026 03:27, Peter Flass wrote:
I seem to recall reading that someone once wrote an OS in COBOL.
From what little I know COBOL looked very like assembler.
If assembler was RIDICULOUSLY WORDY :-)
Some assembler is...it's a choice. Especially Macro assembler...
I remember CS weenies fawning over a language called pl360, the
misbegotten bastard child of Algol and 360 assembly language. :-p
I hated Wirthian languages from the start and still do.
Just bad chemistry, I guess. But I couldn't stand having
some snooty compiler slap my wrist and tell me that I
couldn't do what I could do in a couple of lines of
assembly language.
Our CS department had Algol 60, Algol 68, and Algol W.
I never did succeed in getting a program to run.
On 2026-01-07, John Ames <commodorejohn@gmail.com> wrote:
On 7 Jan 2026 02:00:12 GMT
rbowman <bowman@montana.com> wrote:
https://arstechnica.com/gadgets/2026/01/dells-xps-revival-is-a-welcome-
reprieve-from-the-ai-pc-fad/
Does Dell see a little gnome with a pin approaching the bubble?
Shockingly, it turns out that businesses do better when they make and
sell things that people actually *want* o_O
The smart ones try to control what people want.
On Wed, 7 Jan 2026 13:30:14 +0100, Carlos E.R. wrote:
On 2026-01-06 19:57, Charlie Gibbs wrote:
On 2026-01-06, Lars Poulsen <lars@beagle-ears.com> wrote:
On 2026-01-06, Carlos E.R. <robin_listas@es.invalid> wrote:
My C teacher said it was a mistake to use C as an all purpose
language, like for userland applications. Using C is the cause of
many bugs that a proper language would catch.
That was around 1991.
He knew. He participated in some study tasked by the Canadian
government to study C compilers, but he could not talk about what
they wrote.
What language(s) did he suggest instead?
I don't remember if he did. Maybe he gave examples, but I think he mostly
told us of quirks of the language, things that were errors but that the
compiler did not signal, so that, being aware of them, we would write
correct C code.
It is possible that current C compilers signal many more problems than
back then, but not runtime errors.
gcc has become pickier. That isn't always a welcome thing when working
with legacy code and requires a search of the compiler options to get it
to shut up about such horrible heresies as assuming a function returns an int.
On 1/7/26 21:26, Lawrence D'Oliveiro wrote:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
Yeah, COBOL is kind of frozen in time now.
However that might not be a BAD thing ...
On Wed, 7 Jan 2026 09:56:20 +0000, The Natural Philosopher wrote:
On 06/01/2026 21:06, c186282 wrote:
If assembler was RIDICULOUSLY WORDY :-)
Some assembler is...it's a choice. Especially Macro assembler...
I remember a strange attempt to do Win32 API programming in 'assembler'.
The author more or less reinvented C using MASM.
John Ames wrote this post by blinking in Morse code:
On Wed, 07 Jan 2026 06:33:45 GMT
Charlie Gibbs <cgibbs@kltpzyxm.invalid> wrote:
Seems plausible - may also have to do with phone-tree systems and
how intelligible "hash" is or isn't over a muffled line, vs. a word
that begins and ends with hard consonants.
I hadn't thought of that angle. Indeed, aeronautical radio
phraseology has evolved to deal with just that sort of problem.
Many's the time I've had to resort to the NATO phonetic alphabet when
trying to get a customer to type something in over the phone.
Like "It all went tango uniform"? A real "charlie foxtrot"?
On Wed, 7 Jan 2026 19:16:03 -0700, Peter Flass wrote:
I did a COBOL program to do string substitutions. The idea was that it
read a COBOL program written possibly by a blind hacker and substituted
variable names with longer, standardized ones.
Did it understand the rules of IN-scoping?
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
On 2026-01-07, Peter Flass <Peter@Iron-Spring.com> wrote:
Leave it to M$ (and IBM) to screw it up. Pascal was specifically
designed for fast one-pass compilation.
Is that why people wrote programs bottom-up
(i.e. with the main function at the bottom
to avoid forward references)?
C had forward declarations from early on, but they were somewhat janky.
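(A minimal sketch of the C version of that trick; the names are invented:

    #include <stdio.h>

    static int twice(int x);        /* forward declaration (prototype) */

    int main(void)                  /* main can now sit at the top */
    {
        printf("%d\n", twice(21));
        return 0;
    }

    static int twice(int x)         /* definition lives at the bottom */
    {
        return x * 2;
    }

One pass is still enough for the compiler, because the prototype tells it
everything it needs before the call site; Pascal's "forward" directive
plays the same role.)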
Huge amounts of perfectly useable technology are 'frozen in time'
My coffee beaker is no different in principle from a bronze age
beaker.
Round wheels predate the Ark...
On Thu, 8 Jan 2026 07:01:19 -0000 (UTC)
Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
The smart ones try to control what people want.
The smart ones know that's impossible. The best they can do is entice
the punters with attractive alternatives, and leave them to make the
choice.
Oh, it's possible-ish, for a time, in the right social context; advert-
ising is essentially weaponized mass psychology at this point, and they
got *very* good at it for a while there.
The interesting thing is that,
as so much of the corporate space is dominated by absolute morons with
no connection to the line of business these days, a lot of the major
players are being stupid enough that even the ad people can't sell it
to the masses.
Like, that Dell laptop - on top of "AI-ready" being not
a thing *anyone* needs (to the extent that it's even a *thing* at all
and not just marketing woo-woo,) the other features mentioned are a transparent attempt to ape that one Macbook that everyone in the world
hated, the one that was probably the reason Apple finally gave Jony Ive
the boot. Whose bright idea was *that!?*
I think people today are so world-weary of marketing that the default
assumption is that pretty much everything they see or hear through media
owned or funded by rich people is a carefully constructed lie.
On 07/01/2026 22:49, rbowman wrote:
On Wed, 7 Jan 2026 13:30:14 +0100, Carlos E.R. wrote:
It is possible that current C compilers signal many more problems than
back then, but not runtime errors.
gcc has become pickier. That isn't always a welcome thing when working
with legacy code and requires a search of the compiler options to get it
to shut up about such horrible heresies as assuming a function returns an
int.
Actually I welcome that. At least 10% of the time the compiler finds a
bug that way, and the other 90% I upgrade the source to be more explicit...
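(The sort of legacy-ism in question, as a sketch; the file and names are
invented:

    /* old.c -- K&R-era habits that newer gcc objects to */
    static count;               /* implicit int: -Wimplicit-int fires */

    main()                      /* implicit int return type, same warning */
    {
        count = atoi("42");     /* no <stdlib.h>: implicit declaration */
        return count;
    }

"gcc -Wall old.c" flags all three, and recent gcc releases treat the
implicit function declaration and implicit int as hard errors by default.
Spelling out the types and adding the #include is the "upgrade the source"
route; options like -Wno-implicit-int are the "shut it up" route.)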
On 08/01/2026 04:57, Charlie Gibbs wrote:
On 2026-01-07, John Ames <commodorejohn@gmail.com> wrote:
On 7 Jan 2026 02:00:12 GMT
rbowman <bowman@montana.com> wrote:
https://arstechnica.com/gadgets/2026/01/dells-xps-revival-is-a-welcome-reprieve-from-the-ai-pc-fad/
Does Dell see a little gnome with a pin approaching the bubble?
Shockingly, it turns out that businesses do better when they make and
sell things that people actually *want* o_O
The smart ones try to control what people want.
Try being the operative word.
Remember, if you relieve people of all their net disposable income, your
customer base disappears.
On 1/7/26 15:03, rbowman wrote:
Colleges don't always make great choices and do their students a
disservice. At one time University of Montana used Modula-2, another
Wirth production. Later they chose Java after being offered financial
incentives by Sun. (I think it was before Oracle). Arguably a better
choice although it didn't do much when we were looking for C/C++
programmers.
The program language landscape changes so rapidly that whatever language
you learn today will probably be niche in a few years. FORTRAN and COBOL
are still around, but I don't think anyone from the 70s would recognize
them. I was there and I used both at the time.
On 2026-01-07, The Natural Philosopher <tnp@invalid.invalid> wrote:
On 06/01/2026 21:06, c186282 wrote:
On 1/6/26 05:10, The Natural Philosopher wrote:
On 06/01/2026 03:27, Peter Flass wrote:
I seem to recall reading that someone once wrote an OS in COBOL.
From what little I know COBOL looked very like assembler.
If assembler was RIDICULOUSLY WORDY
Some assembler is...it's a choice. Especially Macro assembler...
I remember CS weenies fawning over a language called pl360, the
misbegotten bastard child of Algol and 360 assembly language. :-p
On 1/7/26 19:24, Lawrence D'Oliveiro wrote:
On Wed, 7 Jan 2026 19:16:03 -0700, Peter Flass wrote:
I did a COBOL program to do string substitutions. The idea was that it
read a COBOL program written possibly by a blind hacker and substituted
variable names with longer, standardized ones.
Did it understand the rules of IN-scoping?
COBOL had no concept of "scope" back then.
On Thu, 08 Jan 2026 04:57:26 GMT, Charlie Gibbs wrote:
I hated Wirthian languages from the start and still do. Just bad
chemistry, I guess. But I couldn't stand having some snooty compiler
slap my wrist and tell me that I couldn't do what I could do in a couple
of lines of assembly language.
Ever run into PL/M?
On Thu, 08 Jan 2026 04:57:23 GMT, Charlie Gibbs wrote:
On 2026-01-07, Peter Flass <Peter@Iron-Spring.com> wrote:
Leave it to M$ (and IBM) to screw it up. Pascal was specifically
designed for fast one-pass compilation.
Is that why people wrote programs bottom-up (i.e. with the main
function at the bottom to avoid forward references)?
C is also like that. And C++, for all its enormous complexity in other
areas, preserves the tradition.
On 2026-01-08, The Natural Philosopher <tnp@invalid.invalid> wrote:
On 07/01/2026 22:49, rbowman wrote:
On Wed, 7 Jan 2026 13:30:14 +0100, Carlos E.R. wrote:
It is possible that current C compilers signal many more problems
than back then, but not runtime errors.
gcc has become pickier. That isn't always a welcome thing when working
with legacy code and requires a search of the compiler options to get
it to shut up about such horrible heresies as assuming a function
returns an int.
Actually I welcome that. At least 10% of the time the compiler finds a
bug that way, and the other 90% I upgrade the source to be more
explicit...
+1
I re-worked my code over time so that -Wall yields no errors.
And then a new version of gcc comes out which picks even more nits, and
the process repeats. Not being a quick-and-dirty type, I consider it a
win overall.
The one exception is its scrutiny of printf() calls.
That was a step too far, so I added -Wno-format-overflow.
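(What that scrutiny looks like, as a sketch; the buffer size is invented:

    #include <stdio.h>

    int main(void)
    {
        char buf[4];
        int n = 123456;
        sprintf(buf, "%d", n);  /* gcc: "%d" may write up to 11 digits
                                   plus NUL into a 4-byte destination */
        return 0;
    }

"gcc -Wall -O2" warns that the formatted output can overflow buf, and
-Wno-format-overflow suppresses exactly that diagnostic.)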
On Thu, 8 Jan 2026 02:26:25 -0000 (UTC), Lawrence D'Oliveiro wrote:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from the
70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
Yeah, you don't need the continuation punch in column 6 :) I should take a
look and see if that much has really changed.
It was never a good idea but the legacy code often defined a variable in
a .h file. The newer gcc implementations would throw multiple definition errors. Fixing it would have been painful. foo.h that defined int bar;
might be included in several different programs so you would have to hunt down all the uses and then define bar someplace in a .c file.
Great project for the new guy but at the time the newest guy had been
there for 20 years. Adding the compiler flag to the relevant makefiles was easier.
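(A minimal reproduction, assuming gcc 10 or later, which switched the
default from -fcommon to -fno-common; the file names are invented:

    /* bar.h -- legacy style: a definition, not a declaration */
    int bar;

    /* a.c */
    #include "bar.h"
    int main(void) { return bar; }

    /* b.c */
    #include "bar.h"

"gcc a.c b.c" now fails at link time with "multiple definition of 'bar'",
while "gcc -fcommon a.c b.c" restores the old merging behaviour, which is
the makefile-flag fix described above. The durable fix is "extern int bar;"
in the header and a single "int bar;" in exactly one .c file.)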
On Thu, 08 Jan 2026 04:57:26 GMT, Charlie Gibbs wrote:
On 2026-01-07, The Natural Philosopher <tnp@invalid.invalid> wrote:
On 06/01/2026 21:06, c186282 wrote:
On 1/6/26 05:10, The Natural Philosopher wrote:
On 06/01/2026 03:27, Peter Flass wrote:
I seem to recall reading that someone once wrote an OS in COBOL.
From what little I know COBOL looked very like assembler.
If assembler was RIDICULOUSLY WORDY
Some assembler is...it's a choice. Especially Macro assembler...
I remember CS weenies fawning over a language called pl360, the
misbegotten bastard child of Algol and 360 assembly language. :-p
I don't remember that one but I do recall when PL/I was going to be the
one language to rule them all.
On Thu, 8 Jan 2026 07:00:14 -0000 (UTC), Lawrence D'Oliveiro wrote:
On Thu, 08 Jan 2026 04:57:23 GMT, Charlie Gibbs wrote:
On 2026-01-07, Peter Flass <Peter@Iron-Spring.com> wrote:
Leave it to M$ (and IBM) to screw it up. Pascal was specifically
designed for fast one-pass compilation.
Is that why people wrote programs bottom-up (i.e. with the main
function at the bottom to avoid forward references)?
C is also like that. And C++, for all its enormous complexity in other
areas, preserves the tradition.
I usually put main() at the top of the file, preceded by the
declarations.
It's hard to brag about top-down development when you write your
program bottom-up ...
In multi-module programs I define my globals in a .h file as follows:
common.h
--------
#ifdef PRIMARY
#define GLOBAL
#else
#define GLOBAL extern
#endif
foo.h
-----
#include "common.h"
GLOBAL int foo;
void setfoo(void);
foo1.c
------
#include <stdio.h>
#include <stdlib.h>
#define PRIMARY
#include "foo.h"
int main(int argc, char **argv)
{
    setfoo();
    printf("foo is %d\n", foo);
    exit(0);
}
foo2.c
------
#include "foo.h"
void setfoo(void)
{
    foo = 5;
}
It works for me; I like having only one declaration of "foo"
in my source modules.
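(A quick check of the pattern, assuming any hosted C compiler:
"cc foo1.c foo2.c -o foo && ./foo" prints "foo is 5". Only foo1.c defines
PRIMARY, so foo is defined exactly once and every other module sees only
an extern declaration of it.)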
On Thu, 08 Jan 2026 04:57:27 GMT, Charlie Gibbs wrote:
The smart ones try to control what people want.
The smart ones know that's impossible. The best they can do is entice
the punters with attractive alternatives, and leave them to make the
choice.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
Modern COBOL is very different than COBOL-68 (or even COBOL-84).
It even has pointers.
On Thu, 8 Jan 2026 11:20:01 +0000
The Natural Philosopher <tnp@invalid.invalid> wrote:
Huge amounts of perfectly useable technology are 'frozen in time'
My coffee beaker is no different in principle from a bronze age
beaker.
Round wheels predate the Ark...
But if existing solutions are basically fine, how are vendors supposed
to sell new ones, I ask you?
On 2026-01-08, The Natural Philosopher <tnp@invalid.invalid> wrote:
On 08/01/2026 04:57, Charlie Gibbs wrote:
On 2026-01-07, John Ames <commodorejohn@gmail.com> wrote:Try being the operative word.
On 7 Jan 2026 02:00:12 GMT
rbowman <bowman@montana.com> wrote:
https://arstechnica.com/gadgets/2026/01/dells-xps-revival-is-a-welcome- >>>>> reprieve-from-the-ai-pc-fad/
Does Dell see a little gnome with a pin approaching the bubble?
Shockingly, it turns out that businesses do better when they make and
sell things that people actually *want* o_O
The smart ones try to control what people want.
Remember, if you relieve people of all their net disposable income, your
customer base disappears.
This is why a good parasite won't bleed its host completely white.
The exception to this is if there's such an abundance of potential
hosts that you can afford to use them up and throw them away.
This is why governments and large corporations are so much in
favour of population growth.
C++'s "//" construct is a lot easier to code.
Hmm ... remember the "spinning rims" fetish about a decade ago ?
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
On 1/7/26 15:03, rbowman wrote:
Colleges don't always make great choices and do their students a
disservice. At one time University of Montana used Modula-2, another
Wirth production. Later they chose Java after being offered financial
incentives by Sun. (I think it was before Oracle). Arguably a better
choice although it didn't do much when we were looking for C/C++
programmers.
The program language landscape changes so rapidly that whatever language
you learn today will probably be niche in a few years. FORTRAN and COBOL
are still around, but I don't think anyone from the 70s would recognize
them. I was there and I used both at the time.
I'm comfortable up to Fortran 77 but would have to learn the current
version. However, I've used C for about 45 years and it still looks like
C.
https://www.tiobe.com/tiobe-index/
Going from Python2 to Python3 required some updating but it wasn't a relearning process. I've got a first edition little book, Lutz's 'Python Pocket Reference', from 1998. It would require very few edits to bring it
up to date.
I haven't kept up with C++ but my use has always been a subset of the full language.
Sure, some languages never caught on. Go is on the list but the change was the wrong way. Ada hangs on, mostly for government projects but follows Scratch. Ruby didn't scale and is a footnote. Pike was always niche. The
list goes on.
If I had a kid in college I would hope for Python as the didactic
language. C would be good but academics don't seem to like it. Not enough arcane points to fill a semester? C++, maybe. Java, I suppose, although
I've seen the aftermath when people trained in Java try to use languages
with less hand holding and try to unravel ***foo.
rbowman <bowman@montana.com> writes:
On Thu, 08 Jan 2026 04:57:26 GMT, Charlie Gibbs wrote:
I hated Wirthian languages from the start and still do. Just bad
chemistry, I guess. But I couldn't stand having some snooty compiler
slap my wrist and tell me that I couldn't do what I could do in a
couple of lines of assembly language.
Ever run into PL/M?
I have a listing of the PL/M 8080 cross-compiler somewhere in storage.
On 1/8/26 09:43, Scott Lurndal wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
Modern COBOL is very different than COBOL-68 (or even COBOL-84).
It even has pointers.
Then is it even still "COBOL" ? "NuBOL" instead ?
How long before Japan and Korea (and a lot of others soon after)
cease to be whatever they were ? They keep importing young foreign
labor, more and more and more, which means whatever the
culture/history was keeps evaporating.
Soon 'Japan' will just be a geographic name, not anything to do with
an ancient culture, not anything with a history.
On Thu, 8 Jan 2026 20:23:33 -0500
c186282 <c186282@nnada.net> wrote:
How long before Japan and Korea (and a lot of others soon after)
cease to be whatever they were ? They keep importing young foreign
labor, more and more and more, which means whatever the
culture/history was keeps evaporating.
Soon 'Japan' will just be a geographic name, not anything to do with
an ancient culture, not anything with a history.
Spoiler alert, that's *all of history* - we're just more aware of it
now. Try reading medieval literature sometime, and count the number of references to tribes and states that are just names on a map or foot-
notes in the distant history of some present-day ethnic group.
On Thu, 8 Jan 2026 20:23:33 -0500, c186282 wrote:
How long before Japan and Korea (and a lot of others soon after)
cease to be whatever they were ? They keep importing young foreign
labor,
more and more and more, which means whatever the culture/history was
keeps evaporating.
I'm wondering how that will go over. A third generation Korean in Japan is still that damn Korean.
Other regions, even in 'blender' areas, still DO have a certain
'national character' and 'common history'. Turkey is NOT like Germany
is NOT like England is NOT like France.
On 1/8/26 17:48, rbowman wrote:
On Thu, 8 Jan 2026 20:23:33 -0500, c186282 wrote:
How long before Japan and Korea (and a lot of others soon after)
cease to be whatever they were ? They keep importing young foreign labor,
more and more and more, which means whatever the culture/history was
keeps evaporating.
I'm wondering how that will go over. A third generation Korean in
Japan is
still that damn Korean.
Yes, and people who worked at trades like tanning and leather crafting, as
well as butchers, were traditional outcasts and still are rejected by other
Japanese.
Koreans have been brought to Japan since its earliest days as an Empire to
enrich the culture with their arts and religious knowledge, and hundreds of
years back, when Japan had invaded Korea under Hideyoshi, many artisans were
willing to flee to Japan to escape the strife that the Japanese had brought
to Korea.
But Japan recently employed lots of foreign workers in low paid jobs and
housed them in very inadequate conditions. Recently means, for me, in the
last 20-25 years.
Source about centuries back, in manga: HYOUGE MONO, about the very real
life of the accomplished tea master Sasuke Furuta, who ended up as Tea
Master to Hideyoshi. An incredible manga, which was available online, but
like the real life the story has a rather bitter ending.
On Thu, 8 Jan 2026 22:15:11 -0500
c186282 <c186282@nnada.net> wrote:
Other regions, even in 'blender' areas, still DO have a certain
'national character' and 'common history'. Turkey is NOT like Germany
is NOT like England is NOT like France.
They do now - but they had a different character once upon a time.
England used to be a bunch of Celts and a handful of Roman expats 'til
the Germanic tribes rolled in; then it was a bunch of Saxons squabbling
with their Scots and Welsh neighbors 'til the Normans steamrolled
everyone - and the Normans themselves were Vikings "gone native" in
France (like the Rus over in Kyiv.) And the "native" French were just a *different* blend of Gallic, Germanic, and Latin, way back when. Turkey
useta be Phrygia, back in the mists of time...
All of history's successive tides shaped the world we know today, and
all the things happening now will shape what comes after; that is, as
they say, the way of things.
It's just that prior to getting the facts kinda approximately more-or-
less straight-ish in the last few centuries, we had a *lot* less clear
of a picture of it - and a huge part of what's shaped *this* period of history, for better and for worse, is the collective culture shock of realizing that practically *every* modern-day culture* is a relative
newcomer standing in the ruins of countless older societies with which
they may or may not have anything much in common.
* (Less a few outliers like, yes, east Asia - but even Japanese history
has its wrinkles, they just don't like to talk about them. Just ask
the Ainu...)
Recently, I've been writing code with no global variables. It's been
a fun experiment.
Are you another Japan-Hater ???
On 1/8/26 14:52, rbowman wrote:
On Thu, 8 Jan 2026 02:26:25 -0000 (UTC), Lawrence D'Oliveiro wrote:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
Yeah, you don't need the continuation punch in column 6 :) I should
take a look and see if that much has really changed.
AAAAUUUGGGHHH ! You just triggered my PTSD about FORTRAN and PUNCH
CARDS !!! :-)
Huh ? You're demonizing Japan ? Most EVERY nation/culture can be
demonized, and/or lauded.
However Japan IS a bit different ... their geography did let them
build a kind of singular culture over a very long period.
Finding out where they were defined, across over a million lines of
source code, was a fun exercise. I learned some things about ctags along
the way ...
On Thu, 8 Jan 2026 20:15:24 -0500, c186282 wrote:
On 1/8/26 09:43, Scott Lurndal wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
Modern COBOL is very different than COBOL-68 (or even COBOL-84).
It even has pointers.
Then is it even still "COBOL" ? "NuBOL" instead ?
That triggered a distant memory of SNOBOL.
Golly that was a long time ago...garterettes?
Salt Lake is the all time worst but some people think STOP is an acronym
for Slight Tap On Pedal.
They put their trust in Jesus, not brakes.
On Thu, 08 Jan 2026 22:45:45 GMT, Charlie Gibbs wrote:
C++'s "//" construct is a lot easier to code.
That is something I was happy to see adapted by C, JavaScript and other C like languages. I don't know who had it first. If it's used consistently
it makes commenting out blocks easier although '#if 0' works.
On Thu, 08 Jan 2026 20:09:03 GMT, Scott Lurndal wrote:
rbowman <bowman@montana.com> writes:
On Thu, 08 Jan 2026 04:57:26 GMT, Charlie Gibbs wrote:
I hated Wirthian languages from the start and still do. Just bad
chemistry, I guess. But I couldn't stand having some snooty compiler
slap my wrist and tell me that I couldn't do what I could do in a
couple of lines of assembly language.
Ever run into PL/M?
I have a listing of the PL/M 8080 cross-compiler somewhere in storage.
iirc the Mostek AID-80F development system had a native PL/M
implementation. It was almost, but not quite, CP/M.
On 8 Jan 2026 at 19:49:16, rbowman <bowman@montana.com> wrote:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
On 1/7/26 15:03, rbowman wrote:
The program language landscape changes so rapidly that whatever language
you learn today will probably be niche in a few years. FORTRAN and COBOL
are still around, but I don't think anyone from the 70s would recognize
them. I was there and I used both at the time.
Colleges don't always make great choices and do their students a
disservice. At one time University of Montana used Modula-2, another
Wirth production. Later they chose Java after being offered financial
incentives by Sun. (I think it was before Oracle). Arguably a better
choice although it didn't do much when we were looking for C/C++
programmers.
I'm comfortable up to Fortran 77 but would have to learn the current
version. However, I've used C for about 45 years and it still looks like
C.
https://www.tiobe.com/tiobe-index/
Going from Python2 to Python3 required some updating but it wasn't a
relearning process. I've got a first edition little book, Lutz's 'Python
Pocket Reference', from 1998. It would require very few edits to bring it
up to date.
I haven't kept up with C++ but my use has always been a subset of the full
language.
Sure, some languages never caught on. Go is on the list but the change was
the wrong way. Ada hangs on, mostly for government projects but follows
Scratch. Ruby didn't scale and is a footnote. Pike was always niche. The
list goes on.
If I had a kid in college I would hope for Python as the didactic
language. C would be good but academics don't seem to like it. Not enough
arcane points to fill a semester? C++, maybe. Java, I suppose, although
I've seen the aftermath when people trained in Java try to use languages
with less hand holding and try to unravel ***foo.
Python is OK but Rexx is better.
On Thu, 8 Jan 2026 20:54:14 -0500, c186282 wrote:
On 1/8/26 14:52, rbowman wrote:
On Thu, 8 Jan 2026 02:26:25 -0000 (UTC), Lawrence D'Oliveiro wrote:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
Yeah, you don't need the continuation punch in column 6 :) I should
take a look and see if that much has really changed,
AAAAUUUGGGHHH ! You just triggered my PTSD about FORTRAN and PUNCH
CARDS !!! :-)
Don't forget the coding forms.
https://archive.org/details/fortrancodingform
More horrors from the past:
https://www.math-cs.gordon.edu/courses/cs323/FORTRAN/fortran.html
I was so scarred by the initial brush with programming it was about 10
years before I had any interest in it. Of course the game had changed. You
could wirewrap up a working Z80 on the kitchen table and replace a 3'x3'
panel full of ice cube relays or a bushel of TTLs with a much less
physical implementation of logic.
On 09/01/2026 01:35, rbowman wrote:
On Thu, 08 Jan 2026 22:45:45 GMT, Charlie Gibbs wrote:
C++'s "//" construct is a lot easier to code.
That is something I was happy to see adapted by C, JavaScript and other C
like languages. I don't know who had it first. If it's used consistently
it makes commenting out blocks easier although '#if 0' works.
For a block I use
/*
...
*/
Bit shorter than
#if 0
...
#endif
Don't forget the Danes and Norwegians...
On 09/01/2026 02:02, rbowman wrote:
On Thu, 08 Jan 2026 20:09:03 GMT, Scott Lurndal wrote:
rbowman <bowman@montana.com> writes:
On Thu, 08 Jan 2026 04:57:26 GMT, Charlie Gibbs wrote:
I hated Wirthian languages from the start and still do. Just bad
chemistry, I guess. But I couldn't stand having some snooty compiler
slap my wrist and tell me that I couldn't do what I could do in a
couple of lines of assembly language.
Ever run into PL/M?
I have a listing of the PL/M 8080 cross-compiler somewhere in storage.
iirc the Mostek AID-80F development system had a native PL/M
implementation. It was almost, but not quite, CP/M.
PL/M was a language. CP/M was almost an operating system.
On 2026-01-08, rbowman <bowman@montana.com> wrote:
On Thu, 8 Jan 2026 02:26:25 -0000 (UTC), Lawrence D'Oliveiro wrote:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from the
70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
Yeah, you don't need the continuation punch in column 6 :) I should take a
look and see if that much has really changed.
The one WATFIV extension I recall was a magic character which caused
the remainder of the card to be treated as comments. People called
this character a "zigamorph"; you produced it on a keypunch by
using the multi-punch key to punch 12-11-0-7-8-9 in one column.
In an EBCDIC card reader this translates to 0xFF.
They do now - but they had a different character once upon a time.
England used to be a bunch of Celts and a handful of Roman expats
It was other people before that, too. Celts are late invaders from
the bronze age.
On Thu, 8 Jan 2026 22:15:11 -0500, c186282 wrote:
However Japan IS a bit different ... their geographics did let them
build a kind of singular culture over a very long period.
Their culture doesn't like to examine its roots. If it wasn't for Koreans
teaching them how to grow rice they'd still be eating millet. The
calligraphy is mostly Chinese even if it is pronounced differently. Shinto
is homegrown but Buddhism came from the west.
That's not to say there weren't tweaks. Avalokiteshvara had a sex change
and became Kannon, who has overtones of Amaterasu, The Kirishitans blended Kannon with the Virgin Mary. Very adaptable people.
I spent a couple of years writing FORTRAN for the 1130. They called
it FORTRAN IV, but it was more like III.V, but still better than OS
FORTRAN at the time. Later I worked on an XDS Sigma system, and
their FORTRAN was great (as you'd expect with its SDS heritage). At
the time I liked the language, but I always preferred PL/I.
On 1/9/26 03:02, The Natural Philosopher wrote:
On 09/01/2026 01:35, rbowman wrote:
On Thu, 08 Jan 2026 22:45:45 GMT, Charlie Gibbs wrote:
C++'s "//" construct is a lot easier to code.
That is something I was happy to see adapted by C, JavaScript and other C
like languages. I don't know who had it first. If it's used consistently
it makes commenting out blocks easier although '#if 0' works.
For a block I use
/*
...
*/
Bit shorter than
#if 0
...
#endif
Great as long as the block doesn't contain comments.
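(Peter's point, concretely; a sketch:

    #include <stdio.h>

    int main(void)
    {
    #if 0
        printf("disabled\n");   /* a comment inside the dead block */
        /* even a fully commented-out line is fine here */
    #endif
        printf("enabled\n");
        return 0;
    }

Wrapping those same lines in /* ... */ instead would end the outer comment
at the first */, leaving the tail as live, and here syntactically broken,
code, because C comments don't nest. The preprocessor skips an #if 0 block
wholesale, comments and all.)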
On 9 Jan 2026 07:04:49 GMT
rbowman <bowman@montana.com> wrote:
Don't forget the Danes and Norwegians...
Also true, though they'd hardly even gotten settled in when William
decided to take a jaunt across the Channel and do some conquerin'...
The Natural Philosopher <tnp@invalid.invalid> writes:
On 09/01/2026 02:02, rbowman wrote:
On Thu, 08 Jan 2026 20:09:03 GMT, Scott Lurndal wrote:
rbowman <bowman@montana.com> writes:
On Thu, 08 Jan 2026 04:57:26 GMT, Charlie Gibbs wrote:
I hated Wirthian languages from the start and still do. Just bad
chemistry, I guess. But I couldn't stand having some snooty compiler
slap my wrist and tell me that I couldn't do what I could do in a
couple of lines of assembly language.
Ever run into PL/M?
I have a listing of the PL/M 8080 cross-compiler somewhere in storage.
iirc the Mostek AID-80F development system had a native PL/M
implementation. It was almost, but not quite, CP/M.
PL/M was a language. CP/M was almost an operating system.
So? Mr. Bowman's comment referred to the AID-80F development system.
On Fri, 9 Jan 2026 10:00:20 +0000
The Natural Philosopher <tnp@invalid.invalid> wrote:
They do now - but they had a different character once upon a time.
England used to be a bunch of Celts and a handful of Roman expats
It was other people before that. too. Celts are late invaders from
the broinze age.
Also true - and the different Neolithic and early Bronze Age cultures
crossed whole *swaths* of Eurasia, in the Elder Days.
There are claims that American copper (identified by isotope analysis) was
on many bronze age tools in Europe.... Did people cross the Atlantic? Was
there a land bridge?
Never looked at Python, but I'm a huge Rexx fan. I used to use it all
the time (MVS, VM, and OS/2). Now I use it less (Linux), to the extent
that I often have to refresh my knowledge, but I have several vital
utilities written in Rexx.
On 09/01/2026 16:02, Scott Lurndal wrote:
The Natural Philosopher <tnp@invalid.invalid> writes:
On 09/01/2026 02:02, rbowman wrote:
On Thu, 08 Jan 2026 20:09:03 GMT, Scott Lurndal wrote:
rbowman <bowman@montana.com> writes:
On Thu, 08 Jan 2026 04:57:26 GMT, Charlie Gibbs wrote:
I hated Wirthian languages from the start and still do. Just bad
chemistry, I guess. But I couldn't stand having some snooty
compiler slap my wrist and tell me that I couldn't do what I could
do in a couple of lines of assembly language.
Ever run into PL/M?
I have a listing of the PL/M 8080 cross-compiler somewhere in
storage.
iirc the Mostek AID-80F development system had a native PL/M
implementation. It was almost, but not quite, CP/M.
PL/M was a language. CP/M was almost an operating system.
So? Mr. Bowman's comment referred to the AID-80F development system.
Did it? It was ambiguous.
On Fri, 9 Jan 2026 18:51:43 +0000 The Natural Philosopher <tnp@invalid.invalid> wrote:
There are claims that American copper (id-ed by Isotope) was on many
bronze age tools in Europe.... Did people cross the Atlantic? Was there
a land bridge?
That *is* an intriguing question - AFAIK the evidence we have is scant,
but it's certainly a fascinating notion. Dunno if we'll ever get any
solid answers, but you gotta wonder...
On 09/01/2026 15:16, Peter Flass wrote:
On 1/9/26 03:02, The Natural Philosopher wrote:
On 09/01/2026 01:35, rbowman wrote:
On Thu, 08 Jan 2026 22:45:45 GMT, Charlie Gibbs wrote:
C++'s "//" construct is a lot easier to code.
That is something I was happy to see adapted by C, JavaScript and
other C like languages. I don't know who had it first. If it's used
consistently it makes commenting out blocks easier although '#if 0'
works.
For a block I use
/*
...
*/
Bit shorter than #if 0 ...
#endif
Great as long as the block doesn't contain comments.
Comments are reserved either for this:
/*********************************************
 * This is a comment and contains no code    *
 *********************************************/
Or
somecode('blah'); // Blah processing unit.
which is easy enough to asterisk out.
That *is* an intriguing question - AFAIK the evidence we have is
scant, but it's certainly a fascinating notion. Dunno if we'll ever
get any solid answers, but you gotta wonder...
Heyerdahl was disliked by the academics but he had an embarrassing
habit of building boats and going places that shouldn't have been
reachable in their theories.
Before Doggerland sank anybody could wander over without having to
build a coracle.
Using "OCCURS DEPENDING ON" COBOL easily processes variable length,
variably located strings.
On Fri, 9 Jan 2026 10:02:41 +0000, The Natural Philosopher wrote:
On 09/01/2026 01:35, rbowman wrote:
On Thu, 08 Jan 2026 22:45:45 GMT, Charlie Gibbs wrote:
C++'s "//" construct is a lot easier to code.
That is something I was happy to see adapted by C, JavaScript and other
C like languages. I don't know who had it first. If it's used
consistently it makes commenting out blocks easier although '#if 0'
works.
For a block I use
/*
...
*/
Bit shorter than #if 0 ...
#endif
Certainly. Unless someone snuck in /* stupid comment */ over in column
100 where you overlooked it.
On Fri, 9 Jan 2026 18:48:06 +0000, The Natural Philosopher wrote:
On 09/01/2026 16:02, Scott Lurndal wrote:
The Natural Philosopher <tnp@invalid.invalid> writes:
On 09/01/2026 02:02, rbowman wrote:
On Thu, 08 Jan 2026 20:09:03 GMT, Scott Lurndal wrote:
rbowman <bowman@montana.com> writes:
On Thu, 08 Jan 2026 04:57:26 GMT, Charlie Gibbs wrote:
I hated Wirthian languages from the start and still do.
Just bad chemistry, I guess. But I couldn't stand
having some snooty compiler slap my wrist and tell me
that I couldn't do what I could do in a couple of lines
of assembly language.
Ever run into PL/M?
I have a listing of the PL/M 8080 cross-compiler somewhere
in storage.
iirc the Mostek AID-80F development system had a native PL/M
implementation. It was almost, but not quite, CP/M.
PL/M was a language. CP/M was almost an operating system.
So? Mr. Bowman's comment referred to the AID-80F development
system.
Did it? It was ambiguous.
https://deramp.com/mostek.html
To clarify, the system ran M/OS-80 which was very much like CP/M.
I believe there was an implementation of the PL/M language available.
It's been a day or two. I know I used it to burn EPROMs but I worked
with the Z80 assembler, not PL/M.
On 9 Jan 2026 20:36:38 GMT
rbowman <bowman@montana.com> wrote:
That *is* an intriguing question - AFAIK the evidence we have is
scant, but it's certainly a fascinating notion. Dunno if we'll ever
get any solid answers, but you gotta wonder...
Heyerdahl was disliked by the academics but he had an embarrassing
habit of building boats and going places that shouldn't have been
reachable in their theories.
Certainly can't accuse him of not putting his money where his mouth was.
Before Doggerland sank anybody could wander over without having to
build a coracle.
It's truly amazing how much of the world was walkable in the Ice Age;
doesn't explain *every* place humans ended up (it's absolutely mind-
boggling to consider how far back the Pacific islands were settled,)
but it absolutely made a whole lotta places readily accessible for a
good long while. Makes you wonder, too, how many of the various quasi- Atlantean legends in northwest Europe are really mutated folk memory
from a *staggeringly* long time ago...
On Fri, 9 Jan 2026 10:02:41 +0000, The Natural Philosopher wrote:
On 09/01/2026 01:35, rbowman wrote:
On Thu, 08 Jan 2026 22:45:45 GMT, Charlie Gibbs wrote:For a block I use /*
C++'s "//" construct is a lot easier to code.
That is something I was happy to see adapted by C, JavaScript and other
C like languages. I don't know who had it first. If it's used
consistently it makes commenting out blocks easier although '#if 0'
works.
...
*/
Bit shorter than #if 0 ...
#endif
Certainly. Unless someone snuck in /* stupid comment */ over in column
100 where you overlooked it.
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat, post-Fortran-77.
On 09/01/2026 21:24, John Ames wrote:
On 9 Jan 2026 20:36:38 GMT
rbowman <bowman@montana.com> wrote:
That *is* an intriguing question - AFAIK the evidence we have is
scant, but it's certainly a fascinating notion. Dunno if we'll ever
get any solid answers, but you gotta wonder...
Heyerdahl was disliked by the academics but he had an embarrassing
habit of building boats and going places that shouldn't have been
reachable in their theories.
Certainly can't accuse him of not putting his money where his mouth was.
Before Doggerland sank anybody could wander over without having to
build a coracle.
It's truly amazing how much of the world was walkable in the Ice Age;
doesn't explain *every* place humans ended up (it's absolutely mind-
boggling to consider how far back the Pacific islands were settled,)
but it absolutely made a whole lotta places readily accessible for a
good long while. Makes you wonder, too, how many of the various quasi-
Atlantean legends in northwest Europe are really mutated folk memory
from a *staggeringly* long time ago...
Yes. 125m of sea level rise in a few thousand years...and a global
temperature rise of up to 10°C.
Odd how that didn't 'destroy the planet'...
On Fri, 09 Jan 2026 14:22:34 -0500, Dan Espen wrote:
Using "OCCURS DEPENDING ON" COBOL easily processes variable length,
variably located strings.
Up to a limit, always, e.g.
OCCURS [ integer-1 TO ] integer-2 TIMES [ DEPENDING ON data-name-1 ]
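For anyone who doesn't read COBOL, the nearest C analogue of OCCURS ...
DEPENDING ON is a record whose trailing table's element count is carried
in an earlier field. A minimal sketch, assuming C99 (the struct and
field names are mine, not from the thread):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct var_rec {
    int  count;    /* plays the role of data-name-1 */
    char items[];  /* flexible array member: 'OCCURS count TIMES' */
};

int main(void)
{
    const char *text = "variable";
    struct var_rec *r = malloc(sizeof *r + strlen(text));
    if (r == NULL)
        return 1;
    r->count = (int)strlen(text);
    memcpy(r->items, text, (size_t)r->count);
    printf("%.*s (%d items)\n", r->count, r->items, r->count);
    free(r);
    return 0;
}

As in the COBOL form, the declared maximum (integer-2) bounds what a
conforming program may store; here the bound is simply whatever was
allocated.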
It's truly amazing how much of the world was walkable in the Ice
Age; doesn't explain *every* place humans ended up (it's absolutely
mind-boggling to consider how far back the Pacific islands were
settled,) but it absolutely made a whole lotta places readily
accessible for a good long while. Makes you wonder, too, how many
of the various quasi-Atlantean legends in northwest Europe are
really mutated folk memory from a *staggeringly* long time ago...
It's also instructive to realize how badly humans wanted to get away
from their neighbors.
'C' has added a few nicey-nice things, but not TOO much.
You can (I do) stick pretty much to K&R and everything
still works fine.
It did if you lived in Doggerland, or used to walk from Australia to Indonesia.
On Sat, 10 Jan 2026 07:42:47 -0700, Peter Flass wrote:
It did if you lived in Doggerland, or used to walk from Australia to
Indonesia.
https://en.wikipedia.org/wiki/Stone_Spring
The rest of the trilogy, 'Bronze Summer' and 'Iron Winter', are okay but
the focus moves from Doggerland.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
On Fri, 09 Jan 2026 14:22:34 -0500, Dan Espen wrote:
Using "OCCURS DEPENDING ON" COBOL easily processes variable
length, variably located strings.
Up to a limit, always, e.g.
OCCURS [ integer-1 TO ] integer-2 TIMES [ DEPENDING ON data-name-1 ]
Well beyond any reasonable limit.
On 08-01-2026, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat,
post-Fortran-77.
Is it really still the same COBOL?
I like to be a little more explicit, so I say "#ifdef DELETE_THIS".
The Natural Philosopher <tnp@invalid.invalid> writes:
On 09/01/2026 21:24, John Ames wrote:
On 9 Jan 2026 20:36:38 GMT
rbowman <bowman@montana.com> wrote:
That *is* an intriguing question - AFAIK the evidence we have is
scant, but it's certainly a fascinating notion. Dunno if we'll ever
get any solid answers, but you gotta wonder...
Heyerdahl was disliked by the academics but he had an embarrassing
habit of building boats and going places that shouldn't have been
reachable in their theories.
Certainly can't accuse him of not putting his money where his mouth was.
Before Doggerland sank anybody could wander over without having to
build a coracle.
It's truly amazing how much of the world was walkable in the Ice Age;
doesn't explain *every* place humans ended up (it's absolutely mind-
boggling to consider how far back the Pacific islands were settled,)
but it absolutely made a whole lotta places readily accessible for a
good long while. Makes you wonder, too, how many of the various quasi-
Atlantean legends in northwest Europe are really mutated folk memory
from a *staggeringly* long time ago...
Yes. 125m of sea level rise in a few thousand years...and a global
temperature rise of up to 10°C.
Odd how that didn't 'destroy the planet'...
Apples are not equal to oranges.
On 1/10/26 12:44, rbowman wrote:
On Sat, 10 Jan 2026 07:42:47 -0700, Peter Flass wrote:
It did if you lived in Doggerland, or used to walk from Australia to
Indonesia.
https://en.wikipedia.org/wiki/Stone_Spring
The rest of the trilogy, 'Bronze Summer' and 'Iron Winter', are okay
but the focus moves from Doggerland.
I love a nice upbeat story.
On Sat, 10 Jan 2026 19:39:05 GMT, Charlie Gibbs wrote:
I like to be a little more explicit, so I say "#ifdef DELETE_THIS".
We have version control nowadays. You can actually delete stuff from
your source, and trust to the version history to keep a record of what
used to be there.
On Sat, 10 Jan 2026 19:39:05 GMT, Charlie Gibbs wrote:
In fact, to work both ways, my code is still full of constructs like
this:
#ifdef PROTOTYPE
int foo(char *bar, BOOL baz)
#else
int foo(bar, baz) char *bar; BOOL baz;
#endif
What a pain-in-the-bum way of writing things.
K&R C is gone, people. Let it go.
On 2026-01-09, c186282 <c186282@nnada.net> wrote:
'C' has added a few nicey-nice things, but not TOO much.
You can (I do) stick pretty much to K&R and everything
still works fine.
I think of my style as "K&R plus prototypes". In fact, to
work both ways, my code is still full of constructs like this:
#ifdef PROTOTYPE
int foo(char *bar, BOOL baz)
#else
int foo(bar, baz) char *bar; BOOL baz;
#endif
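For what it's worth, the same dual-mode trick was often hidden behind a
macro for declarations - a sketch of the old BSD-style __P() idiom,
reusing the PROTOTYPE macro from the post above (definitions still need
the twin forms Charlie shows):

#ifdef PROTOTYPE
#define __P(args) args  /* ANSI: keep the parameter list */
#else
#define __P(args) ()    /* K&R: empty declaration */
#endif

/* one declaration serves both compilers; note the doubled parentheses */
int foo __P((char *bar, int baz));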
On Fri, 9 Jan 2026 10:00:20 +0000, The Natural Philosopher wrote:
On 09/01/2026 04:13, John Ames wrote:
They do now - but they had a different character once upon a time.
England used to be a bunch of Celts and a handful of Roman expats 't
It was other people before that, too. Celts are late invaders from the
bronze age.
Before Doggerland sank anybody could wander over without having to
build a coracle.
People should have the choice of writing that way if they want. And
you always have the choice of not reading it.
(If this sounds too harsh to somebody: I wrote this because of how
Lawrence repeatedly mentioned "choice" as a way to dismiss criticism
in other threads in comp.os.linux.misc.)
On 1/9/26 15:32, rbowman wrote:
On Fri, 9 Jan 2026 10:00:20 +0000, The Natural Philosopher wrote:
On 09/01/2026 04:13, John Ames wrote:
They do now - but they had a different character once upon a time.
England used to be a bunch of Celts and a handful of Roman expats 't
It was other people before that, too. Celts are late invaders from the
bronze age.
Before Doggerland sank anybody could wander over without having to
build a coracle.
Correct. However it mostly sank about 12,000 years
ago when all the ice melted. Even the Beaker People
had to float over to England.
Stick to my estimation that England perhaps ranks
as the "most invaded" country ever :-)
Original pop ? Who the fuck knows ?
On Sun, 11 Jan 2026 00:27:20 +0000
Nuno Silva <nunojsilva@invalid.invalid> wrote:
People should have the choice of writing that way if they want. And
you always have the choice of not reading it.
(If this sounds too harsh to somebody: I wrote this because of how
Lawrence repeatedly mentioned "choice" as a way to dismiss criticism
in other threads in comp.os.linux.misc.)
While I appreciate the zing, I do have to opine for the record that K&R
function definitions really are something best left buried with K&R C.
I'm willing to write in ANSI C for the sake of whatever random weirdo
wants to try building something of mine on an old proprietary *nix, but
man I am *not* going any farther back than that.
On 1/10/26 18:51, c186282 wrote:
On 1/9/26 15:32, rbowman wrote:
On Fri, 9 Jan 2026 10:00:20 +0000, The Natural Philosopher wrote:
On 09/01/2026 04:13, John Ames wrote:
They do now - but they had a different character once upon a time.
England used to be a bunch of Celts and a handful of Roman expats 't
It was other people before that, too. Celts are late invaders from the
bronze age.
Before Doggerland sank anybody could wander over without having to
build a coracle.
Correct. However it mostly sank about 12,000 years
ago when all the ice melted. Even the Beaker People
had to float over to England.
Stick to my estimation that England perhaps ranks
as the "most invaded" country ever :-)
Original pop ? Who the fuck knows ?
People just kept heading west, and when they got to England they had to
stop.
On 1/10/26 19:41, John Ames wrote:
On Sun, 11 Jan 2026 00:27:20 +0000
Nuno Silva <nunojsilva@invalid.invalid> wrote:
People should have the choice of writing that way if they want. And
you always have the choice of not reading it.
(If this sounds too harsh to somebody: I wrote this because of how
Lawrence repeatedly mentioned "choice" as a way to dismiss criticism
in other threads in comp.os.linux.misc.)
While I appreciate the zing, I do have to opine for the record that K&R
function definitions really are something best left buried with K&R C.
I'm willing to write in ANSI C for the sake of whatever random weirdo
wants to try building something of mine on an old proprietary *nix, but
man I am *not* going any farther back than that.
It's never good to foreclose your options. One of my goals for Iron
Spring PL/I is compatibility with the widest base of code possible. It
can compile and run IBM PL/I(F) code from 1965. The newer stuff is
better, but rewriting something that works is a pain.
On 1/10/26 10:23, Scott Lurndal wrote:
The Natural Philosopher <tnp@invalid.invalid> writes:
On 09/01/2026 21:24, John Ames wrote:
On 9 Jan 2026 20:36:38 GMT
rbowman <bowman@montana.com> wrote:
That *is* an intriguing question - AFAIK the evidence we have is
scant, but it's certainly a fascinating notion. Dunno if we'll ever
get any solid answers, but you gotta wonder...
Heyerdahl was disliked by the academics but he had an embarrassing
habit of building boats and going places that shouldn't have been
reachable in their theories.
Certainly can't accuse him of not putting his money where his mouth
was.
Before Doggerland sank anybody could wander over without having to
build a coracle.
It's truly amazing how much of the world was walkable in the Ice Age;
doesn't explain *every* place humans ended up (it's absolutely mind-
boggling to consider how far back the Pacific islands were settled,)
but it absolutely made a whole lotta places readily accessible for a
good long while. Makes you wonder, too, how many of the various quasi-
Atlantean legends in northwest Europe are really mutated folk memory
from a *staggeringly* long time ago...
Yes. 125m of sea level rise in a few thousand years...and a global
temperature rise of up to 10°C.
Odd how that didn't 'destroy the planet'...
Apples are not equal to oranges.
Don't worry about the planet. With or without life on it Earth will
take care of itself just as does Venus or Mercury. The risk is
to the last few hundred years of human progress(?). We might
manage to revert to barbarism if the temperature does not go too
high for our systems by which I mean the whole means by which
your body maintains homeostasis which includes food systems,
medical systems, transport systems. I suspect without clear
evidence that we may hit another bottleneck and suffer large
losses of population and genetic diversity human and otherwise.
On Sat, 10 Jan 2026 20:03:27 -0700, Peter Flass wrote:
On 1/10/26 18:51, c186282 wrote:
On 1/9/26 15:32, rbowman wrote:
On Fri, 9 Jan 2026 10:00:20 +0000, The Natural Philosopher wrote:
On 09/01/2026 04:13, John Ames wrote:
They do now - but they had a different character once upon a time.
England used to be a bunch of Celts and a handful of Roman expats 't
It was other people before that, too. Celts are late invaders from the
bronze age.
Before Doggerland sank anybody could wander over without having to
build a coracle.
Correct. However it mostly sank about 12,000 years ago when all the
ice melted. Even the Beaker People had to float over to England.
Stick to my estimation that England perhaps ranks as the "most
invaded" country ever :-)
Original pop ? Who the fuck knows ?
People just kept heading west, and when they got to England they had to
stop.
https://www.goodreads.com/book/show/2150867.Westviking
https://en.wikipedia.org/wiki/The_Farfarers
No, you just build a boat. Mowat has been accused of having a vivid
imagination particularly for 'Never Cry Wolf' but he does point out that
by island hopping in the Hebrides and Faroes before heading for Iceland
you are only out of sight of land for a couple of days, assuming you
don't get blown off course.
He tells a plausible story. In 'Collapse' Jared Diamond claims that one
of the reasons for the abandonment of Greenland along with climate
change was an irrational reluctance of the Norse to eat fish. Excuse me?
He bases that on the lack of fish bones in the middens. I've never had
it but I think the process of producing hákarl might dissolve the bones.
On 10 Jan 2026 14:40:30 GMT, Stéphane CARPENTIER wrote:
On 08-01-2026, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On Wed, 7 Jan 2026 19:21:09 -0700, Peter Flass wrote:
FORTRAN and COBOL are still around, but I don't think anyone from
the 70s would recognize them.
COBOL is still COBOL. Fortran has evolved somewhat,
post-Fortran-77.
Is it really still the same COBOL?
I imagine it's still backward-compatible.
My point being that the new stuff added to Fortran changes the
language out of all recognition (e.g. free-format source, user-defined
types, type parameters, CONTAINS), whereas the same is not true of
COBOL.
On Sun, 11 Jan 2026 01:52:02 -0500
c186282 <c186282@nnada.net> wrote:
Look ... nobody is going to be 'writing' much of ANYTHING within five
years. The "AI" will do it all - probably led by the pointy-haired
bosses who can't find their ass even with a spy sat.
The "AI" bubble isn't going to *last* another five years, full stop.
Frankly, I'll be shocked if it makes it to '28, if that.
You're not wrong that the PHBs would *love* to have a Magic Genie
Friend who answers their poorly-specified and unreasonable demands
without question, even if it doesn't actually *work* - but the current
trend of "throw as much raw compute at the same moronic Markov-chain
solution as possible, and somehow scrounge up more training data than
THE ENTIRE INTERNET" will collapse under its own weight *long* before
we ever get there.
From what I've read, even the Neanderthals knew how
to build at least crude boats - pushed out onto some
of the Greek islands.
So yea, modern humans carried on the practice. It got
them to England and beyond. Well, SOME of them ...
the death rate would have been rather high for any
long voyage.
Building GOOD, large-ish, properly steerable boats ...
THAT took much longer than expected. Seems easy now,
but for whatever reasons the ancients had a hard time
of it.
England ... NOT too far. Even crap boats would do it.
The Beaker People completely infiltrated the existing
English pop about 4400bc - but they'd HAVE to have
floated there. Clearly their boats were 'adequate',
and there'd have been a LOT of them.
On 1/10/26 03:27, The Natural Philosopher wrote:
Odd how that didn't 'destroy the planet'...
It did if you lived in Doggerland, or used to walk from Australia to
Indonesia.
No it didn't. It destroyed Doggerland. And as for walking to Australia,
our ancestors survived global warming, ice ages, plagues, wars, and
all sorts of other problems, at least long enough to breed and pass
on the genes.
On 1/10/26 11:44, rbowman wrote:
On Sat, 10 Jan 2026 07:42:47 -0700, Peter Flass wrote:
It did if you lived in Doggerland, or used to walk from Australia
to Indonesia.
https://en.wikipedia.org/wiki/Stone_Spring
The rest of the trilogy, 'Bronze Summer' and 'Iron Winter', are
okay but the focus moves from Doggerland.
When Doggerland is submerged and the people have to leave it, it
seems totally logical that the focus would change. Remember,
Doggerland was prehistoric, so I cannot even say "ancient
history" - just whatever the author, according to his education, can
imagine of those times.
A worthwhile book, 'Stone Spring', in my ever so humble opinion
Bliss
Odd how that didn't 'destroy the planet'...
Apples are not equal to oranges.
Don't worry about the planet. With or without life on it Earth will
take care of itself just as does Venus or Mercury. The risk is
to the last few hundred years of human progress(?). We might
manage to revert to barbarism if the temperature does not go too
high for our systems by which I mean the whole means by which
your body maintains homeostasis which includes food systems,
medical systems, transport systems. I suspect without clear
evidence that we may hit another bottleneck and suffer large
losses of population and genetic diversity human and otherwise.
bliss - always the cheery optimist...
The global climate has never gone "too hot" over
the past BILLION years.
However the "warm zone" has sometimes expanded to
reach the poles.
And sometimes contracted so there's icebergs at
the equator.
My code tends to be like an ATV: it might not be pretty,
but it'll go anywhere.
1. Anything that works is better than anything that doesn't.
On 1/10/26 22:06, Peter Flass wrote:
On 1/10/26 19:41, John Ames wrote:
On Sun, 11 Jan 2026 00:27:20 +0000
Nuno Silva <nunojsilva@invalid.invalid> wrote:
People should have the choice of writing that way if they want. And
you always have the choice of not reading it.
(If this sounds too harsh to somebody: I wrote this because of how
Lawrence repeatedly mentioned "choice" as a way to dismiss criticism
in other threads in comp.os.linux.misc.)
While I appreciate the zing, I do have to opine for the record that K&R
function definitions really are something best left buried with K&R C.
I'm willing to write in ANSI C for the sake of whatever random weirdo
wants to try building something of mine on an old proprietary *nix, but
man I am *not* going any farther back than that.
It's never good to foreclose your options. One of my goals for Iron
Spring PL/I is compatibility with the widest base of code possible. It
can compile and run IBM PL/I(F) code from 1965. The newer stuff is
better, but rewriting something that works is a pain.
You got Iron Spring to run properly ?
On 1/10/26 21:41, John Ames wrote:
On Sun, 11 Jan 2026 00:27:20 +0000
Nuno Silva <nunojsilva@invalid.invalid> wrote:
People should have the choice of writing that way if they want. And
you always have the choice of not reading it.
(If this sounds too harsh to somebody: I wrote this because of how
Lawrence repeatedly mentioned "choice" as a way to dismiss criticism
in other threads in comp.os.linux.misc.)
While I appreciate the zing, I do have to opine for the record that K&R
function definitions really are something best left buried with K&R C.
I'm willing to write in ANSI C for the sake of whatever random weirdo
wants to try building something of mine on an old proprietary *nix, but
man I am *not* going any farther back than that.
Look ... nobody is going to be 'writing' much
of ANYTHING within five years. The "AI" will do
it all - probably led by the pointy-haired bosses
who can't find their ass even with a spy sat.
And Win/Lin/IX ... I think they're going to go
away as well. It'll all just be thin clients
plugged into the leading AI engines. No more
operating systems.
Maybe PIs ... maybe.
"Programming" is going to be like those who learn
to play ancient Greek musical instruments ... an
interesting, but obsolete, old art. "AI" for worse
or worser, will be IT. Many TRILLIONS of dollars
invested in this - it is GOING to be The Future
whether we like it or not.
Just sayin'
On 1/11/26 11:55, Harold Stevens wrote:
1. Anything that works is better than anything that doesn't.
I think there exists a lot of code which makes the world a worse
place, and hence it would be better if it didn't work.
On 10/01/2026 18:23, Scott Lurndal wrote:
Odd how that didn't 'destroy the planet'...
Apples are not equal to oranges.
What a meaningless statement.
FORTRAN ... it remains 'important', esp in academic
and professional circles. Can NOT beat all the
engineering/physics libs/functions writ for FORTRAN
over the years ... a solution for EVERYTHING complex.
It's not "popular" like Python ... but it's NOT going
to go away anytime soon. A 'niche' lang, but it's an
important niche.
On 11/01/2026 05:39, rbowman wrote:
He tells a plausible story. In 'Collapse' Jared Diamond claims that one of
the reasons for the abandonment of Greenland along with climate change was
an irrational reluctance of the Norse to eat fish. Excuse me? He bases
that on the lack of fish bones in the middens. I've never had it but I
think the process of producing hákarl might dissolve the bones.
They are probably so hungry they ate the bones as well..
The 'Norse' grew up on fish. One visit to Sweden or Denmark will show
1001 ways to prepare 'herring'.
A lot less pork, chicken and beef on the menu.
The Natural Philosopher <tnp@invalid.invalid> writes:
On 10/01/2026 18:23, Scott Lurndal wrote:
Odd how that didn't 'destroy the planet'...
Apples are not equal to oranges.
What a meaningless statement.
Not in the context of the portion of the post you
so conveniently deleted.
On 2026-01-11, The Natural Philosopher <tnp@invalid.invalid> wrote:
On 11/01/2026 05:39, rbowman wrote:
He tells a plausible story. In 'Collapse' Jared Diamond claims that one of
the reasons for the abandonment of Greenland along with climate change was
an irrational reluctance of the Norse to eat fish. Excuse me? He bases
that on the lack of fish bones in the middens. I've never had it but I
think the process of producing hákarl might dissolve the bones.
They are probably so hungry they ate the bones as well..
The 'Norse' grew up on fish. One visit to Sweden or Denmark will show
1001 ways to prepare 'herring'.
A lot less pork, chicken and beef on the menu.
You exaggerate. Sure, fish is _a_ cornerstone in our cuisine, but only
one. I would not say there is a _lot_ less pork, chicken and beef.
Personally I don't eat fish very often, and neither do most people I
know.
Niklas
On 11/01/2026 17:44, Niklas Karlsson wrote:
On 2026-01-11, The Natural Philosopher <tnp@invalid.invalid> wrote:
On 11/01/2026 05:39, rbowman wrote:
He tells a plausible story. In 'Collapse' Jared Diamond claims that one of
the reasons for the abandonment of Greenland along with climate change was
an irrational reluctance of the Norse to eat fish. Excuse me? He bases
that on the lack of fish bones in the middens. I've never had it but I
think the process of producing hákarl might dissolve the bones.
They are probably so hungry they ate the bones as well..
The 'Norse' grew up on fish. One visit to Sweden or Denmark will show
1001 ways to prepare 'herring'.
A lot less pork, chicken and beef on the menu.
You exaggerate. Sure, fish is _a_ cornerstone in our cuisine, but only
one. I would not say there is a _lot_ less pork, chicken and beef.
Personally I don't eat fish very often, and neither do most people I
know.
Niklas
Well the point being that Norse nations are well able to survive on fish
if they have to.
"FORTRAN ... remains popular among engineers but despised elsewhere."
On 2026-01-11, The Natural Philosopher <tnp@invalid.invalid> wrote:
On 11/01/2026 17:44, Niklas Karlsson wrote:
On 2026-01-11, The Natural Philosopher <tnp@invalid.invalid> wrote:
On 11/01/2026 05:39, rbowman wrote:
He tells a plausible story. In 'Collapse' Jared Diamond claims that one of
the reasons for the abandonment of Greenland along with climate change was
an irrational reluctance of the Norse to eat fish. Excuse me? He bases
that on the lack of fish bones in the middens. I've never had it but I
think the process of producing hákarl might dissolve the bones.
They are probably so hungry they ate the bones as well..
The 'Norse' grew up on fish. One visit to Sweden or Denmark will show
1001 ways to prepare 'herring'.
A lot less pork, chicken and beef on the menu.
You exaggerate. Sure, fish is _a_ cornerstone in our cuisine, but only
one. I would not say there is a _lot_ less pork, chicken and beef.
Personally I don't eat fish very often, and neither do most people I
know.
Niklas
Well the point being that Norse nations are well able to survive on fish
if they have to.
That I'll agree with... though I'm not sure how sustainable the level of
fishing would be that we'd have to do if fish and maybe shellfish were
our only protein.
Niklas
Greybeard quants like me operated on 3 simple maxims:
1. Anything that works is better than anything that doesn't.
2. If it ain't broke, don't fix it.
3. If it breaks, don't ignore it.
On 11/01/2026 20:23, Niklas Karlsson wrote:
On 2026-01-11, The Natural Philosopher <tnp@invalid.invalid> wrote:
Well the point being that Norse nations are well able to survive on fish
if they have to.
That I'll agree with... though I'm not sure how sustainable the level of
fishing would be that we'd have to do if fish and maybe shellfish were
our only protein.
The Norse greenlanders were never huge in number and the natives knew
how to fish.
I suspect the Norse said 'fuck this lets go home' and abandoned
Greenland as being not worth the effort.
rbowman <bowman@montana.com> writes:
On Sat, 10 Jan 2026 20:03:27 -0700, Peter Flass wrote:
People just kept heading west, and when they got to England they had
to stop.
He tells a plausible story. In 'Collapse' Jared Diamond claims that one
of the reasons for the abandonment of Greenland along with climate
change was an irrational reluctance of the Norse to eat fish. Excuse me?
He bases that on the lack of fish bones in the middens. I've never had
it but I think the process of producing hákarl might dissolve the bones.
One word. Lutefisk.
On 11/01/2026 01:51, c186282 wrote:
Stick to my estimation that England perhaps ranks
as the "most invaded" country ever :-)
Yes, until 1066, after which it became the least.
Nothing like having a navy comprised of pirates.
On Sun, 11 Jan 2026 16:44:55 GMT, Scott Lurndal wrote:
rbowman <bowman@montana.com> writes:
On Sat, 10 Jan 2026 20:03:27 -0700, Peter Flass wrote:
People just kept heading west, and when they got to England they had
to stop.
He tells a plausible story. In 'Collapse' Jared Diamond claims that one
of the reasons for the abandonment of Greenland along with climate
change was an irrational reluctance of the Norse to eat fish. Excuse me?
He bases that on the lack of fish bones in the middens. I've never had
it but I think the process of producing hákarl might dissolve the bones.
One word. Lutefisk.
Butter, lots of butter. Big problem if the cows died off and there was no
butter. It shows up around here at Christmas time. I've been told by
knowledgeable people Norwegians in the US eat lutefisk and Norwegians in
Norway eat frozen pizza.
Archaeology has brought most of human 'prehistory' into the class of
'fairly well known history'
On Sun, 11 Jan 2026 16:55:32 GMT, Charlie Gibbs wrote:
"FORTRAN ... remains popular among engineers but despised elsewhere."
Considering its enduring popularity among the supercomputing crowd,
I'd say that assessment is a bit out of date.
On Sun, 11 Jan 2026 11:05:32 +0000, The Natural Philosopher wrote:
Archaeology has brought most of human 'prehistory' into the class of
'fairly well known history'
With caveats. There have been many moments of 'oops, that stuff is a hell
of a lot older than we thought it was.' Even Chris Stringer had to change
his story although the popular conception is lagging.
https://en.wikipedia.org/wiki/Milford_H._Wolpoff
On 11-01-2026, John Ames <commodorejohn@gmail.com> wrote:
On Sun, 11 Jan 2026 01:52:02 -0500
c186282 <c186282@nnada.net> wrote:
Look ... nobody is going to be 'writing' much of ANYTHING within five
years. The "AI" will do it all - probably led by the pointy-haired
bosses who can't find their ass even with a spy sat.
The "AI" bubble isn't going to *last* another five years, full stop.
Frankly, I'll be shocked if it makes it to '28, if that.
The AI bubble, as most people understand the term, is that big
companies will stop investing ever more money in it. That doesn't mean
the data centers will stop working; it means new data centers will stop
being built, at least at such an increasing speed. It doesn't mean
existing GPUs will cease to work; it means big companies will stop
buying so many GPUs. So, globally, everything that has already been done
will stay, and new things will improve at a slower pace. AI is there and
will stay there for a long time. When the AI bubble bursts, its impact
will be more on the global economy than on its usage.
That's why the companies invest so much: whichever one has the lead at
the time of the burst expects to keep that lead for a long time. Not
because they are stupid and can't predict the obvious.
AI is there, like it or not; you have to live with it. Whether you or I
like it is irrelevant. Like when Plato criticized the writing system
because people stopped learning by heart: the writing system was there,
stayed through the ages and revolutionised things. There are some
things, like farming, writing, electricity, that changed everything in
the human way of life, and AI is one of them. There is no going back.
I'm not saying that it's good or bad, I'm saying that it's evolution
(not progress, because progress is good by definition) and one can't do
anything but live with it.
Many stocks of fish are already depleted or nearly so,
and that's just at CURRENT levels of consumption. The "bounty of the
sea" is NOT unlimited, not at all.
http://linuxmafia.com/humour/power-of-lutefisk.html
The only good thing about lutefisk is that it is generally accompanied
by meatballs and mashed potatoes (and lefse).
On Sun, 11 Jan 2026 05:55:12 -0600, Harold Stevens wrote:
Greybeard quants like me operated on 3 simple maxims:
1. Anything that works is better than anything that doesn't.
2. If it ain't broke, don't fix it.
3. If it breaks, don't ignore it.
Those go way beyond programming...
On Sun, 11 Jan 2026 11:29:53 +0000, The Natural Philosopher wrote:
On 11/01/2026 05:39, rbowman wrote:
He tells a plausible story. In 'Collapse' Jared Diamond claims that one
of the reasons for the abandonment of Greenland along with climate
change was an irrational reluctance of the Norse to eat fish. Excuse
me? He bases that on the lack of fish bones in the middens. I've never
had it but I think the process of producing h|ikarl might dissolve the
bones.
They are probably so hungry they ate the bones as well..
We used to have fried smelts, fins, tail, and scales, usually without the head. This isn't the best area for seafood but the only ones I've seen in the market lately were marked as bait.
On Sun, 11 Jan 2026 11:26:50 +0000, The Natural Philosopher wrote:
On 11/01/2026 01:51, c186282 wrote:
Stick to my estimation that England perhaps ranks
as the "most invaded" country ever :-)
Yes, until 1066, after which it became the least.
Nothing like having a navy comprised of pirates.
And a merchant class comprised of pirates... Wasn't there a Monty Python sketch about that?
Alas without detailed records we may find old THINGS,
but what they MEANT, their context, is forever lost.
That's just half a view of 'history'.
On Wed, 7 Jan 2026 13:30:14 +0100, Carlos E.R. wrote:
On 2026-01-06 19:57, Charlie Gibbs wrote:
On 2026-01-06, Lars Poulsen <lars@beagle-ears.com> wrote:
On 2026-01-06, Carlos E.R. <robin_listas@es.invalid> wrote:
My C teacher said it was a mistake to use C as an all purpose
language, like for userland applications. Using C is the cause of
many bugs that a proper language would catch.
That was around 1991.
He knew. He participated in some study tasked by the Canadian
government to study C compilers, but he could not talk about what
they wrote.
What language(s) did he suggest instead?
I don't remember if he did. Maybe he told samples, but I think he mostly
told us of quirks of the language, things that were errors, but that the
compiler did not signal, so that we being aware we would write correct C
code.
It is possible that current C compilers signal many more problems that
back then, but not runtime errors.
gcc has become pickier. That isn't always a welcome thing when working
with legacy code and requires a search of the compiler options to get it
to shut up about such horrible heresies as assuming a function returns an int.
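The heresy in question looks something like this - a minimal sketch
(hypothetical function name), legal C89 but warned about, and an error
by default in recent gcc releases unless you pass something like
-std=c89 or -Wno-implicit-function-declaration:

#include <stdio.h>

/* no declaration of legacy_helper() in scope at the call site below:
   C89 quietly assumes it returns int */
int main(void)
{
    int n = legacy_helper();
    printf("%d\n", n);
    return 0;
}

int legacy_helper(void)
{
    return 42;
}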
Now, concerning 'AI is there': there was significant progress in
some areas like machine translation. "Creative writers" may be
concerned. But there were attempts to replace a lot of professionals,
notably programmers. Examples indicate that "AI" can create
small, trivial pieces of code but does not really work for
bigger and more complex things. To be useful for programming, "AI"
and the way it is used must be significantly improved.
On 11/01/2026 20:44, rbowman wrote:
On Sun, 11 Jan 2026 05:55:12 -0600, Harold Stevens wrote:
Greybeard quants like me operated on 3 simple maxims:
1. Anything that works is better than anything that doesn't.
2. If it ain't broke, don't fix it.
3. If it breaks, don't ignore it.
Those go way beyond programming...
Part of the 'philosophy of engineering'.
Perhaps the most fundamental one, after 'an engineer is someone who can
do for five bob what any damn fool can do for a quid' is
'In the construction of mechanisms, complexity should not be multiplied beyond that necessary to achieve the defined objective'
Ockham's Laser...
On Mon, 12 Jan 2026 00:47:29 GMT, Scott Lurndal wrote:
http://linuxmafia.com/humour/power-of-lutefisk.html
The only good thing about lutefisk is that it is generally accompanied
by meatballs and mashed potatoes (and lefse).
It isn't that bad. That's not to say it's good.
It's blandly neutral.
Butter, lots of butter. Big problem if the cows died off and there
was no butter. It shows up around here at Christmas time. I've
been told by knowledgeable people Norwegians in the US eat lutefisk
and Norwegians in Norway eat frozen pizza.
I am not that familiar with that aspect of our neighbors, but I can
believe it. We have lutfisk (yes, we spell it without the E) and I
certainly don't care for it. Fortunately, very rarely has anyone
attempted to serve it to me.
They did have gjetost, which makes up for it. The stuff is dangerous though.
https://www.newsinenglish.no/2013/01/22/burning-brown-cheese-closes-tunnel/
The Ski Queen brand must not be the real thing. It doesn't burn.
On 1/12/26 04:49, The Natural Philosopher wrote:
On 11/01/2026 20:44, rbowman wrote:
On Sun, 11 Jan 2026 05:55:12 -0600, Harold Stevens wrote:
Greybeard quants like me operated on 3 simple maxims:
1. Anything that works is better than anything that doesn't.
2. If it ain't broke, don't fix it.
3. If it breaks, don't ignore it.
Those go way beyond programming...
Part of the 'philosophy of engineering'.
Perhaps the most fundamental one, after 'an engineer is someone who can
do for five bob what any damn fool can do for a quid' is
'In the construction of mechanisms, complexity should not be multiplied
beyond that necessary to achieve the defined objective'
Ockham's Laser...
Now if only computer people could follow this rule. Our rule seems to be "why not add just this one more feature"
Now, concerning the burst: AFAIK AI companies use investment money to
cover the cost of operation (in whole or in significant part). If there
is a burst, they will have to stop operating, literally closing
their datacenters. Basically only things that generate profits would
survive, plus possibly some research by companies that have other
sources of income and still want to continue research. But that would be
at a much lower scale than currently.
But there were attempts to replace a lot of professionals,
notably programmers. Examples indicate that "AI" can create
small, trivial pieces of code but does not really work for
bigger and more complex things. To be useful for programming, "AI"
and the way it is used must be significantly improved. It is possible
that slower, gradual improvement will lead to "useful AI".
But it is also possible that alternative approaches, currently
underfunded due to the AI race, will progress and be used instead
of "AI".
[...]
So, it looks like for general AI we are missing something
important. For applications, ANNs apparently struggle with
tasks that have easy algorithmic solutions. So the natural way
forward with applications seems to be via hybrid approaches.
But the AI crowd seems to prefer pure ANN solutions and tries
to brute-force problems using more compute power.
On 12 Jan 2026 04:10:10 GMT rbowman <bowman@montana.com> wrote:
They did have gjetost, which makes up for it. The stuff is dangerous
though.
https://www.newsinenglish.no/2013/01/22/burning-brown-cheese-closes-
tunnel/
The Ski Queen brand must not be the real thing. It doesn't burn.
Gosh, I'd forgotten about gjetost. Need to get some of that again.
On 11/01/2026 20:57, rbowman wrote:
On Sun, 11 Jan 2026 11:29:53 +0000, The Natural Philosopher wrote:
On 11/01/2026 05:39, rbowman wrote:
He tells a plausible story. In 'Collapse' Jared Diamond claims that
one of the reasons for the abandonment of Greenland along with
climate change was an irrational reluctance of the Norse to eat fish.
Excuse me? He bases that on the lack of fish bones in the middens.
I've never had it but I think the process of producing hákarl might
dissolve the bones.
They are probably so hungry they ate the bones as well..
We used to have fried smelts, fins, tail, and scales, usually without
the head. This isn't the best area for seafood but the only ones I've
seen in the market lately were marked as bait.
In the UK 'whitebait' are fried fish eaten whole...
On 1/11/26 19:19, c186282 wrote:
Alas without detailed records we may find old THINGS,
but what they MEANT, their context, is forever lost. That's just
half a view of 'history'.
"It's a ritual object."
On 11 Jan 2026 21:38:00 GMT Niklas Karlsson <nikke.karlsson@gmail.com>
wrote:
Butter, lots of butter. Big problem if the cows died off and there
was no butter. It shows up around here at Christmas time. I've been
told by knowledgeable people Norwegians in the US eat lutefisk and
Norwegians in Norway eat frozen pizza.
I am not that familiar with that aspect of our neighbors, but I can
believe it. We have lutfisk (yes, we spell it without the E) and I
certainly don't care for it. Fortunately, very rarely has anyone
attempted to serve it to me.
It's Considered Traditional among the older generations of Norwegian-
Americans, to the point where you can find it in the grocery store in
the northern Midwest. Have never tried it myself, but I've seen (and
smelled) it at family gatherings.
Now krumkake, *that's* a slice of the Old Country I can get behind.
rbowman <bowman@montana.com> writes:
On Mon, 12 Jan 2026 00:47:29 GMT, Scott Lurndal wrote:
http://linuxmafia.com/humour/power-of-lutefisk.html
The only good thing about lutefisk is that it is generally accompanied
by meatballs and mashed potatoes (and lefse).
It isn't that bad. That's not to say it's good.
It's blandly neutral.
He says as the gelatinous fishy slime slides down his throat :-)
We had it twice a year for decades. Yes, butter helps to mask
the flavor, but nothing masks the consistency (or lack thereof).
Dessert (Rommegrot) was good, if not particularly healthy:
https://www.cheaprecipeblog.com/2015/04/rommegrot-norwegian-cream-pudding/
If the code is not mine, I would use the compiler options instead.
Unless I got paid to maintain that code, then I would correct the code.
Now if only computer people could follow this rule. Our rule seems
to be "why not add just this one more feature"
To be fair, I'm sure a lot of computer people are doing this under
duress, being ordered by the marketroids (who have the ear of
management) to add yet another shiny thing.
If the code were mine, I would correct the code. Even back then, I
did not take the assumption that a function would return an integer
:-D
On Mon, 12 Jan 2026 17:03:19 GMT, Charlie Gibbs wrote:
To be fair, I'm sure a lot of computer people are doing this under
duress, being ordered by the marketroids (who have the ear of
management) to add yet another shiny thing.
Aren't you glad the Free Software world isn't driven by marketroids?
if they can only
get a fraction of their userbase to pay $200/mo. for a Magical Chatbot Friend, good freakin' luck squeezing any more blood from *that* turnip.
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
Aren't you glad the Free Software world isn't driven by marketroids?
It's not?
AI Overview
https://upload.wikimedia.org/wikipedia/commons/e/e2/Eric_S_Raymond_portrait.jpg
Eric S. Raymond (ESR), the well-known open-source advocate, began charging
speaking fees for corporate events in 1999 but waives fees for schools and
user groups; however, specific current fee amounts aren't publicly listed,
requiring direct contact with booking agents or his website, though general
estimates for similar speakers suggest fees could range from thousands to
tens of thousands depending on the event and his involvement.
On Mon, 12 Jan 2026 08:11:36 -0800, John Ames wrote:
On 11 Jan 2026 21:38:00 GMT Niklas Karlsson <nikke.karlsson@gmail.com>
wrote:
Butter, lots of butter. Big problem if the cows died off and there
was no butter. It shows up around here at Christmas time. I've been
told by knowledgeable people Norwegians in the US eat lutefisk and
Norwegians in Norway eat frozen pizza.
I am not that familiar with that aspect of our neighbors, but I can
believe it. We have lutfisk (yes, we spell it without the E) and I
certainly don't care for it. Fortunately, very rarely has anyone
attempted to serve it to me.
It's Considered Traditional among the older generations of Norwegian-
Americans, to the point where you can find it in the grocery store in
the northern Midwest. Have never tried it myself, but I've seen (and
smelled) it at family gatherings.
Now krumkake, *that's* a slice of the Old Country I can get behind.
It appears in the grocery stores here around Christmas.
https://www.sofn.com/norwegian_culture/recipe_box/baked_goods_breads_and_desserts/rosettes/
The local Sons of Norway lodge has a booth at the fair where they sell
'vikings' and rosettes. The rosettes are good. The vikings are deep
fried mystery meat on a stick, sort of like a corndog. They're okay.
The problem is there is usually a long line.
https://www.sofnmissoula.com/
A friend who was active in a Norway based church told me a lot of the Sons are really German. It's a nice clubhouse so why build your own when you
can invade Norway?
On 1/12/26 10:14, John Ames wrote:
if they can only
get a fraction of their userbase to pay $200/mo. for a Magical Chatbot
Friend, good freakin' luck squeezing any more blood from *that* turnip.
Make it a s*xbot, and all the incels will pay to imagine they have a girlfriend.
On Mon, 12 Jan 2026 07:45:11 -0700, Peter Flass wrote:
On 1/11/26 19:19, c186282 wrote:
Alas without detailed records we may find old THINGS, but what
they MEANT, their context, is forever lost. That's just
half a view of 'history'.
"It's a ritual object."
I've heard some fascinating explanations for the petroglyphs in the US
west. My personal explanation is the tribe sent bored teenagers up to
a lookout where, lacking cellphones, they chipped away at the rocks.
rbowman <bowman@montana.com> writes:
On Mon, 12 Jan 2026 07:45:11 -0700, Peter Flass wrote:
On 1/11/26 19:19, c186282 wrote:
Alas without detailed records we may find old THINGS, but what
they MEANT, their context, is forever lost. That's just
half a view of 'history'.
"It's a ritual object."
I've heard some fascinating explanations for the petroglyphs in the US
west. My personal explanation is the tribe sent bored teenagers up to a
lookout where, lacking cellphones, they chipped away at the rocks.
You may not be that far off. Have a read of _The Nature Of Paleolithic
Art_ (R. Dale Guthrie) - it's not short but if you're interested in
that sort of thing, it'd be time well spent.
https://press.uchicago.edu/Misc/Chicago/311260.html has a copy of the
introduction.
Oh, now *that* looks like a good read. Many thanks, will definitely
check it out.
On 1/12/26 11:45, rbowman wrote:
On Mon, 12 Jan 2026 08:11:36 -0800, John Ames wrote:
On 11 Jan 2026 21:38:00 GMT Niklas Karlsson <nikke.karlsson@gmail.com>
wrote:
Butter, lots of butter. Big problem if the cows died off and there
was no butter. It shows up around here at Christmas time. I've been
told by knowledgeable people Norwegians in the US eat lutefisk and
Norwegians in Norway eat frozen pizza.
I am not that familiar with that aspect of our neighbors, but I can
believe it. We have lutfisk (yes, we spell it without the E) and I
certainly don't care for it. Fortunately, very rarely has anyone
attempted to serve it to me.
It's Considered Traditional among the older generations of Norwegian-
Americans, to the point where you can find it in the grocery store in
the northern Midwest. Have never tried it myself, but I've seen (and
smelled) it at family gatherings.
Now krumkake, *that's* a slice of the Old Country I can get behind.
It appears in the grocery stores here around Christmas.
https://www.sofn.com/norwegian_culture/recipe_box/baked_goods_breads_and_desserts/rosettes/
The local Sons of Norway lodge has a booth at the fair where they sell
'vikings' and rosettes. The rosettes are good. The vikings are deep fried
mystery meat on a stick sort of like a corndog. They're okay. The problem
is there is usually a long line.
https://www.sofnmissoula.com/
A friend who was active in a Norway based church told me a lot of the
Sons
are really German. It's a nice clubhouse so why build your own when you
can invade Norway?
Garrison Keillor had a nice take on Norwegians vs. Germans in Lake
Wobegon.
On 12/01/2026 15:44, Scott Lurndal wrote:
He says as the gelatinous fishy slime slides down his throat :-)
We had it twice a year for decades. Yes, butter helps to mask
the flavor, but nothing masks the consistency (or lack thereof).
Oysters: "like swallowing someone else's cold snot"
On Sun, 11 Jan 2026 17:58:31 -0500, c186282 wrote:
Many stocks of fish are already depleted or nearly so,
and that's just at CURRENT levels of consumption. The "bounty of the
sea" is NOT unlimited, not at all.
Some of the species I see in the market would have been classified as cat food 60 years ago.
On 11/01/2026 21:35, rbowman wrote:
On Sun, 11 Jan 2026 11:26:50 +0000, The Natural Philosopher wrote:
On 11/01/2026 01:51, c186282 wrote:
Stick to my estimation that England perhaps ranks
as the "most invaded" country ever :-)
Yes, until 1066, after which it became the least.
Nothing like having a navy comprised of pirates.
And a merchant class comprised of pirates... Wasn't there a Monty Python
sketch about that?
Dunno. There is a Trumpian experiment ongoing to see exactly where that leads, of course...
In the end, we developed democracy. The amount of loot the war winners gained was always less than they spent on defeating the opposition.
Probably the USA will end up doing the same.
After having explored all the other alternatives.
Elizabeth I is quoted as saying 'war is such a chancy thing' or similar.
On Mon, 12 Jan 2026 07:52:46 -0700, Peter Flass wrote:
Now if only computer people could follow this rule. Our rule seems
to be "why not add just this one more feature"
We follow Einstein's rule: "things should be as complicated as they
need to be, but no more."
On 2026-01-12, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On Mon, 12 Jan 2026 07:52:46 -0700, Peter Flass wrote:
Now if only computer people could follow this rule. Our rule seems
to be "why not add just this one more feature"
We follow Einstein's rule: "things should be as complicated as they
need to be, but no more."
s/complicated/simple/
On 2026-01-12, rbowman <bowman@montana.com> wrote:
On Sun, 11 Jan 2026 17:58:31 -0500, c186282 wrote:
Many stocks of fish are already depleted or nearly so,
and that's just at CURRENT levels of consumption. The "bounty of the
sea" is NOT unlimited, not at all.
Some of the species I see in the market would have been classified as cat
food 60 years ago.
I've heard this described as "eating our way down the food chain".
On Tue, 13 Jan 2026 00:24:02 GMT, Charlie Gibbs wrote:
On 2026-01-12, Lawrence D'Oliveiro <ldo@nz.invalid> wrote:
On Mon, 12 Jan 2026 07:52:46 -0700, Peter Flass wrote:
Now if only computer people could follow this rule. Our rule seems
to be "why not add just this one more feature"
We follow Einstein's rule: "things should be as complicated as they
need to be, but no more."
s/complicated/simple/
Really?? The man who brought Riemann tensors into physics?
On Mon, 12 Jan 2026 17:03:19 GMT, Charlie Gibbs wrote:
To be fair, I'm sure a lot of computer people are doing this under
duress, being ordered by the marketroids (who have the ear of
management) to add yet another shiny thing.
Aren't you glad the Free Software world isn't driven by marketroids?
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
On Mon, 12 Jan 2026 17:03:19 GMT, Charlie Gibbs wrote:
To be fair, I'm sure a lot of computer people are doing this under
duress, being ordered by the marketroids (who have the ear of
management) to add yet another shiny thing.
Aren't you glad the Free Software world isn't driven by marketroids?
It's not?
AI Overview
https://upload.wikimedia.org/wikipedia/commons/e/e2/Eric_S_Raymond_portrait.jpg
Eric S. Raymond (ESR), the well-known open-source advocate, began
charging speaking fees for corporate events in 1999 but waives fees
for schools and user groups; however, specific current fee amounts
aren't publicly listed, requiring direct contact with booking agents
or his website, though general estimates for similar speakers suggest
fees could range from thousands to tens of thousands depending on the
event and his involvement.
On 2026-01-12, Scott Lurndal <scott@slp53.sl.home> wrote:
Lawrence D'Oliveiro <ldo@nz.invalid> writes:
Aren't you glad the Free Software world isn't driven by marketroids?
It's not?
AI Overview
https://upload.wikimedia.org/wikipedia/commons/e/e2/Eric_S_Raymond_portrait.jpg
Eric S. Raymond (ESR), the well-known open-source advocate, began
charging speaking fees for corporate events in 1999 but waives fees
for schools and user groups; however, specific current fee amounts
aren't publicly listed, requiring direct contact with booking agents
or his website, though general estimates for similar speakers
suggest fees could range from thousands to tens of thousands
depending on the event and his involvement.
Was ESR ever nearly as influential as he tried to make it look, though?
OK, so he got speaking fees (unclear how often or how much), but did he
have much effect on actual FOSS projects?
rbowman <bowman@montana.com> writes:
On Mon, 12 Jan 2026 08:11:36 -0800, John Ames wrote:
A friend who was active in a Norway based church told me a lot of the
Sons are really German. It's a nice clubhouse so why build your own
when you can invade Norway?
Small village where my father grew up had two churches. A Norwegian
Lutheran church and a German Lutheran church (ALC and Wisconsin Synod,
IIRC). Never the twain shall meet.
On Mon, 12 Jan 2026 19:52:39 GMT, Scott Lurndal wrote:
rbowman <bowman@montana.com> writes:
On Mon, 12 Jan 2026 08:11:36 -0800, John Ames wrote:
A friend who was active in a Norway based church told me a lot of the
Sons are really German. It's a nice clubhouse so why build your own
when you can invade Norway?
Small village where my father grew up had two churches. A Norwegian
Lutheran church and a German Lutheran church (ALC and Wisconsin Synod,
IIRC). Never the twain shall meet.
No kidding. I was interested in the food, not the theology, but
Immanuel Lutheran is ELCA. First Lutheran, about a mile away, is
Missouri Synod. I think the Missouri folks consider the ELCA folks to
be baby-raping, communistic apostates. Both the pastor and assistant
pastor at Immanuel are women and that's a non-starter for LCMS.
Put it into the trash - it'd attract ten species of roving animals
... that fish smell is infinitely attractive. Don't think the garbage
service would be very friendly to a 50 pound concrete brick on top of
my trash bin .........
rbowman <bowman@montana.com> writes:
On Mon, 12 Jan 2026 07:45:11 -0700, Peter Flass wrote:
On 1/11/26 19:19, c186282 wrote:
Alas without detailed records we may find old THINGS, but what
they MEANT, their context, is forever lost. That's just
half a view of 'history'.
"It's a ritual object."
I've heard some fascinating explanations for the petroglyphs in the US
west. My personal explanation is the tribe sent bored teenagers up to a
lookout where, lacking cellphones, they chipped away at the rocks.
You may not be that far off. Have a read of _The Nature Of Paleolithic
Art_ (R. Dale Guthrie) - it's not short but if you're interested in
that sort of thing, it'd be time well spent.
https://press.uchicago.edu/Misc/Chicago/311260.html has a copy of the
introduction.
Like the Norse graffiti at Maes Howe that says something like "Hagar's
wife is a good fuck".
Concerning graffiti, nothing changes...
I tend to agree ... most petroglyphs DO look like things bored
kiddies would scrawl. Lacking spray-paint, well, you use what you
have.
Do you think Microsoft would get involved with a GPL project?
How many other FOSS projects use the MIT, Apache, Zero Clause BSD, or
other permissive licenses?
"Abendessen" ("evening meal")ITYM "meal eaten whilst trying to figure out why the damn program keeps crashing"
On the other hand, this coming July 4 sounds like an appropriate time to
wind up the Great Experiment. Two hundred and fifty years to the day...
How many other FOSS projects use the MIT, Apache, Zero Clause BSD, or
other permissive licenses?
I don't know offhand, but I've always been under the impression the
licenses you mentioned are all relatively widespread.
AMAZING how TINY ideological diffs can be turned into MAJOR,
kill-'em-all rifts.
On 1/12/26 15:46, The Natural Philosopher wrote:
On 12/01/2026 15:44, Scott Lurndal wrote:
He says as the gelatinous fishy slime slides down his throat :-)
We had it twice a year for decades. Yes, butter helps to mask
the flavor, but nothing masks the consistency (or lack thereof).
Oysters: "like swallowing someone else's cold snot"
They're awful .....
Oh, are nothing but slimy nasty fish to be found
in the North Sea ???
On Mon, 12 Jan 2026 18:17:25 -0500, c186282 wrote:
I tend to agree ... most petroglyphs DO look like things bored
kiddies would scrawl. Lacking spray-paint, well, you use what you
have.
https://www.ancientartarchive.org/handprints-universal-symbol-humanity/
https://www.youtube.com/watch?v=4I49uteH-EA
I've had an informal interest in experimental archaeology. If you say to yourself "I'm here in this environment, how do I make a living?" some of
the theories of armchair archaeologists don't make sense.
The hard part is viewing the scene with fresh eyes. I know how to make a figure 4 trap or deadfall. Do I have to assume Ogg never figured it out?
I've ground corn with a mano and metate. Can I assume an early human
wouldn't have figured out that rubbing hard seeds between two rocks
made them easier to eat?
Perhaps, but it's so much _fun_ (if you're into that sort of thing).
On 2026-01-07 23:49, rbowman wrote:
On 2026-01-06 19:57, Charlie Gibbs wrote:
<snip>
It is possible that current C compilers signal many more problems than
back then, but not runtime errors.
gcc has become pickier. That isn't always a welcome thing when working
with legacy code and requires a search of the compiler options to get it
to shut up about such horrible heresies as assuming a function returns an
int.
If the code were mine, I would correct the code. Even back then, I did
not take the assumption that a function would return an integer :-D
I wrote explicit prototypes in the header file. :-)
If the code is not mine, I would use the compiler options instead.
Unless I got paid to maintain that code, then I would correct the code.
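(A minimal sketch of the trap being described, with made-up names, in
case anyone hasn't hit it: with no declaration in scope, a K&R/C89
compiler assumed the function returned int, silently mangling anything
wider. The explicit prototype is the fix.)

    #include <stdio.h>

    /* Explicit prototype -- normally kept in a header file. Without
       it, an old compiler assumes scale() returns int, truncating
       the result wherever int is narrower than long. */
    long scale(long value, int factor);

    int main(void)
    {
        printf("%ld\n", scale(100000L, 3));  /* prints 300000 */
        return 0;
    }

    long scale(long value, int factor)
    {
        return value * factor;
    }

For legacy code you don't own, something like

    gcc -std=gnu89 -Wno-implicit-function-declaration legacy.c

quiets the nagging instead, though exactly which of these warnings
have been promoted to hard errors varies by gcc release.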
On 13 Jan 2026 06:31:43 GMT, Niklas Karlsson wrote:
How many other FOSS projects use the MIT, Apache, Zero Clause BSD, or
other permissive licenses?
I don't know offhand, but I've always been under the impression the
licenses you mentioned are all relatively widespread.
Precisely. Raymond's argument was that restrictive licenses would
deter FOSS development.
On Mon, 12 Jan 2026 22:58:32 -0500, c186282 wrote:
Put it into the trash - it'd attract ten species of roving animals
... that fish smell is infinitely attractive. Don't think the garbage
service would be very friendly to a 50 pound concrete brick on top of
my trash bin .........
You do realize there is water-packed tuna? And it's gone long before
the trash panda gets wind of it. I do get sardines in oil and after I
get the fish out, the can goes on the deck. Not as popular as tuna
juice but community cats will eat almost anything.
Except Blue Buffalo. The cats wouldn't eat it. The raccoon wouldn't eat
it. The skunk managed to choke it down.