• Undefined behaviour in C23

    From Martin Ward@mwardgkc@gmail.com to comp.compilers on Wed Aug 20 14:06:45 2025

    In the SEI CERT C Coding Standard we read:

    "According to the C Standard, Annex J, J.2 [ISO/IEC 9899:2024],
    the behavior of a program is undefined in the circumstances outlined
    in the following table."

    The table has 221 numbered cases and can be found here:

    <https://wiki.sei.cmu.edu/confluence/display/c/CC.%2BUndefined%2BBehavior>

    According to the C Standard Committee (paraphrasing) "You may eat
    from any tree in the garden of coding, except for any of the 221
    trees of undefined behaviour. If you eat from any of the 221 trees
    of undefined behaviour your program may die, either immediately or at
    some unspecified time in the future, or may do absolutely anything at
    any future time. You must study the Book of the Knowledge of Defined
    and Undefined (the 758 page C23 standard document) to learn exactly
    how to recognise each of the 221 trees of undefined behaviour.
    Please pay the cashier $250.00 to purchase a copy of the Book
    of the Knowledge of Defined and Undefined".
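
    To make one of those trees concrete, here is a small generic example
    (not taken from the CERT table's own text) of the entry "An object is
    referred to outside of its lifetime":

        #include <stdio.h>

        static int *dangling(void)
        {
            int local = 42;
            return &local;        /* address of an object with automatic storage */
        }

        int main(void)
        {
            int *p = dangling();  /* local's lifetime has already ended here */
            printf("%d\n", *p);   /* undefined behaviour: one of the 221 trees */
            return 0;
        }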


    --
    Martin

    Dr Martin Ward | Email: martin@gkc.org.uk | <http://www.gkc.org.uk>
    G.K.Chesterton site: <http://www.gkc.org.uk/gkc> | Erdos number: 4
    [When a language is 50 years old and there is a mountain of legacy code that they really don't want to break, it accumulates a lot of cruft. If we were starting now we'd get something more like Go.

    On the other hand, there's the python approach in which they deprecate and remove little used and crufty features, but old python code doesn't work any more unless you go back and update it every year or two. -John]
  • From Kaz Kylheku@643-408-1753@kylheku.com to comp.compilers on Wed Aug 20 18:33:36 2025

    On 2025-08-20, Martin Ward <mwardgkc@gmail.com> wrote:
    In the SEI CERT C Coding Standard we read:

    "According to the C Standard, Annex J, J.2 [ISO/IEC 9899:2024],
    the behavior of a program is undefined in the circumstances outlined
    in the following table."

    The table has 221 numbered cases and can be found here:

    <https://wiki.sei.cmu.edu/confluence/display/c/CC.%2BUndefined%2BBehavior>

    According to the C Standard Committee (paraphrasing) "You may eat
    from any tree in the garden of coding, except for any of the 221
    trees of undefined behaviour. If you eat from any of the 221 trees
    of undefined behaviour your program may die, either immediately or at
    some unspecified time in the future, or may do absolutely anything at
    any future time. You must study the Book of the Knowledge of Defined
    and Undefined (the 758 page C23 standard document) to learn exactly
    how to recognise each of the 221 trees of undefined behaviour.
    Please pay the cashier $250.00 to purchase a copy of the Book
    of the Knowledge of Defined and Undefined".

    The list is incomplete.

    For instance, the behavior is undefined when translation units are linked into a
    program and contain the use of an external name that is neither defined by the standard, nor by those translation units.

    (Actual behavior may range from diagnosing an unresolved reference, to resolving
    it to something in the implementation for which no requirements are given in the
    standard.)
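
    A minimal sketch of that situation ("frobnicate" is a hypothetical name,
    defined neither by the standard nor by any translation unit of the
    program):

        extern int frobnicate(int);   /* declared, but defined nowhere in the program */

        int main(void)
        {
            return frobnicate(42);    /* once linked, ISO C imposes no requirements */
        }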

    The behavior is undefined when an #include directive is processed that resolves neither to a standard header, nor to any of the files presented for processing by the implementation.

    (Actual behavior may be that the header doesn't resolve to anything, which is a constraint violation. Or it may resolve to something in the implementation which
    replaces it with a sequence of tokens, for the nature of which ISO C imposes no requirements.)
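
    A small sketch of how a program can at least detect that situation:
    C23's __has_include tests whether a header resolves to anything before
    it is included (what the header expands to afterwards is still outside
    ISO C's requirements):

        #if __has_include(<windows.h>)      /* C23; long available as an extension */
        #  include <windows.h>
        #  define HAVE_WINDOWS_H 1
        #endif

        int main(void)
        {
        #if defined(HAVE_WINDOWS_H)
            /* platform-specific calls could go here */
        #endif
            return 0;
        }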

    Any situation in the standard whereby we are not able to deduce the requirements
    for a given program or construct is undefined behavior.

    #include <windows.h> is undefined behavior.

    fdopen(0, "r") is undefined behavior.

    We can make a C implementation in which #include <windows.h> recursively deletes the current directory, right there at compile time, and that implementation couldn't be called nonconforming because of that. The #include <windows.h> construct can have all the consequences that are given in the definition of undefined behavior: terminating translation or execution with or without a diagnostic message, behaving unpredictably, or in a documented manner characteristic of the implementation.

    All platform specific headers and functions are effectively documented extensions replacing undefined behavior, which another implementation could neglect to define, or define arbitrarily (including in evil ways).

    Once you grok the fact that almost all real work in C takes place via undefined behavior (very few programs are maximally portable and strictly conforming) you stop sweating it.

    --
    TXR Programming Language: http://nongnu.org/txr
    Cygnal: Cygwin Native Application Library: http://kylheku.com/cygnal
    Mastodon: @Kazinator@mstdn.ca
  • From anton@anton@mips.complang.tuwien.ac.at to comp.compilers on Thu Aug 21 05:44:32 2025

    Martin Ward <mwardgkc@gmail.com> writes:
    [actually, John Levine writes:]
    [When a language is 50 years old and there is a mountain of legacy code that they really don't want to break, it accumulates a lot of cruft.

    But there is a very vocal group of people who argue that programs that
    exercise undefined behaviour are already broken (and they often use
    stronger words than that) and that compilers are allowed to (and
    should) compile them to code that behaves differently than earlier
    compilers that the new compiler supposedly is just a new version of.

    So according to this argument, when something that the legacy code
    does is declared as undefined behaviour, this breaks the program.

    And the practice is that the people in C compiler maintenance reject
    bug reports as RESOLVED INVALID when the code exercises undefined
    behaviour, even when the code works as intended in earlier versions of
    the compiler and when the breakage could be easily fixed (e.g., for <https://gcc.gnu.org/bugzilla/show_bug.cgi?id=66804> and <https://gcc.gnu.org/bugzilla/show_bug.cgi?id=65709> by using movdqu
    instead of movdqa).
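
    For readers who have not opened those reports, the general pattern at
    issue looks roughly like the sketch below (a reconstruction of the
    pattern, not the reporters' code): a buffer is read through a wider
    pointer type, which is undefined behaviour if the address is misaligned,
    and which breaks visibly once the vectorizer emits an alignment-requiring
    instruction such as movdqa.

        #include <stdint.h>

        /* Undefined behaviour if p is not suitably aligned for uint32_t; on x86
           it usually "works" until the compiler uses an aligned vector load.
           A memcpy-based load is the defined alternative. */
        uint32_t read_u32(const unsigned char *p)
        {
            return *(const uint32_t *)p;
        }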

    But they do not always do so: The SATD function from the SPEC benchmark 464.h264ref exercises undefined behaviour, and a pre-release version
    of gcc-4.8 generated code that did not behave as intended. The
    release version of gcc-4.8 compiled 464.h264ref as intended (but later
    a similar case that was not in a SPEC program <https://gcc.gnu.org/bugzilla/show_bug.cgi?id=66875> was rejected as
    RESOLVED INVALID). When I brought this up, the reactions ranged from
    flat-out denial that it ever happened (despite it being widely
    publicized <https://lwn.net/Articles/544123/>) through a claim that
    the "optimization" turned out to have no benefit (and yet the similar
    case mentioned above still was "optimized" in a later gcc version) to
    a statement along the lines that 464.h264ref is a relevant benchmark.

    The last reaction seems to be the most plausible to me. The people
    working on the optimizers tend to evaluate their performance on a
    number of benchmarks, i.e., "relevant benchmarks", and of course these benchmarks must be compiled as intended, so that's what happens. My
    guess is that "relevant benchmarks" are industry standard benchmarks
    like SPEC, but also programs coming from paying customers.

    They also have their test suites of programs for regression testing,
    and any behavioural change in these programs that is visible in this
    regression testing probably leads to applying the optimization in a
    less aggressive way.

    How do tests get added into the regression test suite? Ideally, if
    somebody reports a case where a program behaves in one way in an
    earlier version of the same compiler and differently in a later
    version, that program and its original behaviour should usually be
    added to the test suite <https://www.complang.tuwien.ac.at/papers/ertl17kps.pdf>, but in gcc
    this does not happen (see the bug reports linked to above).
    Apparently gcc has some other criteria for adding programs to the test
    suite.

    So, is C still usable when you do not maintain one of those programs
    that are considered to be relevant by C compiler maintainers? My
    experience is that the amount of breakage for the code I maintain has
    been almost non-existent in the last 15 years. A big part of that is
    that we use lots of flags to tell the compiler that certain behaviour
    is defined even if the C standard does not define it. Currently we
    try the following flags with the versions of gcc or clang that support
    them:

    -fno-gcse -fcaller-saves -fno-defer-pop -fno-inline -fwrapv
    -funsigned-char -fno-strict-aliasing -fno-cse-follow-jumps
    -fno-reorder-blocks -fno-reorder-blocks-and-partition
    -fno-toplevel-reorder -fno-trigraphs -falign-labels=1 -falign-loops=1 -falign-jumps=1 -fno-delete-null-pointer-checks -fcf-protection=none -fno-tree-vectorize -mllvm=--tail-dup-indirect-size=0

    Some of these flags just disable certain transformations; in those
    cases there is no flag that defines the language in the way our program
    relies on, only a flag that disables the optimization which transforms
    the code contrary to our intentions. In other cases, in particular
    -fno-tree-vectorize, using the flag just avoids slowdowns from the
    "optimization".
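
    As a small illustration of the difference such a flag makes (a generic
    sketch, not code from our programs): with -fwrapv the signed overflow
    below is defined to wrap, so the comparison means what it says; without
    it, the overflow is undefined behaviour and the compiler may fold the
    test to "always true".

        #include <limits.h>
        #include <stdio.h>

        int main(void)
        {
            int x = INT_MAX;
            if (x + 1 > x)          /* UB in ISO C; defined (wraps) with -fwrapv */
                puts("no overflow observed");
            else
                puts("wrapped around");
            return 0;
        }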

    Another big part of the lack of breakage experience is probably the
    code in the regression tests of the compiler, whatever the criteria
    are used for including this code. I.e., our code rides in the
    slipstream of this code.

    On the other hand, there's the python approach in which they deprecate and remove little used and crufty features, but old python code doesn't work any more unless you go back and update it every year or two. -John]

    Is it so bad with Python? From what I read, after the huge problems
    that Python had with migrating the existing code base from Python2 to
    Python3 (where Python3 was intentionally not backwards compatible with Python2), they had decided not to make backwards-incompatible changes
    to the language in the future.

    - anton
    --
    M. Anton Ertl
    anton@mips.complang.tuwien.ac.at
    http://www.complang.tuwien.ac.at/anton/
    [python may say they're not making backwards-incompatible changes
    but 3.13 will delete some modules I use from the standard library.
    It's not a huge problem to find copies of them and install them
    locally, but if I don't, my scripts will break. -John]
  • From David Brown@david.brown@hesbynett.no to comp.compilers on Thu Aug 21 15:02:23 2025

    On 20/08/2025 20:33, Kaz Kylheku wrote:
    On 2025-08-20, Martin Ward <mwardgkc@gmail.com> wrote:
    In the SEI CERT C Coding Standard we read:

    "According to the C Standard, Annex J, J.2 [ISO/IEC 9899:2024],
    the behavior of a program is undefined in the circumstances outlined
    in the following table."

    The table has 221 numbered cases and can be found here:

    <https://wiki.sei.cmu.edu/confluence/display/c/CC.%2BUndefined%2BBehavior>
    According to the C Standard Committee (paraphrasing) "You may eat
    from any tree in the garden of coding, except for any of the 221
    trees of undefined behaviour. If you eat from any of the 221 trees
    of undefined behaviour your program may die, either immediately or at
    some unspecified time in the future, or may do absolutely anything at
    any future time. You must study the Book of the Knowledge of Defined
    and Undefined (the 758 page C23 standard document) to learn exactly
    how to recognise each of the 221 trees of undefined behaviour.
    Please pay the cashier $250.00 to purchase a copy of the Book
    of the Knowledge of Defined and Undefined".

    The list is incomplete.


    Under "4. Conformance", the C standards says :

    """
    If a "shall" or "shall not" requirement that appears outside of a
    constraint or runtime-constraint is violated, the behavior is undefined.
    Undefined behaviour is otherwise indicated in this International
    Standard by the words "undefined behavior" or by the omission of any
    explicit definition of behavior. There is no difference in emphasis
    among these three; they all describe "behavior that is undefined".
    """

    So no list could ever be complete here, since anything whose behaviour
    is not defined in the C standards is undefined behaviour. I have
    always found that slightly at odds with the definition under "3.
    Terms, definitions, and symbols" of "behavior, upon use of a
    nonportable or erroneous program construct or of erroneous data, for
    which this International Standard imposes no requirements". In my
    mind, things like externally defined functions (used correctly) could
    be considered UB by the section 4 definitions but not by the section 3 definitions.


    All platform specific headers and functions are effectively documented extensions replacing undefined behavior, which another implementation could neglect to define, or define arbitrarily (including in evil ways).

    Once you grok the fact that almost all real work in C takes place via undefined behavior (very few programs are maximally portable and strictly conforming) you
    stop sweating it.


    Yes, indeed - though that is the "section 4" type of UB rather than
    the "section 3" definition - erroneous code and data should of course
    be avoided at all times.


    People who think UB is a bad thing should spend a little time thinking
    about the phrase "garbage in, garbage out". If you write nonsensical
    code, or give sensible code nonsensical data, you can't expect sensible results. This has been known since the dawn of programmable computers
    (and comes from mathematics and the domains of functions):

    """
    On two occasions I have been asked, "Pray, Mr. Babbage, if you put
    into the machine wrong figures, will the right answers come out?" ... I
    am not able rightly to apprehend the kind of confusion of ideas that
    could provoke such a question.
    """


    UB (both definitions) is an essential part of all programming languages
    - after all, if you have a bug in your code, you have UB, and no
    programming language has made it impossible to write bugs in your code.
    C just has some things that are undefined in C but defined in some other languages, and it is a bit more open and honest about UB than many
    language definitions.
  • From Martin Ward@mwardgkc@gmail.com to comp.compilers on Thu Aug 21 15:11:08 2025

    On 20/08/2025 14:06, John wrote:

    When a language is 50 years old and there is a mountain of legacy code
    that they really don't want to break, it accumulates a lot of cruft. If
    we were starting now we'd get something more like Go.

    On the other hand, there's the python approach in which they deprecate
    and remove little used and crufty features, but old python code doesn't
    work any more unless you go back and update it every year or two. -John]



    Legacy behaviour and undefined behaviour are orthogonal concepts.


    Any language ought to have fully defined behaviour: even if the
    definition is simply "syntax error" or "the program exits with a
    suitable error message". A language can be extended with new
    constructs and new functions: then the behaviour changes from "syntax
    error" or "error message" to the new functionality.

    Whether or not old behaviour is preserved is a completely
    separate issue.

    --
    Martin

    Dr Martin Ward | Email: martin@gkc.org.uk | <http://www.gkc.org.uk>
    G.K.Chesterton site: <http://www.gkc.org.uk/gkc> | Erdos number: 4
  • From Keith Thompson@Keith.S.Thompson+u@gmail.com to comp.compilers on Thu Aug 21 12:53:49 2025

    David Brown <david.brown@hesbynett.no> writes:
    [...]
    Under "4. Conformance", the C standards says :

    """
    If a "shall" or "shall not" requirement that appears outside of a
    constraint or runtime-constraint is violated, the behavior is undefined.
    Undefined behaviour is otherwise indicated in this International
    Standard by the words "undefined behavior" or by the omission of any
    explicit definition of behavior. There is no difference in emphasis
    among these three; they all describe "behavior that is undefined".
    """

    So no list could ever be complete here, since anything whose behaviour
    is not defined in the C standards is undefined behaviour. I have
    always found that slightly at odds with the definition under "3.
    Terms, definitions, and symbols" of "behavior, upon use of a
    nonportable or erroneous program construct or of erroneous data, for
    which this International Standard imposes no requirements". In my
    mind, things like externally defined functions (used correctly) could
    be considered UB by the section 4 definitions but not by the section 3 definitions.

    I don't see an inconsistency.

    A C program that includes a non-standard header that's not part of
    the program (e.g., `#include <windows.h>`) and calls a function
    declared in that header has undefined behavior as far as the C
    standard is concerned. The program could be compiled in a conforming environment that has its own <windows.h> header with a declaration
    for a different implementation of the same name.

    That's undefined behavior under both the section 3 definition (use
    of a nonportable program construct) and the section 4 definition
    (the omission of any explicit definition of behavior).

    [...]

    UB (both definitions) is an essential part of all programming languages
    - after all, if you have a bug in your code, you have UB, and no
    programming language has made it impossible to write bugs in your code.
    C just has some things that are undefined in C but defined in some other languages, and it is a bit more open and honest about UB than many
    language definitions.

    No, a bug in your code is not necessarily undefined behavior. It could
    easily be code whose behavior is well defined by the language standard,
    but that behavior isn't what the programmer intended.

    --
    Keith Thompson (The_Other_Keith) Keith.S.Thompson+u@gmail.com
    void Void(void) { Void(); } /* The recursive call of the void */
  • From David Brown@david.brown@hesbynett.no to comp.compilers on Fri Aug 22 17:58:01 2025

    On 21/08/2025 21:53, Keith Thompson wrote:
    David Brown <david.brown@hesbynett.no> writes:
    [...]
    Under "4. Conformance", the C standards says :

    """
    If a "shall" or "shall not" requirement that appears outside of a
    constraint or runtime-constraint is violated, the behavior is undefined.
    Undefined behaviour is otherwise indicated in this International
    Standard by the words "undefined behavior" or by the omission of any
    explicit definition of behavior. There is no difference in emphasis
    among these three; they all describe "behavior that is undefined".
    """

    So no list could ever be complete here, since anything whose behaviour
    is not defined in the C standards is undefined behaviour. I have
    always found that slightly at odds with the definition under "3.
    Terms, definitions, and symbols" of "behavior, upon use of a
    nonportable or erroneous program construct or of erroneous data, for
    which this International Standard imposes no requirements". In my
    mind, things like externally defined functions (used correctly) could
    be considered UB by the section 4 definitions but not by the section 3
    definitions.

    I don't see an inconsistency.

    A C program that includes a non-standard header that's not part of
    the program (e.g., `#include <windows.h>`) and calls a function
    declared in that header has undefined behavior as far as the C
    standard is concerned. The program could be compiled in a conforming environment that has its own <windows.h> header with a declaration
    for a different implementation of the same name.

    That's undefined behavior under both the section 3 definition (use
    of a nonportable program construct) and the section 4 definition
    (the omission of any explicit definition of behavior).

    [...]

    If you declare and call a function "foo" that is written in fully
    portable C code, but not part of the current translation unit being
    compiled (perhaps it has been separately compiled or included in a
    library), then it would be UB by the section 4 definition (since the C standards don't say anything about what "foo" does, nor does your code).

    But the code that calls "foo" is portable and not erroneous, so it is
    not UB by the section 3 definition.

    Perhaps it is possible to argue that it is UB by both definitions,
    or defined behaviour by both definitions - with enough pushing and
    squeezing of how the words of the standard are interpreted. But it is,
    at best, unclear and poorly explained - even if you don't agree that it
    is inconsistent.

    Add to that, the C standard has a specific term for features that are non-portable but not undefined behaviour - "implementation-defined
    behaviour". Code that relies on "int" being 32-bit is not portable, but
    it is not UB when compiled on implementations for which "int" /is/ 32-bit.

    As I see it, there are (at least) two significantly different types of
    UB in C, by the definitions in the standard. If you call an external non-portable function "foo" in your code, the compiler assumes that
    doing so does not result in the execution of UB - despite that being
    "type 4" UB. On the other hand, if your code dereferences a pointer,
    the compiler can assume that the pointer is not invalid - because
    dereferencing an invalid pointer would be "type 3" UB.
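
    A small sketch of that contrast (the names are illustrative, not from
    any real code):

        extern void bar(void);      /* "type 4": no definition visible to ISO C */

        int touch(int *p)
        {
            bar();                  /* the compiler assumes this does not invoke UB */
            return *p;              /* "type 3": the compiler may assume p is valid */
        }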

    UB (both definitions) is an essential part of all programming languages
    - after all, if you have a bug in your code, you have UB, and no
    programming language has made it impossible to write bugs in your code.
    C just has some things that are undefined in C but defined in some other
    languages, and it is a bit more open and honest about UB than many
    language definitions.

    No, a bug in your code is not necessarily undefined behavior. It could easily be code whose behavior is well defined by the language standard,
    but that behavior isn't what the programmer intended.


    When I write code, /I/ define what the behaviour of the code should be.
    A bug in the code means it is not acting according to my definitions -
    it is UB. It may still be acting according to the definitions of the C abstract machine given in the C standards (you are correct there). Even
    if it has C-standard UB, it will still be acting according to the
    definitions of the target machine's instruction set. Behaviour is
    defined on multiple levels, only one of which is the C standard.
  • From David Brown@david.brown@hesbynett.no to comp.compilers on Fri Aug 22 18:42:01 2025

    On 21/08/2025 07:44, anton@mips.complang.tuwien.ac.at wrote:
    Martin Ward <mwardgkc@gmail.com> writes:
    [actually, John Levine writes:]
    [When a language is 50 years old and there is a mountain of legacy code that they really don't want to break, it accumulates a lot of cruft.

    But there is a very vocal group of people who argue that programs that exercise undefined behaviour are already broken (and they often use
    stronger words than that) and that compilers are allowed to (and
    should) compile them to code that behaves differently than earlier
    compilers that the new compiler supposedly is just a new version of.

    Yes.

    It is good that compilers often support ways to get the "old" behaviour
    if the user wants. But new compiler versions should not be held back by
    the limitations of old compilers - that would stifle progress. Imagine
    if car manufacturers had to limit the speeds of new cars to 10 miles per
    hour, because some drivers a century ago assumed that they could safely
    put their foot flat on the accelerator without hitting the horse and
    cart in front of them.

    And also remember that broken code is not necessarily useless code. For programs of reasonable size, very few are completely bug-free. And yet
    we still manage to use them if they are good enough, despite being
    imperfect.


    So according to this argument, when something that the legacy code
    does is declared as undefined behaviour, this breaks the program.

    And the practice is that the people in C compiler maintenance reject
    bug reports as RESOLVED INVALID when the code exercises undefined
    behaviour, even when the code works as intended in earlier versions of
    the compiler and when the breakage could be easily fixed (e.g., for <https://gcc.gnu.org/bugzilla/show_bug.cgi?id=66804> and <https://gcc.gnu.org/bugzilla/show_bug.cgi?id=65709> by using movdqu
    instead of movdqa).


    I don't see any problem with these being marked as "resolved invalid".

    There is definitely a challenge in writing C code that is maximally
    efficient on a wide variety of compilers - old and new, powerfully
    optimising and weakly optimising, and for different target
    architectures. C code can't always be written in an ideal and fully
    portable way. This can be handled by using abstractions, compiler
    detection and conditional compilation for things like block copies - use unaligned non-conforming large moves if you know it is safe on a
    particular implementation, and fall back to safe but possibly slow
    memcpy() (or memmove()) in general. That kind of solution, of course,
    has its own disadvantages in development time, code complexity, testing,
    etc. The C programming world is not perfect.
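
    A rough sketch of that kind of abstraction (FAST_UNALIGNED_LOADS is a
    made-up detection macro standing in for whatever compiler or target
    check a project actually uses):

        #include <stdint.h>
        #include <string.h>

        static inline uint64_t load_u64(const void *p)
        {
        #if defined(FAST_UNALIGNED_LOADS)   /* only if the implementation guarantees it */
            return *(const uint64_t *)p;
        #else
            uint64_t v;
            memcpy(&v, p, sizeof v);        /* portable; defined for any alignment */
            return v;
        #endif
        }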

    But the solution is certainly /not/ to say that everyone writing correct
    C code and compiling with high optimisations should get slower results
    because someone else previously wrote code that made unwarranted and
    unchecked assumptions about particular compilers and particular target processors.

    But they do not always do so: The SATD function from the SPEC benchmark 464.h264ref exercises undefined behaviour, and a pre-release version
    of gcc-4.8 generated code that did not behave as intended. The
    release version of gcc-4.8 compiled 464.h264ref as intended (but later
    a similar case that was not in a SPEC program <https://gcc.gnu.org/bugzilla/show_bug.cgi?id=66875> was rejected as
    RESOLVED INVALID).

    So the gcc developers made an exception for a particularly important,
    useful and common case? Again that doesn't sound unreasonable to me.
    Sometimes there are trade-offs - some code is more important than other
    code. The C compiler development world is not perfect either.

    When I brought this up, the reactions ranged from
    flat-out denial that it ever happened (despite it being widely
    publicized <https://lwn.net/Articles/544123/>) through a claim that
    the "optimization" turned out to have no benefit (and yet the similar
    case mentioned above still was "optimized" in a later gcc version) to
    a statement along the lines that 464.h264ref is a relevant benchmark.


    Maybe this particular case was handled badly, or at least the
    communications involved were bad. It was over a decade ago, in a
    pre-release candidate compiler. (Pre-release candidates are used
    precisely to check if changes cause trouble with real-world code.) How
    long are you going to hold a grudge about this?

    The last reaction seems to be the most plausible to me. The people
    working on the optimizers tend to evaluate their performance on a
    number of benchmarks, i.e., "relevant benchmarks", and of course these benchmarks must be compiled as intended, so that's what happens. My
    guess is that "relevant benchmarks" are industry standard benchmarks
    like SPEC, but also programs coming from paying customers.


    For gcc, a typical benchmark is the Linux kernel and a wide selection of
    open source programs - people do full rebuilds of whole Linux
    distributions as part of testing before full compiler releases. SPEC is
    a special case, because these programs are used on a wide variety of C compilers for comparison between toolchains, not just for one compiler.

    They also have their test suites of programs for regression testing,
    and any behavioural change in these programs that is visible in this regression testing probably leads to applying the optimization in a
    less aggressive way.


    I would assume that they try to avoid UB in their test suite code
    (though of course gcc developers can have bugs and mistakes like anyone
    else). Sometimes test suite code is fixed when new bugs are found in it.

    How do tests get added into the regression test suite? Ideally, if
    somebody reports a case where a program behaves in one way in an
    earlier version of the same compiler and differently in a later
    version, that program and its original behaviour should usually be
    added to the test suite <https://www.complang.tuwien.ac.at/papers/ertl17kps.pdf>, but in gcc
    this does not happen (see the bug reports linked to above).

    In what bizarre world would that be "ideal" ?

    If you want gcc 4.8 without tree optimisations, you can get it. If you
    want to use gcc 15 and not enable tree optimisations, that's fine too.
    If you want to write code that can be highly optimised with automatic generation of vector instructions that only work on aligned data, don't
    faff around going out of your way to write bad C code that messes with
    pointer types to create unaligned accesses. It's your choice. Let the
    rest of us that get our data alignment correct (and you get that
    naturally in C - you only get the UB if you've played silly buggers with pointer casts to write "smart" code) get faster results.

    Examples can be added to the test suite if they are useful. They can be
    added to test new features - there is no point in a test to see if the
    compiler generates code that matches an old compiler version unless
    there is a specific new feature flag to give defined semantics matching
    the old behaviour. (An example of that would be the "-fno-delete-null-pointer-checks" flag.)
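
    A small sketch of what that particular flag is about (illustrative code,
    not from any test suite): because dereferencing a null pointer is
    undefined behaviour, the compiler may infer from the first statement
    that "p" is non-null and delete the later check; the flag keeps it.

        #include <stdio.h>

        void report(int *p)
        {
            int v = *p;            /* lets the compiler infer p != NULL */
            if (p == NULL)         /* ...so it may delete this whole branch */
                puts("null pointer");
            else
                printf("%d\n", v);
        }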


    Apparently gcc has some other criteria for adding programs to the test
    suite.

    So, is C still usable when you do not maintain one of those programs
    that are considered to be relevant by C compiler maintainers? My
    experience is that the amount of breakage for the code I maintain has
    been almost non-existent in the last 15 years. A big part of that is
    that we use lots of flags to tell the compiler that certain behaviour
    is defined even if the C standard does not define it.

    That sounds like you have a solution to your problem.

    Currently we
    try the following flags with the versions of gcc or clang that support
    them:

    -fno-gcse -fcaller-saves -fno-defer-pop -fno-inline -fwrapv
    -funsigned-char -fno-strict-aliasing -fno-cse-follow-jumps -fno-reorder-blocks -fno-reorder-blocks-and-partition
    -fno-toplevel-reorder -fno-trigraphs -falign-labels=1 -falign-loops=1 -falign-jumps=1 -fno-delete-null-pointer-checks -fcf-protection=none -fno-tree-vectorize -mllvm=--tail-dup-indirect-size=0


    That sounds like you have code that uses a great deal of UB and relies
    on a wide range of very specific code generation and semantics that are
    not defined anywhere, in the C standards or compiler documentation.
    Maybe that's what you need for your projects, and if it works for you,
    fine. But it is a very unusual situation, and cannot be extrapolated to
    a non-negligible amount of C code. (Some flags, such as "-fwrapv" and "-fno-strict-aliasing", are needed to counter unwarranted assumptions in
    a more significant body of existing C code.)

    Some of these flags just disable certain transformations; in those
    cases there is no flag that defines the language in the way our program
    relies on, only a flag that disables the optimization which transforms
    the code contrary to our intentions. In other cases, in particular
    -fno-tree-vectorize, using the flag just avoids slowdowns from the
    "optimization".

    You know better than the solid majority of programmers that
    "optimisation" is as much an art as a science, and that getting the best
    from a combination of code, compiler and target processor is no simple
    task. Compilers enable optimisations in groups (-O1, -O2, -O3, etc.)
    based on what usually gives better results for a range of code bases and
    a range of target devices - there are no guarantees for any particular combination.


    Another big part of the lack of breakage experience is probably the
    code in the regression tests of the compiler, whatever the criteria
    are used for including this code. I.e., our code rides in the
    slipstream of this code.

    On the other hand, there's the python approach in which they deprecate and remove little used and crufty features, but old python code doesn't work any more unless you go back and update it every year or two. -John]

    Is it so bad with Python? From what I read, after the huge problems
    that Python had with migrating the existing code base from Python2 to
    Python3 (where Python3 was intentionally not backwards compatible with Python2), they had decided not to make backwards-incompatible changes
    to the language in the future.


    IME there are only sometimes issues with new Python versions, but the
    Python 2 to 3 incompatibilities are still a widespread problem.
  • From anton@anton@mips.complang.tuwien.ac.at to comp.compilers on Fri Aug 22 17:16:48 2025

    David Brown <david.brown@hesbynett.no> writes:
    On 21/08/2025 21:53, Keith Thompson wrote:
    David Brown <david.brown@hesbynett.no> writes:
    UB (both definitions) is an essential part of all programming languages
    - after all, if you have a bug in your code, you have UB, and no
    programming language has made it impossible to write bugs in your code.
    C just has some things that are undefined in C but defined in some other languages, and it is a bit more open and honest about UB than many
    language definitions.

    No, a bug in your code is not necessarily undefined behavior. It could
    easily be code whose behavior is well defined by the language standard,
    but that behavior isn't what the programmer intended.


    When I write code, /I/ define what the behaviour of the code should be.
    A bug in the code means it is not acting according to my definitions -
    it is UB.

    Yes, Humpty Dumpty. Meanwhile, the rest of the world says that a
    program exercises undefined behaviour when the *programming language* specification does not define what the program does (or explicitly
    undefines it), and calls programs incorrect (or, colloquially, buggy)
    that do not behave as the *program* specification or requirements
    demand; a buggy program may be defined according to the programming
    language specification; e.g., I believe that the following program is
    defined according to one of the C standards:

    #include <stdio.h>

    int main(void)
    {
    puts("B");
    }

    If the specification of the program is to print "A" followed by a
    newline, the program is incorrect, even though its behaviour is
    defined.

    A correct program may be undefined according to the programming
    language specification (see Kaz Kylheku's examples).

    Which brings us to your claim:
    UB (both definitions) is an essential part of all programming languages

    This is nonsense. If the programming language defines the behaviour
    of all programs it accepts, and rejects all the others, it does not
    have undefined behaviour. As an example, Java tries to live up to an
    even higher standard (no implementation-dependent behaviour aka "write
    once, run everywhere"), but I am not sure if it succeeds in that.

    - anton
    --
    M. Anton Ertl
    anton@mips.complang.tuwien.ac.at
    http://www.complang.tuwien.ac.at/anton/
  • From Keith Thompson@Keith.S.Thompson+u@gmail.com to comp.compilers on Fri Aug 22 15:11:18 2025

    comp.lang.c would probably be a better place for this discussion,
    but cross-posting between moderated and unmoderated newsgroups is
    likely to cause problems.

    David Brown <david.brown@hesbynett.no> writes:
    On 21/08/2025 21:53, Keith Thompson wrote:
    [...]
    If you declare and call a function "foo" that is written in fully
    portable C code, but not part of the current translation unit being
    compiled (perhaps it has been separately compiled or included in a
    library), then it would be UB by the section 4 definition (since the C standards don't say anything about what "foo" does, nor does your code).

    If the translation unit that defined "foo" is part of your program, then
    your code *does* define its behavior. Linking multiple translation
    units into a program is specified by the C standard; it's translation
    phase 8.
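
    A minimal sketch of that case (the file names and "foo" itself are
    illustrative): with both translation units linked into the program, the
    call's behaviour is fully defined by the program's own code.

        /* foo.c -- one translation unit of the program */
        int foo(int x) { return x + 1; }

        /* main.c -- another translation unit of the same program */
        #include <stdio.h>
        int foo(int x);                   /* declaration matches the definition */

        int main(void)
        {
            printf("%d\n", foo(41));      /* well defined: prints 42 */
            return 0;
        }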

    But the code that calls "foo" is portable and not erroneous, so it is
    not UB by the section 3 definition.

    If "foo" is defined by your program, either in the current
    translation unit or in another one, the call is well defined
    (assuming "foo" doesn't do something silly like dividing by zero).
    If "foo" is defined outside your program, the C standard has nothing
    to say about it. It could even be implemented in a language other
    than C.

    The *behavior* of such a call is not portable. (And the execution of
    such a call is definitely undefined behavior if the visible declaration
    is inconsistent with the definition.)

    The section 3 definition of "undefined behavior" is a bit informal.
    It's not clear what it means by "erroneous", for example. Section
    4 is more precise, and states that UB can be indicated "by the
    omission of any explicit definition of behavior" (in the standard).
    The standard omits any definition of the behavior of foo().

    [...]

    Add to that, the C standard has a specific term for features that are non-portable but not undefined behaviour - "implementation-defined behaviour". Code that relies on "int" being 32-bit is not portable, but
    it is not UB when compiled on implementations for which "int" /is/ 32-bit.

    That's not what "implementation-defined behavior" means in C.
    Cases of implementation-defined behavior are explicitly called out in
    the standard, and an implementation must document how it treats each
    instance of implementation-defined behavior. Each implementation
    must document the range of int. There is no such requirement for
    the behavior of "foo" defined in some non-standard header.

    No, a bug in your code is not necessarily undefined behavior. It could
    easily be code whose behavior is well defined by the language standard,
    but that behavior isn't what the programmer intended.

    When I write code, /I/ define what the behaviour of the code should be.
    A bug in the code means it is not acting according to my definitions -
    it is UB. It may still be acting according to the definitions of the C abstract machine given in the C standards (you are correct there). Even
    if it has C-standard UB, it will still be acting according to the
    definitions of the target machine's instruction set. Behaviour is
    defined on multiple levels, only one of which is the C standard.

    "Undefined behavior" is a technical term defined by the C standard.
    It's not just behavior that is not defined. It is behavior that
    is not defined *by the C standard*. If I write printf("goodbye\n")
    when I meant to write printf("hello\n"), that's incorrect behavior,
    but it's not undefined behavior.

    --
    Keith Thompson (The_Other_Keith) Keith.S.Thompson+u@gmail.com
    void Void(void) { Void(); } /* The recursive call of the void */
  • From antispam@antispam@fricas.org to comp.compilers on Sat Aug 23 15:45:03 2025

    Martin Ward <mwardgkc@gmail.com> wrote:
    On 20/08/2025 14:06, John wrote:

    When a language is 50 years old and there is a mountain of legacy code
    that they really don't want to break, it accumulates a lot of cruft. If
    we were starting now we'd get something more like Go.

    On the other hand, there's the python approach in which they deprecate
    and remove little used and crufty features, but old python code doesn't
    work any more unless you go back and update it every year or two. -John]
  • From James Kuyper@jameskuyper@alumni.caltech.edu to comp.compilers,comp.lang.c on Mon Aug 25 22:13:07 2025

    David Brown <david.brown@hesbynett.no> writes:
    On 23/08/2025 00:11, Keith Thompson wrote:
    ...
    David Brown <david.brown@hesbynett.no> writes:
    On 21/08/2025 21:53, Keith Thompson wrote:
    [...]
    If you declare and call a function "foo" that is written in fully
    portable C code, but not part of the current translation unit being
    compiled (perhaps it has been separately compiled or included in a
    library), then it would be UB by the section 4 definition (since the C
    standards don't say anything about what "foo" does, nor does your code). ...
    The C standard does not define how this linking or combining is done - it
    only covers certain specific aspects of the linking that relate directly
    to C. The behaviour of the function "foo" here is not defined in the C standards, and if the source code is not available when translating a different translation unit, the behaviour of "foo" is undefined.

    I remember having an immensely frustrating discussion on this issue a
    couple of decades ago.

    If foo was written in fully portable C code, then that C code enables
    the C standard to define what the behavior of that code is. If you lose
    your last copy of the source code, you cannot confirm what that defined
    behavior should be, but the behavior remains defined by the code that
    has since gone missing.

    The absence of that source code will make it hard to determine whether
    the module can be safely linked to other modules, or to determine what
    the defined behavior of the linked program should be - but if the
    missing code said the right things to give the combined program defined
    behavior, the implementation is still required to generate that behavior.

    Not being able to determine what the standard-defined behavior of a
    program should be is for practical purposes precisely as useless as if
    the behavior were undefined - but that doesn't make the behavior
    undefined.

    And for that reason, I don't see much point in continuing to debate this
    point. The last time I debated it, the discussion went on for many
    months, and was not at all illuminating.
  • From James Kuyper@jameskuyper@alumni.caltech.edu to comp.compilers,comp.lang.c on Tue Aug 26 13:41:14 2025

    On 2025-08-25 22:13, James Kuyper wrote:
    ...
    I remember having an immensely frustrating discussion on this issue a
    couple of decades ago.

    The discussion was on comp.std.c, the Subject: was "clrsc and UB", and
    my participation in the discussion started 2002-02-05.
    [Yeah, it's not like this is a new topic. -John]
  • From Michael S@already5chosen@yahoo.com.dmarc.email to comp.compilers,comp.lang.c on Tue Aug 26 22:28:44 2025

    On Tue, 26 Aug 2025 13:41:14 -0400
    James Kuyper <jameskuyper@alumni.caltech.edu> wrote:

    On 2025-08-25 22:13, James Kuyper wrote:
    ...
    I remember having an immensely frustrating discussion on this issue
    a couple of decades ago.

    The discussion was on comp.std.c, the Subject: was "clrsc and UB", and
    my participation in the discussion started 2002-02-05.
    [Yeah, it's not like this is a new topic. -John]

    Don't you mean "clrscr and UB" ?
  • From James Kuyper@jameskuyper@alumni.caltech.edu to comp.compilers,comp.lang.c on Tue Aug 26 16:53:08 2025

    On 2025-08-26 15:28, Michael S wrote:
    On Tue, 26 Aug 2025 13:41:14 -0400
    James Kuyper <jameskuyper@alumni.caltech.edu> wrote:
    ...
    The discussion was on comp.std.c, the Subject: was "clrsc and UB", and
    my participation in the discussion started 2002-02-05.
    [Yeah, it's not like this is a new topic. -John]

    Don't you mean "clrscr and UB" ?

    I didn't think I could mess up something that short, so I typed it by
    hand. :-(