• Re: configuration tools, Portable Software (was: fledgling assembler programmer)

    From arnold@skeeve.com (Aharon Robbins) to comp.compilers on Fri Mar 31 07:10:46 2023
    From Newsgroup: comp.compilers

    In article <23-03-037@comp.compilers>,
    Kaz Kylheku <864-117-4973@kylheku.com> wrote:
    >On 2023-03-28, Aharon Robbins <arnold@freefriends.org> wrote:
    >>Today, the C and C++ worlds are easier to program in, but it's still
    >>not perfect and I don't think I'd want to do without the autotools.
    >>Particularly for the less POSIX-y systems, like MinGW and OpenVMS.

    >Counterpoint: Autotools are a real detriment to GNU project programs.
    >
    >When a release is cut of a typical GNU program, special steps
    >are executed to prepare a tarball which has a compiled configure
    >script.
    >
    >You cannot just do a "git clone" of a GNU program, and then run
    >./configure and build. You must run some "make bootstrap" nonsense, and
    >that requires you to have various Autotools installed, and in specific
    >versions!

    This is not inherent in the autotools; it's laziness on the part of
    the maintainers. For exactly this reason gawk has a very simple
    bootstrap.sh program that simply does a touch on various files so
    that configure will run without wanting to run the autotools.
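    The touch-based approach described above can be sketched as follows. This is an illustrative reconstruction, not gawk's actual bootstrap.sh; the file list is an assumption based on the usual autotools outputs.

```shell
#!/bin/sh
# Sketch of a gawk-style bootstrap.sh (illustrative file list; the
# real script may touch a different set of files).  Touching the
# generated files in dependency order makes them look up to date, so
# ./configure runs without wanting the autotools installed.
set -e
for f in aclocal.m4 config.h.in configure Makefile.in; do
    touch "$f"
    sleep 1   # keep timestamps strictly increasing on coarse-grained filesystems
done
echo "generated files refreshed; no autotools needed"
```

    The ordering matters: make compares timestamps, so each generated file must end up at least as new as the inputs it is nominally derived from.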

    >Most Autotools programs will not cleanly cross-compile. Autotools is the
    >main reason why distro build systems use QEMU to create a virtual target
    >environment with native tools and libraries, and then build the
    >"cross-compiled" program as if it were native.

    QEMU wasn't around when the Autotools were first designed and
    implemented. Most end users don't need to cross compile either, and it
    is for them that I (and other GNU maintainers, I suppose) build my
    configure scripts.

    Yes, the world is different today than when the autotools were
    designed. No, the autotools are not perfect. I don't know of a
    better alternative though. And don't tell me CMake. CMake is an
    abomination, interweaving configuration with building instead of
    cleanly separating the jobs. Not to mention its stupid caching
    which keeps you from running a simple "make" after you've changed
    a single file.

    >My TXR language project has a hand-written, not generated, ./configure
    >script. What you get in a txr-285.tar.gz tarball is exactly what you
    >get if you do a "git clone" and "git checkout txr-285", modulo
    >the presence of a .git directory and differing timestamps.
    >
    >You just ./configure and make.

    And for gawk it's ./bootstrap.sh && ./configure && make
    where bootstrap.sh only takes a few seconds.

    >None of my configure-time tests require the execution of a program;
    >for some situations, I have developed clever tricks to avoid it.

    And why should you, or anyone, be forced to develop such clever tricks?

    All of this simply further justifies the approach taken by newer languages,
    which is to move all the hard crap into the libraries. The language
    developers do all the hard work, instead of the application developers
    having to do it. This is great for people who want to just get their
    job done, which includes me most of the time. However, and this is a
    different discussion, it does lead to a generation of programmers who
    have *no clue* as to how to do the hard stuff should they ever need to.

    My opinion, of course.

    Arnold
    --
    Aharon (Arnold) Robbins arnold AT skeeve DOT com
    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From anton@mips.complang.tuwien.ac.at (Anton Ertl) to comp.compilers on Sun Apr 2 08:56:48 2023
    From Newsgroup: comp.compilers

    Kaz Kylheku <864-117-4973@kylheku.com> writes:
    >When a release is cut of a typical GNU program, special steps
    >are executed to prepare a tarball which has a compiled configure
    >script.
    >
    >You cannot just do a "git clone" of a GNU program, and then run
    >./configure and build. You must run some "make bootstrap" nonsense, and
    >that requires you to have various Autotools installed, and in specific
    >versions!

    And the problem is?

    The git repo contains only the source code, useful for developers.
    The developers have stuff installed that someone who just wants to
    install the program does not necessarily want to install. E.g., in
    the case of Gforth, you need an older Gforth to build the kernel
    images that contain Forth code compiled to an intermediate
    representation. Therefore the tarballs contain a number of generated
    (or, as you say, "compiled") files, e.g., the configure script, the
    kernel images in case of Gforth, or the C files generated by Bison in
    case of some other compilers.

    If you go for the "git clone" route rather than building from the
    tarball, you don't get these amenities, but have to install all the
    tools that the developers use, and have to perform an additional step
    (usually ./autogen.sh) to produce the configure file. "make
    bootstrap" is unlikely to work, because at that stage you don't
    have a Makefile.

    I remember "make bootstrap" from gcc, where IIRC it compiles gcc first
    (stage1) with the pre-installed C compiler, then (stage2) with the
    result of stage1, and finally (stage3) again with the result of
    stage2; if there is a difference between stage2 and stage3, something
    is amiss.
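    The three-stage fixed-point check described above can be illustrated with a toy script. This is a sketch of the idea only, not gcc's real build system; here a trivial self-copying "compiler" stands in for gcc so the stage comparison can be demonstrated end to end.

```shell
#!/bin/sh
# Toy illustration of gcc's three-stage bootstrap comparison (not
# gcc's actual Makefile machinery).  The "compiler" is a shell script
# that simply copies its input; a real bootstrap compiles gcc's
# sources at each stage.
set -e

# The "source code" of our toy compiler: it copies input to output.
cat > compiler.src <<'EOF'
#!/bin/sh
cp "$1" "$2"
EOF

# stage1: build with the pre-installed tool (plain cp stands in for
# the system C compiler).
cp compiler.src stage1; chmod +x stage1
# stage2: rebuild with the stage1 result.
./stage1 compiler.src stage2; chmod +x stage2
# stage3: rebuild with the stage2 result.
./stage2 compiler.src stage3; chmod +x stage3

# The bootstrap check: stage2 and stage3 must be bit-identical.
if cmp -s stage2 stage3; then
    echo "bootstrap comparison OK"
else
    echo "stage2 and stage3 differ: something is amiss" >&2
    exit 1
fi
```

    If the compiler miscompiles itself, the stage2 and stage3 binaries differ and the comparison fails, which is exactly the self-test gcc relies on.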

    Anyway, tl;dr: If you just want to do "./configure; make", use the
    tarball.

    >Most Autotools programs will not cleanly cross-compile. Autotools is the
    >main reason why distro build systems use QEMU to create a virtual target
    >environment with native tools and libraries, and then build the
    >"cross-compiled" program as if it were native.

    Clever! Let the machine do the work, rather than having to do manual
    work for each package.

    >For instance, about a decade and a half ago I helped a company
    >replace Windriver cruft with an in-house distribution. Windriver's
    >cross-compiled Bash didn't have job control! Ctrl-Z, fg, bg stuff no
    >workie. The reason was that it was just cross-compiled straight, on an
    >x86 build box. It couldn't run the test to detect job control support,
    >and so it defaulted it off, even though the target machine had
    >"gnu-linux" in its string. In the in-house distro, my build steps for
    >bash exported numerous ac_cv_... internal variables to override the bad
    >defaults.
    >
    >That's the way to do it.

    Your idea seems to be that, when the value is not supplied, instead of
    a safe default (typically resulting in not using a feature), one
    should base the values on the configuration name of the system. I
    think the main problem with that is that for those systems most in
    need of cross-compiling the authors of the tests don't know good
    values for the configuration variables; for linux-gnu systems I
    usually configure and compile on the system.
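    The cache-variable override approach from the bash anecdote can be sketched as a build-script fragment. The specific variable names and target triplets below are illustrative assumptions, not checked against bash's actual configure script:

```shell
# Fragment of a cross-build step presetting autoconf cache variables
# (variable names and triplets here are illustrative assumptions).
# configure trusts preset ac_cv_*/package_cv_* values instead of
# running a test program, so nothing has to execute on the build box.
export bash_cv_job_control_missing=present
export ac_cv_func_setpgrp_void=yes
./configure --build=x86_64-pc-linux-gnu --host=arm-unknown-linux-gnueabi
make
```

    This is manual work per package, which is precisely the trade-off Anton points at: someone has to know the right values for each target.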

    >For some situations, I have developed clever tricks to avoid it. For
    >instance, if you want to know the size of a data type. Here
    >is a fragment:

    Great! Now we need someone who has enough time to replace the
    AC_CHECK_SIZEOF autoconf macro with your technique, and a significant
    part of the configuration variables that have to be supplied manually
    when cross-configuring Gforth become fully automatic.
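    Kaz's fragment is not preserved in this archive, but one classic compile-only technique in that spirit looks like the following. This is a guess at the general approach, not his actual code: the array declaration is a compile error unless the candidate size matches, so the answer is found without ever running a target executable.

```shell
#!/bin/sh
# Compile-only sizeof probe (illustrative sketch, not Kaz's fragment).
# The array size 1 - 2*!(cond) is 1 when cond holds and -1 (a compile
# error) otherwise, so only the correct candidate N compiles.  Because
# nothing is executed, this works under cross-compilation.
type="long"
for n in 1 2 4 8 16; do
    cat > conftest.c <<EOF
int conftest_array[1 - 2 * !(sizeof($type) == $n)];
EOF
    if cc -c conftest.c -o conftest.o 2>/dev/null; then
        echo "sizeof($type) = $n"
        break
    fi
done
rm -f conftest.c conftest.o
```

    Autoconf's own AC_CHECK_SIZEOF nowadays uses a similar compile-time computation (via AC_COMPUTE_INT) rather than running a test program.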

    - anton
    --
    M. Anton Ertl
    anton@mips.complang.tuwien.ac.at
    http://www.complang.tuwien.ac.at/anton/