• Static compiled packages option for apt

    From 153@110110.net@21:1/5 to All on Sat Jun 21 18:50:01 2025
I’d like Debian to switch to, or at least offer the option of, downloading statically compiled packages through the fedora package manager

That way, instead of downloading a ton of packages, users could download one binary compiled against server-grade or maybe even hardened libraries

A statically compiled binary on Unix doesn’t rely on shared libraries, so it’s one big file, and I think it works on most Unices without porting or recompiling

I would also like everyone to reach out to their favorite distributions and their package manager teams for their opinion on statically compiled packages in their package managers

    I *think* it might be able to save a significant amount of bandwidth distributing stuff like apache or even OpenOffice in static form

    (All you need to do is add the ’-static’ option to cc/gcc/llvm)
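    To see what that flag actually does, here is a minimal sketch (file names are made up; it assumes a typical glibc-based Linux with the static C library installed, e.g. Debian's libc6-dev):

    ```shell
    # Build a trivial C program both ways and compare the results.
    cat > hello.c <<'EOF'
    #include <stdio.h>
    int main(void) { puts("hello"); return 0; }
    EOF
    gcc -o hello-shared hello.c           # default: dynamically linked
    gcc -static -o hello-static hello.c   # everything baked into one file
    file hello-static                     # reports "statically linked"
    ls -l hello-shared hello-static       # the static binary is much larger
    ```

    Running `ldd hello-static` then reports "not a dynamic executable", while `ldd hello-shared` lists libc and the dynamic loader.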

    153

    --- SoupGate-Win32 v1.05
    * Origin: fsxNet Usenet Gateway (21:1/5)
  • From IOhannes m zmölnig@21:1/5 to All on Sat Jun 21 19:40:02 2025
    On 21 June 2025 at 18:05:54 MESZ, 153@110110.net wrote:
    I *think* it might be able to save a significant amount of bandwidth distributing stuff like apache or even OpenOffice in static form


    why do you think so?
    (and why do you think, Debian does not do this?)


    mfh.her.fsr
    IOhannes

  • From Antoine Le Gonidec@21:1/5 to All on Sat Jun 21 21:20:01 2025
    apt is probably not the best fit for that, as a lot of its codebase would be unused in the case of statically built binaries.

    I think you should rather work on a new packages manager (that could rely on apt libraries) with no dependencies management, that would allow it to be quicker than apt thanks to the reduced scope.

    Once you have a first proof of concept ready, then it would probably be a better time to try and push in favour of your suggestion.

    ---

    Disclaimer: I would not use such statically built binaries, as it goes against what I am looking for in a software distribution like Debian.


  • From Andrey Rakhmatullin@21:1/5 to 153@110110.net on Sat Jun 21 22:50:01 2025
    On Sat, Jun 21, 2025 at 09:05:54AM -0700, 153@110110.net wrote:
    through the fedora package manager

    Copy-paste error, nice.

    --
    WBR, wRAR


  • From Marc Haber@21:1/5 to debian@vv221.fr on Sun Jun 22 19:00:01 2025
    On Sat, 21 Jun 2025 21:15:41 +0200, Antoine Le Gonidec
    <debian@vv221.fr> wrote:
    Disclaimer: I would not use such statically built binaries, as it goes against what I am looking for in a software distribution like Debian.

    What this totally anonymous person (troll?) is asking for is a totally
    new distribution. I'd recommend doing that in a totally new project
    and leave Debian alone.

    Greetings
    Marc
    --
    ----------------------------------------------------------------------------
    Marc Haber         |   " Questions are the           | Mailadresse im Header
    Rhein-Neckar, DE   |     Beginning of Wisdom "       |
    Nordisch by Nature | Lt. Worf, TNG "Rightful Heir"   | Fon: *49 6224 1600402

  • From 153@110110.net@21:1/5 to All on Thu Jun 26 06:50:02 2025
    Basically, even though there’s a HUGE overhead when it comes to distributing static packages via apt, it can save time, bandwidth and CPU resources, because you’re only downloading one or two packages instead of up to 20, sometimes 700, packages at a time. This saves resources!! But yeah, there is a huge overhead when it comes to downloading static packages, along with running the binaries, and re-downloading everything if a transfer fails.


    I figure if OpenOffice, for example, has 700 dependencies, then instead of 700 different processes dedicated to downloading, a single process dedicated to downloading one large binary with all the dependencies compiled directly into it could in fact save bandwidth and possibly network resources, along with local and remote resources

    I figure an 80 MB static package is just as good as 50 MB of shared packages, maybe even better..

    It fails when it comes to upgrading packages, since the shared-library model is kind of the de facto standard of open-source Linux, and you’d sadly have to re-download the entire static binary, and risk data loss, instead of downloading a single library to update one part of a program.. I don’t know, I tried to make an equation but it didn’t work out lol
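    A rough version of that equation can be sketched like this (all of the numbers are made up for illustration):

    ```shell
    # P programs each use the same L-MB library, plus B MB of their own code.
    P=100   # programs sharing the library
    L=2     # library size in MB
    B=5     # program-specific code in MB
    shared=$(( P * B + L ))         # one shared copy of the library
    static=$(( P * (B + L) ))       # the library duplicated into every binary
    echo "shared: ${shared} MB, static: ${static} MB"
    # With these numbers: shared: 502 MB, static: 700 MB -- and a security
    # fix in the library costs L MB (one package) in the shared model,
    # versus re-downloading every affected binary in the static model.
    ```

    The static model only wins when programs share almost nothing, which is rare on a full desktop install.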

    I personally think Debian doesn’t distribute static packages because a statically compiled program takes up a ton of hard-drive space and RAM to run, seeing as the OS copies binary data into memory, and a ton of users run Debian on old computing devices

    I personally would like Debian to research a version of Debian for high-performance computers, or at least a fork of Debian optimized for them: ready to occupy large sets of RAM and disk space, and to fully utilize new technology in x86-family processors made after 2020 or so, where large amounts of RAM (64 GB+) can be occupied up to 35% for performance reasons, such as caching, and hopefully running high-performance code

    And yeah, within this month I’ll try to fork apt and make a patch for the src command, possibly with the help of LDAP, and store it on some kind of local server (I guess rsync or NFS would be used?)

    But I’d prefer if someone else did, because I’m super taxed at work at the moment

    Oh man, not to act all crazy, but could you and the Debian team talk about LDAP integration at Debian? (Or debian.org, lol, imagine getting a debian.org domain set up on your network, haha)

    On Jun 21, 2025, at 10:13 AM, IOhannes m zmölnig <umlaeute@debian.org> wrote:

    On 21 June 2025 at 18:05:54 MESZ, 153@110110.net wrote:
    I *think* it might be able to save a significant amount of bandwidth distributing stuff like apache or even OpenOffice in static form


    why do you think so?
    (and why do you think, Debian does not do this?)


    mfh.her.fsr
    IOhannes

  • From Ahmad Khalifa@21:1/5 to 153@110110.net on Thu Jun 26 13:20:01 2025
    On 26/06/2025 04:57, 153@110110.net wrote:
    Basically, even though there’s a HUGE overhead when it comes to distributing static packages via apt, it can save time, bandwidth and CPU resources, because you’re only downloading one or two packages instead of up to 20, sometimes 700, packages at a time. This saves resources!! But yeah, there is a huge overhead when it comes to downloading static packages, along with running the binaries, and re-downloading everything if a transfer fails.


    I figure if OpenOffice, for example, has 700 dependencies, then instead of 700 different processes dedicated to downloading, a single process dedicated to downloading one large binary with all the dependencies compiled directly into it could in fact save bandwidth and possibly network resources, along with local and remote resources

    I figure an 80 MB static package is just as good as 50 MB of shared packages, maybe even better..

    I'm not sure what problem you're trying to solve here. Is it apt and
    dpkg taking time to `apt upgrade`?

    You can try apt's Merged Optimised Objects. But run the checker first.
    $ apt moo check


    --
    Regards,
    Ahmad

  • From Theodore Ts'o@21:1/5 to 153@110110.net on Thu Jun 26 17:30:01 2025
    On Wed, Jun 25, 2025 at 08:57:42PM -0700, 153@110110.net wrote:

    I personally would like Debian to research a version of Debian for
    high-performance computers, or at least a fork of Debian optimized
    for them: ready to occupy large sets of RAM and disk space, and to
    fully utilize new technology in x86-family processors made after
    2020 or so, where large amounts of RAM (64 GB+) can be occupied up
    to 35% for performance reasons, such as caching, and hopefully
    running high-performance code

    It's not clear what problem you are trying to solve here. First of
    all, just because you have a huge amount of memory doesn't mean that
    *using* extra memory is a good thing.

    For example, if you have a hundred processes all using the GNU C
    library, sharing it not only (a) saves memory, but also (b) can speed performance because the pages may already be in memory and so don't
    need to be loaded from disk, (c) can speed performance because the
    parts of the shared library might be in the CPU cache, and (d) when
    there is a security vulnerability, updating the shared library fixes
    all of the programs that use that shared library, whereas if you use
    static linking, someone has to recompile all of the programs using
    that library, upload them to the Debian package archive, and then
    everyone has to download all of those packages.
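    Point (d) is easy to see on any typical glibc-based Linux box: `ldd` shows that nearly every installed program resolves to the same shared libc, so a single libc update covers all of them at once. A quick sketch:

    ```shell
    # Both of these (and almost everything else on the system) link the
    # same shared C library; one libc.so update patches all of them.
    ldd /bin/sh | grep libc
    ldd /bin/ls | grep libc
    ```

    With static linking, each of those binaries would carry its own private copy of the vulnerable code, invisible to the package manager's update mechanism.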

    The main disadvantage of using shared libraries is that the upstream
    developers have to be careful about maintaining ABI backwards
    compatibility, which seems to be a lost art among newer, more
    "hipper" languages such as Go and Rust, which have completely given up
    on this. But for people who can be careful, the advantages to their
    users can be huge.
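    That ABI contract is encoded in the library's soname: a compatible update keeps the same soname so existing binaries keep loading it, while an incompatible change bumps it. A minimal sketch (library and function names are hypothetical):

    ```shell
    # Build a tiny shared library with an explicit soname.
    cat > foo.c <<'EOF'
    int answer(void) { return 42; }
    EOF
    gcc -shared -fPIC -Wl,-soname,libfoo.so.1 -o libfoo.so.1.0 foo.c
    # Consumers record "libfoo.so.1" as their dependency, so any
    # ABI-compatible libfoo.so.1.* can satisfy it at run time.
    objdump -p libfoo.so.1.0 | grep SONAME
    ```

    This is exactly the mechanism Debian packaging leans on: the soname goes into the binary package name (e.g. libfoo1), so an ABI break forces a visibly new package.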

    - Ted
