• Inferior Micro$lop Native Compression

    From Farley Flud@ff@gnulinux.rocks to comp.os.linux.advocacy on Fri Feb 20 12:43:11 2026
    From Newsgroup: comp.os.linux.advocacy

    The native file compression format in that piece
    of junk-shit OS known as Micro$lop Winblows is ZIP:

    <https://en.wikipedia.org/wiki/ZIP_(file_format)>

    The vastly superior GNU/Linux OS natively supports
    several compression methods: GZIP, BZIP2, XZ, LZMA,
    7Z, and ZIP.

    ZIP compression, however, is pure garbage and only
    a demented moron would ever use it. (How appropriate!
    Micro$lop's user base is composed of demented morons.)

    Check out the following list of resulting file sizes
    for each of the above compression methods (thanks
    to ImageMagick: <https://download.imagemagick.org/archive/>)

    ImageMagick-7.1.2-13.7z       10M
    ImageMagick-7.1.2-13.tar.bz2  13M
    ImageMagick-7.1.2-13.tar.gz   15M
    ImageMagick-7.1.2-13.tar.lz   10M
    ImageMagick-7.1.2-13.tar.xz   10M
    ImageMagick-7.1.2-13.zip      18M

    Holy fuckin' moley! ZIP is the worst method by far!

    Note that the TAR archive format also includes a lot
    of extra metadata but even so the resulting file sizes
    beat that junk ZIP by a wide margin.
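    The size comparison above can be sketched with Python's stdlib codecs
    instead of the command-line tools. This is a minimal illustration: the
    payload here is synthetic, not the ImageMagick source tree from the post.

```python
# Compare compressed sizes of one payload across the methods discussed
# above, using Python's stdlib. gzip uses DEFLATE, the same algorithm
# family ZIP uses for its members.
import bz2
import gzip
import lzma

# Repetitive, source-like data compresses well under every method.
payload = b"a fairly repetitive line of source-like text\n" * 5000

sizes = {
    "gzip (DEFLATE, as in ZIP)": len(gzip.compress(payload, compresslevel=9)),
    "bzip2": len(bz2.compress(payload, compresslevel=9)),
    "xz/lzma": len(lzma.compress(payload, preset=9)),
}

# Print smallest first, like the ranking in the post.
for name, size in sorted(sizes.items(), key=lambda kv: kv[1]):
    print(f"{name:28s} {size:8d} bytes (from {len(payload)})")
```

    One reason the .zip file trails .tar.gz in the table even though both
    use DEFLATE: ZIP compresses each member file independently, while
    .tar.gz compresses the whole archive as a single stream, so redundancy
    shared across files gets exploited.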

    So, in conclusion, we clearly see how a garbage compression
    method, ZIP, finds a natural home in a garbage OS, Micro$lop.

    The case is closed.
    --
    Gentoo/LFS: Is there any-fucking-thing else?
    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From Joel W. Crump@joelcrump@gmail.com to comp.os.linux.advocacy on Fri Feb 20 07:55:20 2026
    From Newsgroup: comp.os.linux.advocacy

    On 2/20/26 7:43 AM, Farley Flud wrote:

    The native file compression format in that piece
    of junk-shit OS known as Micro$lop Winblows is ZIP:

    <https://en.wikipedia.org/wiki/ZIP_(file_format)>

    The vastly superior GNU/Linux OS natively supports
    several compression methods: GZIP, BZIP2, XZ, LZMA,
    7Z, and ZIP.

    ZIP compression, however, is pure garbage and only
    a demented moron would ever use it. (How appropriate!
    Micro$lop's user base is composed of demented morons.)

    Check out the following list of resulting file sizes
    for each of the above compression methods (thanks
    to ImageMagick: <https://download.imagemagick.org/archive/>)

    ImageMagick-7.1.2-13.7z       10M
    ImageMagick-7.1.2-13.tar.bz2  13M
    ImageMagick-7.1.2-13.tar.gz   15M
    ImageMagick-7.1.2-13.tar.lz   10M
    ImageMagick-7.1.2-13.tar.xz   10M
    ImageMagick-7.1.2-13.zip      18M

    Holy fuckin' moley! ZIP is the worst method by far!

    Note that the TAR archive format also includes a lot
    of extra metadata but even so the resulting file sizes
    beat that junk ZIP by a wide margin.

    So, in conclusion, we clearly see how a garbage compression
    method, ZIP, finds a natural home in a garbage OS, Micro$lop.

    The case is closed.


    ROTFLMFAO! Dude, you do realize that Win11 handles RAR and a couple
    of other formats natively, not just ZIP? Get with the times! ROFL.

    "The case" is still open, as usual, because you are so lost.
    --
    Joel W. Crump
    --- Synchronet 3.21b-Linux NewsLink 1.2
  • From DFS@nospam@dfs.com to comp.os.linux.advocacy on Fri Feb 27 20:15:17 2026
    From Newsgroup: comp.os.linux.advocacy

    On 2/20/2026 7:43 AM, Lameass Larry (aka Farley Flud) wrote:


    The case is closed.

    The case that you're an ignorant nincompoop will never be closed.


    As usual, you haven't been keeping up: Windows added a good variety of
    archive formats and compression methods more than 2 years ago.

    https://imgur.com/a/PQvj1kp

    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Stéphane CARPENTIER@sc@fiat-linux.fr to comp.os.linux.advocacy on Sat Feb 28 11:08:13 2026
    From Newsgroup: comp.os.linux.advocacy

    On 20-02-2026, Farley Flud <ff@gnulinux.rocks> wrote:
    Check out the following list of resulting file sizes
    for each of the above compression methods (thanks
    to ImageMagick: <https://download.imagemagick.org/archive/>)

    ImageMagick-7.1.2-13.7z       10M
    ImageMagick-7.1.2-13.tar.bz2  13M
    ImageMagick-7.1.2-13.tar.gz   15M
    ImageMagick-7.1.2-13.tar.lz   10M
    ImageMagick-7.1.2-13.tar.xz   10M
    ImageMagick-7.1.2-13.zip      18M

    Holy fuckin' moley! ZIP is the worst method by far!

    Note that the TAR archive format also includes a lot
    of extra metadata but even so the resulting file sizes
    beat that junk ZIP by a wide margin.

    So, in conclusion, we clearly see how a garbage compression
    method, ZIP, finds a natural home in a garbage OS, Micro$lop.

    The case is closed.

    Once again you have proved that you understand nothing. When comparing
    compression methods, you don't take only the compression ratio into
    account. You also have to look at the time taken to achieve that ratio,
    because sometimes that time matters more than the final size. So get
    back to the basics and come back when you have understood them (which
    means never, I know). At the very least, you should give the time taken
    to reach those final sizes (measured on the same machine, and averaged
    over several runs for the result to be meaningful, but the reason for
    that will be too difficult for you to understand).
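    The size-vs-time tradeoff raised above can be sketched by timing each
    stdlib compressor on the same synthetic payload. Absolute times are
    machine-dependent, and averaging several runs, as suggested, is what
    would make the numbers meaningful.

```python
# Measure output size and wall-clock time for each stdlib compressor
# on the same payload. A rough single-run sketch, not a benchmark.
import bz2
import gzip
import lzma
import time

payload = bytes(range(256)) * 4000  # ~1 MB, mildly compressible

results = {}
for name, compress in [
    ("gzip -9", lambda d: gzip.compress(d, compresslevel=9)),
    ("bzip2 -9", lambda d: bz2.compress(d, compresslevel=9)),
    ("xz -9", lambda d: lzma.compress(d, preset=9)),
]:
    start = time.perf_counter()
    out = compress(payload)
    elapsed = time.perf_counter() - start
    results[name] = (len(out), elapsed)
    print(f"{name:9s} {len(out):8d} bytes in {elapsed:.3f}s")
```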
    --
    If you have some time to waste:
    https://scarpet42.gitlab.io
    --- Synchronet 3.21d-Linux NewsLink 1.2
  • From Farley Flud@ff@gnulinux.rocks to comp.os.linux.advocacy on Sat Feb 28 13:26:14 2026
    From Newsgroup: comp.os.linux.advocacy

    On 28 Feb 2026 11:08:13 GMT, Stéphane CARPENTIER wrote:


    Once again you have proved that you understand nothing. When comparing
    compression methods, you don't take only the compression ratio into
    account. You also have to look at the time taken to achieve that ratio,
    because sometimes that time matters more than the final size.


    Compression time may be a factor, but only for about 1% of cases.

    Most people will compress to reduce storage space or network transmission
    time.

    For anyone who needs to compress billions of files, the software
    can be implemented using SIMD, parallelism, hardware clusters, or
    even FPGAs.

    For me, and I am sure for most users as well, compression time
    has NEVER been an issue and NEVER will be an issue.


    So get back to the basics and come back when you have understood them
    (which means never, I know). At the very least, you should give the
    time taken to reach those final sizes (measured on the same machine,
    and averaged over several runs for the result to be meaningful, but
    the reason for that will be too difficult for you to understand).


    Ha, ha, ha, ha, ha! You are funny! Ha, ha, ha, ha, ha!
    --
    Gentoo/LFS: Is there any-fucking-thing else?
    --- Synchronet 3.21d-Linux NewsLink 1.2