On Wed, 19 Feb 2025 12:05:02 -0500, in comp.sys.ibm.pc.games.action,
Spalls Hurgenson wrote:
> So, Nvidia's not had good luck with their new 5xxx GPUs, have they?
> They seem to have done everything they can to make them something
> people DON'T want to buy.
> From the unimpressive performance boost... to the incredibly high
> prices... to the cards being released in such low numbers that the
> number of cards sold in an entire country can be counted in
> single-digits... to the tremendous power requirements... to the
> fucking 12V power connector melting...
Yeah. I read through a thread on Slashdot where a whole bunch of EEs
argued about why the connector melts. Some were recommending a higher
voltage rail (up to as much as 40v, IIRC) and fewer, fatter pins that are
easy to seat and tolerant of physical movement. Most EEs agreed that the
connector standard Nvidia is using is a) finicky, and b) being run with
almost no safety margin (a 1.1x margin is apparently the *standard*).
No margin doesn't mean the builders are incompetent; it means any minor
deviation in the connection can result in a burn.
There was almost total consensus that the card was drawing far too much amperage off of the 12v rail, and something needed to be done about that.
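To put rough numbers on that (these figures are illustrative, not from any
datasheet): assuming a 600 W card pulling through six current-carrying 12v
pins each rated around 9.5 A, the per-pin math looks like this:

```python
# Back-of-envelope per-pin current on a 12v connector.
# All figures are assumptions for illustration, not datasheet values.
power_w = 600.0       # assumed card draw
rail_v = 12.0
pins = 6              # current-carrying 12v pins in the connector
pin_rating_a = 9.5    # assumed per-pin rating

total_a = power_w / rail_v          # total current off the 12v rail
per_pin_a = total_a / pins          # current each pin carries
margin = pin_rating_a / per_pin_a   # how close to the rating we run

print(total_a, round(per_pin_a, 2), round(margin, 2))
```

That works out to 50 A total, roughly 8.3 A per pin, and a margin barely
above the 1.1x "standard" the EEs were complaining about. One slightly
unseated pin shifts its share onto the others, and there's nowhere for
that current to go.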
Some went so far as to suggest an external card with its own enclosure.
On that subject, there was some humorous discussion about plugging the
card directly into the wall for a 120v rail. Apparently, this funny video
is getting closer and closer to the truth:
https://www.youtube.com/watch?v=0frNP0qzxQc
So tiny pins + typical PC case wiring and ducting + no margin + high
amperage (because it's only 12v) == really easy to loosen and burn,
especially if you have to connect it to the card more than once or
rummage through the wiring at any point after you've connected the card
to the PSU. Even checking the seating and/or reseating the connector
before closing up the case _will_ cause wear that can then trigger the
issue. It turns out the connector is also somewhat fragile, cheap, and
inconsistently fabbed.
My opinion is that the ATX PSU standard is hopelessly out of date and we
need a full redesign. Even a 24v rail would probably address this
problem. We've been doing Rube Goldberg on 12v for at least a decade.
ATX's original design is 30 years old. 12v is 25 years old. ATX 3.1 is 3
years old. Its time is done. The kludges are failing. I think it's time
to redesign PSUs from scratch, or at least to do so for cutting-edge
graphics enthusiasts.
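For what it's worth, the case for a higher rail is simple arithmetic: at
the same wattage, doubling the voltage halves the current, and resistive
heating in the harness drops with the square of the current. A sketch,
using a made-up harness resistance:

```python
# Same power at a higher rail voltage -> less current -> much less
# I^2*R heating in the wiring. The 0.01 ohm round-trip harness
# resistance is hypothetical, just to make the scaling visible.
power_w = 600.0
wire_r = 0.01

losses = {}
for rail_v in (12.0, 24.0, 48.0):
    amps = power_w / rail_v
    losses[rail_v] = amps**2 * wire_r  # watts dissipated in the harness
    print(f"{rail_v:.0f}v rail: {amps:.1f} A, "
          f"{losses[rail_v]:.2f} W lost in wiring")
```

At 12v a 600 W card needs 50 A; at 24v it's 25 A, and the heat dumped
into the wiring drops to a quarter.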
In short, while I think an ATX 3.1 supply can handle a 5060 Ti, these
high-end 5080 and 90 cards are just, finally and completely, too power
hungry for current design tolerances. The 5070 too, probably.
If they don't fix this by the Ti designs, Nvidia will suffer irreparable
reputational damage in the consumer markets. They'll be an AI company.
I mean, I'm no fan of AMD cards (arguably unfairly), but even I gotta
look askance at an Nvidia card now.
Just don't be an early adopter. AMD can screw the pooch too. After the
4090 fiasco with the same melting connectors, which Nv was supposed to
have reviewed and fixed for the 5090, it's foolish to buy a xx90 card at release. Just no.
We should be especially wary of how any new card is powered.
> But look... all the above I could forgive, but this latest bit of news
> hurts me where it counts: in my old video games. Because the new 5xxx
> chips no longer support 32-bit PhysX in hardware, and now a lot of old
> classics just won't run that well anymore.*
[snip]
Suuuuck.
> But there were a number of top-tier games that utilized the library,
> including "Assassin's Creed 4: Black Flag", "Mafia II", "Batman: Arkham
> Asylum", and "Metro 2033". PhysX was an integral part of a lot of
> classic games.***
You know what I'm thinking about rn? Mirror's Edge. Truly a unique game.
That game is *made* of PhysX. If you don't get PhysX running right, it's
a slideshow.
That said, I'm guessing our current 16-core/32-thread processors can
probably run 32-bit PhysX for most legacy games fast enough in software.
While Mirror's Edge is made of PhysX, Batman Arkham is stuff like waving
capes, chattering teeth, and batarangs. If we need maximum fidelity, we
can run it on period hardware. I doubt it will affect things much.
Or just GPL it. OpenPhysX32 anyone? Not Nvidia's style, but I can hope.
> Now, not so much... at least if you want to play those games on
> Nvidia's newest GPU. Sure, the games will still run, but they either
> won't run as well or will run with reduced physics effects.
> Intellectually, I understand the reasoning: why support this old
> 32-bit code on the newest processors? But given how a lot of these
> games are so beloved (and many are still being sold), and the other
> problems the 5xxx GPUs are suffering, this seems like incredibly bad
> timing. Like I said, it's just another reason to avoid buying these
> cards.
They can barely keep up with their "game ready" code at each major
release without blowing something up. Old games have always been a crap
shoot with Nvidia drivers because of the way they manage their legacy
code. Sometimes it's even *recent* old games. New cards bring with them
new minimum driver versions. This means more old games will crash and
burn. Same problem. Eventually buying bleeding edge results in a legacy bloodbath.
I think the GTX 660 minimum driver version blew up hardware accel in
Tropico (it'll run, but all the textures are a smear: you can't read
anything and nothing is recognizable). We lose games all the time that
way.
Hopefully, they GPL 32-bit PhysX and let the retrogaming community have
at it. It is Nvidia we're talking about here, though. (*sad face*)
--
Zag
This is csipg.rpg - reality is off topic. ...G. Quinn ('08)
--- SoupGate-Win32 v1.05
* Origin: fsxNet Usenet Gateway (21:1/5)