> reddit thread on cards bricking
>
> Looks like it's bricking a huge range of cards. Some that I've seen gamers report complete bricking on:
> GTX 970
> GTX 980 Ti
> 2070
> 2080 Ti
> 3090 (dozens of EVGA FTW3, several Gigabyte, Asus, MSI)
> Vega 56
> Vega 64
>
> Other than hardware being bricked, the game is causing hard resets on essentially all GPU architectures, from the 900 series to the 3000 series, and Vega to the 6000 series.
>
> This is absolutely nuts.

Honestly, I think the game is just bricking a bunch of shit because of how demanding it is anyways. Besides how poorly designed the power delivery is for some of these more recent cards, you've also just got a pretty typical scenario where a big new demanding game is killing hardware that was likely going to die at some point anyways. Most of the CryEngine-based games have always done this. ARK killed a lot of stuff as well.
> Yeah, it pretty much is definite. The driveshaft will break or the diff will scatter. This is why aftermarket driveshafts and diff braces are some of the first mods done to them.

There is more than one Hellcat, and no, it's not definite that you will break the driveshaft with drag slicks. That sounds like some story you'd hear from people who abuse their cars like idiots. I recall hearing something similar, but then again that can indeed happen if you go around driving the car like a madman on a track (say, jumps).
> reddit thread on cards bricking
>
> Looks like it's bricking a huge range of cards. [...] This is absolutely nuts.

At this point I think it has enough media attention that anyone is going to report a dead graphics card. Looking at that reddit thread, only one person with a Vega card (a Vega 64) had an issue, and he was able to get it working again once it cooled down. These cards are old enough that I bet they have a lot of dust on them. Nothing on that reddit thread about a dead GTX 970, but lots on EVGA 980 Ti's and EVGA 1080's. Why do I feel like EVGA has been allowing these cards to run past their power limit for a long time, and now this game New World is exposing this practice?
> Honestly, I think the game is just bricking a bunch of shit because of how demanding it is anyways. [...] Most of the CryEngine-based games have always done this. ARK killed a lot of stuff as well.

Well, from tests people have run, it looks like a few things are happening:
> If you're hitting the set FPS limit in NVCP the card isn't going to stay fully powered/clocked. It only runs what it needs to hit the target.

I think he is saying it's odd that the card is close to its full load as far as wattage goes, yet only at 50% GPU load.
> If you're hitting the set FPS limit in NVCP the card isn't going to stay fully powered/clocked. It only runs what it needs to hit the target.

Yeah, of course, that's how it always works... except for this game. Look at the clockspeed, temp (which again is the highest I've ever seen on this card), and power draw while also at only 50% load.
> I think he is saying it's odd that the card is close to its full load as far as wattage goes, yet only at 50% GPU load.

Yup, just seems strange.
> You would think the Nvidia driver would stop this from happening; hopefully they get the glitch fixed shortly.

They could put in a universal frame cap, but then people would bitch....
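For what it's worth, a driver- or game-level frame cap is conceptually just sleeping out the unused part of each frame's time budget, so the GPU idles instead of racing ahead at uncapped speed (e.g. in a menu or queue screen). A minimal sketch; `run_capped` and its numbers are illustrative, not any actual driver API:

```python
import time

def run_capped(render_frame, fps_cap=60, frames=3):
    """Minimal frame limiter: sleep out the remainder of each frame budget."""
    budget = 1.0 / fps_cap
    times = []
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()                    # simulated GPU work
        elapsed = time.perf_counter() - start
        if elapsed < budget:              # frame finished early:
            time.sleep(budget - elapsed)  # idle instead of re-rendering
        times.append(time.perf_counter() - start)
    return times

# A trivially cheap "frame", like an unthrottled queue screen: without the
# cap it would loop thousands of times per second at full power.
frame_times = run_capped(lambda: None, fps_cap=60)
print(frame_times)  # each frame is padded out to roughly 1/60 s
```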
> Haven't run into any issues really with a 3090 FE. Hits about the same 350 W power use that most games do. Only thing that seems to get hot is the memory temp, but that seems to happen in all MMORPG games for some reason (Guild Wars 2 gets the memory very hot as well).
>
> Only kinda weird spot is it maxes out the card when waiting in the queue line.

It only seems to be the increased-wattage 3x8-pin cards really having an issue. The reference cards really can't spike that much, since there isn't much more than 375 W physically available.
> It only seems to be the increased-wattage 3x8-pin cards really having an issue. The reference cards really can't spike that much, since there isn't much more than 375 W physically available.

Not true. The only physical limit for wattage available is that of the power supply. At some point it's no longer able to provide additional amps, so the voltage on the output rails drops. The amount of amperage available on each of the 8-pin cables is not limited in any other way, though. Even a card with only two connectors could conceivably draw 500+ watts (that is, 42+ amps) from the power supply - it would just exceed the ratings of the cables and connectors, and at some point the resistance through them causes the voltage to droop.

When current limiting happens, it's on the demand side. Circuitry on the card is actively used to reduce the number of amps that flow through the buck converters and into the logic. It's not necessarily guaranteed that that circuitry will detect and respond to a situation where more current is being drawn than is allowed. There are a few different ways of monitoring current draw, and some of them are more robust than others.

> Not true. The only physical limit for wattage available is that of the power supply. [...] it would just exceed the ratings of the cables and connectors, and at some point the resistance through them causes the voltage to droop.

That's the point: that circuitry seems to be doing the job on the reference-design cards. The other cards, not so much.
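The figures in these posts line up with the usual spec numbers (~75 W from the PCIe slot, 150 W rated per 8-pin connector, 12 V rail). A quick back-of-the-envelope sketch of the arithmetic; the helper name is just for illustration:

```python
# Rated power budget vs. what the cables will physically pass before droop.
SLOT_W = 75        # PCIe slot, per spec
EIGHT_PIN_W = 150  # rating per 8-pin PCIe connector
RAIL_V = 12.0      # nominal rail voltage

def rated_budget(n_8pin):
    """Spec-rated board power for a card with n 8-pin connectors."""
    return SLOT_W + n_8pin * EIGHT_PIN_W

print(rated_budget(2))  # reference-style 2x8-pin board: 375 W
print(rated_budget(3))  # 3x8-pin partner boards: 525 W

# The "500+ W over two connectors" point: nothing in the cable enforces
# the 150 W rating, so current simply rises until the voltage droops.
draw_w = 500
print(round(draw_w / RAIL_V, 1))  # ~41.7 A total at 12 V
```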
Just an FYI: there is a shitbag going around posting a picture of a blown shunt and trying to pass it off as part of this issue. This is 1000% fake news. That card was sold on eBay as a shunt-modded card a month or two ago; the buyer took the block off, put a stock cooler on it, blew the card up, and fucked my buddy over, and he had to take it back and still has the card. Why this dipshit is doing this I'm not sure, as he doesn't even have that card anymore and has nothing to gain from it other than throwing gasoline on a fire. Only reason I'm posting this is because it's gaining traction for some stupid-ass reason.

So if you see this image being passed around as from New World, it's a lie.
View attachment 378278
> Been playing the New World beta on my Kingpin 3090 and so far no problems that I can see with it. I have the display/visual settings on Ultra/High and I get around 60 FPS or so on my G9 monitor (5,120 x 1,440 resolution). Average temps on the card are in the mid-50s range.

You're not going to have this issue with a Kingpin. They are built to run at a TDP of 1000 W. This game isn't going to overrun the power delivery for that.
The orange glow on the RAM and the smoke and fire look scarily like an EAF running at 45 MW! ;-)
> That picture was something someone created from a very clever fire animation video. It was only obvious it wasn't real by looking at the speed of the flames.

Of course it's fake, but the proximity of the RAM, its color, and the flames and smoke remind me of an EAF running at full power, which I've witnessed in person.
> I'm surprised this hasn't been mentioned, but how power efficient are the new RTX 3000 cards? Nvidia users were making fun of AMD for using a lot of power, but now these cards with a new game can pull enough power to kill themselves. That's nuts.

The reference design is pretty good given the performance, with a max of 350 W. The issue is the designs with increased delivery that will do 400 W+. The power use skyrockets for very minimal gains at that point. 400 W+ for an extra 5-8% tops over the reference's 350 W ain't worth it IMO, and the spikes really only seem to happen on the non-reference cards.
> I'm surprised this hasn't been mentioned, but how power efficient are the new RTX 3000 cards? [...]

It may not be a matter of how much power it's pulling, but more a matter of a state of oscillation or hysteresis that puts the power delivery circuits into a vicious feedback loop to self-destruction!
> It may not be a matter of how much power it's pulling, but more a matter of a state of oscillation or hysteresis that puts the power delivery circuits into a vicious feedback loop to self-destruction!

I mean, yeah, but the amount of power these cards can pull seems very inefficient. To me it seems like board partners are trying to differentiate themselves from each other and from the reference design by allowing these cards to pull more power than they should. Nowadays they differentiate themselves by winning benchmarks by a few frames, but this may be at the cost of you not knowing the card will kill itself by turning off the governor, while making the device cheaper to manufacture by putting in less capable components than the reference design. One thing I've learned from watching Actually Hardcore Overclocking is that board partners almost always use inferior VRMs and other components compared to reference, with the exception of Asus.
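Using the ballpark figures from the posts above (350 W reference versus roughly 420 W for about 7% more performance; illustrative numbers, not measurements), the perf-per-watt hit is easy to put a number on:

```python
# Perf-per-watt comparison with illustrative numbers: reference at 350 W
# normalized to 100 FPS, vs. a partner board at ~420 W for ~7% more FPS.
ref_w, ref_fps = 350, 100.0
aib_w, aib_fps = 420, 107.0

ref_eff = ref_fps / ref_w  # FPS per watt, reference
aib_eff = aib_fps / aib_w  # FPS per watt, partner board

print(round(ref_eff, 3))  # ~0.286 FPS/W
print(round(aib_eff, 3))  # ~0.255 FPS/W
print(f"{(1 - aib_eff / ref_eff) * 100:.0f}% worse perf/W")  # ~11% worse
```

So +20% power for +7% frames works out to roughly an 11% efficiency loss, which is the "very minimal gains" point above in numbers.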
> Hrmmm, all those making fun of my 6900 XT.... on this site too!
>
> My 6900 XT is not bricking.
>
> While your 3090, well...... RIP.
>
> Glad I got rid of my EVGA 3090. AMD has been a winner through and through for me this round.

I bet on the 3090 at least it has something to do with overheating on the backside.... my 3090 would about burn me before I added fans blowing on the back, before I put it on water.
> Speaking of this, did anyone ever figure out what was causing Space Invaders on the 2080 Ti?

I believe Kyle did, but he wouldn't say what it was, other than that it was fixed.
> EVGA is scalping people big time for the RMA of these cards, lol.
>
> https://www.techpowerup.com/285288/...er-level-pricing-for-advanced-gpu-rma-program
>
> Might as well gift some leftover PowerLink pieces-of-fiasco shit with them when the cards come back to their owners. EVGA being EVGA, as usual.
So then don't do the advanced RMA and just RMA it as usual.