04dcarraher's forum posts

Avatar image for 04dcarraher
04dcarraher

23832

Forum Posts

0

Wiki Points

47

Followers

Reviews: 2

User Lists: 0

#1  Edited By 04dcarraher
Member since 2004 • 23832 Posts

@sew333:

Just keep it in mind.

#2 Edited By 04dcarraher

@sew333:

Motherboard makers, and even Intel, have been pushing 13900K and 14900K chips to the brink of instability by feeding them more power to raise clock rates. If you're seeing crashes and errors like "out of VRAM", that could be the culprit. It wouldn't hurt to cap the power limit to preserve your CPU's lifespan and stability.

#3 04dcarraher

@sew333:

Turn off all motherboard CPU boosting in the BIOS and use only Intel's "stock" profile, which limits the CPU to 253W.
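On Linux, the same cap can also be applied from software through the kernel's RAPL sysfs interface; a rough sketch below (the path and constraint layout vary by system, and the BIOS setting is the more permanent fix, so treat this as an illustration):

```python
# Sketch: capping an Intel CPU's package power limit from Linux via the
# RAPL sysfs interface, as an alternative to the BIOS setting. Run as root.
# On most systems constraint_0 is the long-term limit (PL1) and
# constraint_1 the short-term limit (PL2).
RAPL_PKG = "/sys/class/powercap/intel-rapl/intel-rapl:0"

def watts_to_microwatts(watts: float) -> int:
    """RAPL sysfs files express power limits in microwatts."""
    return int(watts * 1_000_000)

def cap_power(watts: float, constraint: int = 0, rapl_dir: str = RAPL_PKG) -> None:
    """Write a power limit, in watts, to the given RAPL constraint."""
    path = f"{rapl_dir}/constraint_{constraint}_power_limit_uw"
    with open(path, "w") as f:
        f.write(str(watts_to_microwatts(watts)))

# Usage (root required): cap both PL1 and PL2 to Intel's 253 W stock limit.
# cap_power(253, constraint=0)
# cap_power(253, constraint=1)
```

The change made this way does not survive a reboot, which is another reason the BIOS profile is the better long-term answer.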

#4 04dcarraher
@cyprusx said:

@BassMan:

According to these two tech videos you can still get noise if the rad is above the CPU, but it still performs well, unless I misunderstood. I'm not talking about the cooler's performance, only the noise issue.

https://www.youtube.com/watch?v=BbGomv195sk&ab_channel=GamersNexus

https://www.youtube.com/watch?v=DKwA7ygTJn0&t=361s&ab_channel=JayzTwoCents

The slightly higher noise output is caused by fan placement relative to the rad in the case. The reason you're not seeing a real difference between a 360mm and a 420mm rad is that your heat input isn't exceeding the cooling potential of either one.

The only way you would notice a cooling performance difference between the two is with more heat going into the loop, saturating the AIOs. Another issue is the design of newer CPUs: their smaller cores concentrate heat into one area of the heat spreader, which doesn't transfer heat as efficiently as cores with a larger surface area.
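A toy model makes the saturation point concrete. Assuming each radiator sheds heat in proportion to its coolant-to-ambient delta (the dissipation figures below are illustrative guesses, not measurements of any real AIO):

```python
# Back-of-the-envelope sketch of why a 360 mm and a 420 mm radiator perform
# alike until the loop is saturated. We model each radiator as dissipating
# heat in proportion to the coolant-to-ambient temperature delta.

def coolant_delta(heat_w: float, dissipation_w_per_k: float) -> float:
    """Steady-state coolant temperature rise above ambient, in kelvin."""
    return heat_w / dissipation_w_per_k

# Assumed dissipation capability (watts per kelvin of coolant-ambient delta).
RAD_360 = 25.0
RAD_420 = 29.0  # roughly 17% more fin area

for load in (150, 250, 450):  # CPU heat dumped into the loop, watts
    d360 = coolant_delta(load, RAD_360)
    d420 = coolant_delta(load, RAD_420)
    print(f"{load:>3} W: 360mm +{d360:.1f} K, 420mm +{d420:.1f} K, "
          f"gap {d360 - d420:.1f} K")
```

Under these assumed numbers, at a typical 150 W gaming load the gap between the two coolant temperatures is under one kelvin, which is invisible in CPU temps; only once the heat input climbs toward saturation does the larger radiator pull meaningfully ahead.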

So grab the AIO with the better warranty and/or the lower price.

#5 04dcarraher

@hardwenzen:

We do see a lot of developers using more advanced engines too, so we can't be that pessimistic. We will see examples of proper optimization.

#6 Edited By 04dcarraher
@hardwenzen said:
@04dcarraher said:

Just as of late there are multiple, and remember, I said upcoming titles made for current gen, not what's available (for the most part). But since you asked, there's Starfield, BG3, Helldivers 2, and now DD2. I'm sure there are way more, but those are the ones that came to mind since I'm playing three of the four.

Starfield runs on a graphics engine still based on the original 2011 version. It can load up 12 threads decently enough, but Bethesda games in general have never run particularly well. BG3 can use 12 threads but only really saturates 8. With BG3, the only real optimization issue was Act 3, inside the city, and that was fixed a while back. Helldivers 2 uses up to 16 threads and loads 12 of them pretty well; a Ryzen 3700X can supply a solid 60+ fps experience with a strong enough GPU. Dragon's Dogma 2 is a mess of a game: it only saturates 8 threads, and even an i9-14900K with an RTX 4090 can't give you a foolproof 60 fps experience at 4K.

#7 Edited By 04dcarraher

@hardwenzen:

OK, what do you regard as CPU-demanding titles? Because right now on PC, Zen 2 3700X/3800X chips paired with GPUs as strong as or stronger than the one going into the Pro can provide a 60 fps experience in properly coded, CPU-demanding titles. Now, will every dev code to the same quality as the more experienced ones? Most likely not. Will we see titles perform below the fps target? Yeah.

But to flat-out state that the Pro won't or can't reach its fps or resolution targets in CPU-demanding titles is just plain wrong. There are too many factors that determine the outcome, but the CPU hardware itself is not the limiting factor for a 60 fps experience.

#8 Edited By 04dcarraher

@hardwenzen said:

A requirement to have constant 60fps with the same CPU

Devs are going to have to start using all available cores and threads more diligently. I've seen people pair RX 6900s and 7800 XTs with Ryzen 3700Xs, and in properly multi-threaded games they can easily hit a 60 fps target at 1440p and 4K. Quite a few games still barely make use of 8 threads, let alone 12 or 16.

Even Alan Wake 2 still has leftover DirectX 11 code, and the game suffers from usage spikes on one or two threads (most likely the only two feeding the vast majority of frame data to the GPU) while the rest of the CPU's threads/cores sit at 10-20% usage. Meanwhile, PC ports done by Nixxes have for the most part been properly coded to use up to 16 threads, allowing Zen 2 CPUs to handle a 60 fps experience with stronger GPUs. Cyberpunk is another example of good coding: a Ryzen 3800X supplies the same average framerate with a 7900 XT as stronger CPUs do, because with RT enabled the 7900 XT is the bottleneck at that point, not the CPU per se.

So it can be done.
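The scheduling pattern being described — fanning one frame's CPU work out across a worker pool instead of piling it onto one or two hot threads — looks roughly like this sketch (Python's GIL serializes the actual compute here, so this shows the structure rather than a real speedup; engines do the same thing with native threads):

```python
# Sketch of the fan-out/join pattern: split one frame's CPU work across a
# worker pool instead of serializing it on one or two threads.
from concurrent.futures import ThreadPoolExecutor
import os

def prepare_chunk(chunk):
    # Stand-in for per-frame CPU work (culling, animation, draw-call prep).
    return sum(x * x for x in chunk)

def prepare_frame(data, workers=None):
    """Divide one frame's workload into per-worker slices, run them on a
    pool sized to the machine's core count, and join the results."""
    workers = workers or os.cpu_count() or 4
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each worker prepares a slice; map joins them in submission order.
        return sum(pool.map(prepare_chunk, chunks))
```

The point of the pattern is that per-frame work scales with core count instead of being bound by the speed of a single hot thread, which is exactly what lets a midrange Zen 2 chip keep a strong GPU fed.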

However, throwing in a GPU that's up to 45% faster in rasterization isn't magically going to let a 30 fps game run at 60, nor let a game running at 40-ish fps hit 60, if the game isn't leveraging all the CPU's resources to feed frame data to the stronger GPU. The magnitude of the jump in possible RT performance should allow more effects and more stable framerates versus the PS5. But all games still have to be designed around the lowest common denominator, so the differences between games won't be much bigger than what we already see between consoles and high-end PCs running the same titles.

Sony and devs are going to leverage the new upscaling to get to that 60 fps target.

#9 Edited By 04dcarraher

Here's an idea: pay an electrician to replace the fuse box. If you don't want to do that, isolate the PC to a room that doesn't share a fused line with a lot of other appliances. Grabbing a UPS will also isolate the PC from surges.

#10 04dcarraher

@BassMan said:

Underwhelming raw performance boost given the time that has passed since the PS5 launched. The addition of AI-assisted rendering is much needed, though. This bodes well for future AMD GPUs on PC too. FSR just isn't good enough compared to DLSS due to the lack of AI processing.

From what I've read RDNA 4 is suppose to have dedicated AI processors like Tensor cores. So we may see FSR get an AI assisted upscaling/frame gen with RDNA 4 onward. RDNA 3 uses WMMA instructions for AI on their compute unit's.