Nvidia dominating over the last 10 years? AMD hits a new low with 10% market share.

Xtasy26

Poll Nvidia dominating over the last 10 years? AMD hits a new low with 10% market share. (19 votes)

Yes. 79%
No. 11%
It's more of a "draw" between the two. 11%

The recent news is that AMD's GPU market share has hit a new low of 10%. Having followed the GPU industry for the past 25+ years, this is the first time I have seen ATI/AMD hit a low like this, which is sad. But it goes back to my point: isn't it obvious that 10 years of nVidia dominance has finally come to its logical conclusion? If we go back over the last 10 years and count the wins vs. losses, it's apparent that nVidia has more wins than losses.

10 years ago nVidia launched the GTX 680, and AMD countered with the HD 7970 GHz Edition. It was a toss-up between the two, with AMD offering 1GB more memory than the 2GB 680. So that's a draw. With the launch of the R9 290X, AMD took back the performance crown, beating the original Titan at half the price of a Titan. Ironically, that was the last high-end AMD GPU line I owned, with the XFX R9 390X Double Dissipation. Then I would say nVidia took the next 3 wins, with Maxwell, Pascal and Turing all beating AMD's high end. Not saying AMD didn't have good GPUs. The RX 580 was good against the GTX 1060 6GB (coming from a 1060 6GB owner, after switching to nVidia 5 years ago), and the RX 5700/XT did well against the RTX 2070 and RTX 2060, but still lost out at the high end to the RTX 2080 Ti or even the 2080. It wasn't until the RX 6900 XT that AMD really tied nVidia, beating the RTX 3090 in rasterization but losing to it in ray tracing (something I am readily willing to admit as someone who got an RTX 3090). So, another draw.

I am going to wait to see where the RX 7900 XTX and the 7900 XT land, as it's been only two and a half weeks. Apparently AMD has its engineers working over the holidays, as there seem to be some bugs, and it's AMD's first multi-die chip. But it looks like it beats the RTX 4080 in rasterization while losing in ray tracing, and it loses by a lot to the RTX 4090.

According to TechPowerUp, though, the 7900 XTX can overclock much higher, up to 3.0 GHz, and get closer to the RTX 4090. It may be that AMD is waiting on driver fixes (and tweaks to release something like an RDNA 3+, as they did with the original Zen+) next year to get closer to the RTX 4090 at a much lower price.

Regardless, I am seeing 2 draws, 1 win and 3 losses for AMD over the past 10 years. So it's clear that nVidia is the undisputed winner of the past 10 years. No?


#1 Pedro
Member since 2002 • 70207 Posts

Oh no! What are we as gamers going to do?


#2 blaznwiipspman1
Member since 2007 • 16590 Posts

@Xtasy26: meh, it's nothing to do with AMD being worse. Even 10 years ago with the 7970, Nvidia took close to 80% market share. People are sheep that fall for marketing, so I'm not surprised here either. Personally, the way I see it, AMD has to compete with a lower price, which is good for consumers. As long as they profit, that's fine.

As for Nvidia, they need to be more careful of Intel, because if Intel actually decides to stick around and keep investing in GPUs, I don't see Nvidia dominating any more. Intel just has too many resources, and their marketing is the best in the industry.


#3  Edited By blaznwiipspman1
Member since 2007 • 16590 Posts

Another thing: market share of what? PC market share is garbage; nobody cares about PC. PC revenue from GPUs keeps declining. The only thing that propped it up all these years was crypto. It's all about consoles, and AMD has a near-100% market share in that segment.

I'd love to hear what Nvidia's profit and revenue are going to be this year and next year.


#4  Edited By Juub1990
Member since 2013 • 12620 Posts

@Xtasy26: What kind of lie is that? The 6900 XT beat the 3090 at 1080p but who buys $1000 GPUs to game at 1080? It was a tie at 1440p, and the 3090 won at 4K. It mopped the floor with the 6900 XT in ray tracing so it most certainly wasn't a draw. The 6900 XT's saving grace was that it wasn't as disgustingly overpriced as the $1500 3090.

The 780 Ti was also the direct competitor to the 290X and beat it. AMD has taken a series of Ls in the last 10 years and this is why their market share is eroding.


#5  Edited By Xtasy26
Member since 2008 • 5582 Posts
@Juub1990 said:

@Xtasy26: What kind of lie is that? The 6900 XT beat the 3090 at 1080p but who buys $1000 GPUs to game at 1080? It was a tie at 1440p, and the 3090 won at 4K. It mopped the floor with the 6900 XT in ray tracing so it most certainly wasn't a draw. The 6900 XT's saving grace was that it wasn't as disgustingly overpriced as the $1500 3090.

I wasn't talking about 1080p. Why the hell would I get an RTX 3090 and game at 1080p? It means nothing to me if the 6900 XT beats my 3090 at 1080p; I was gaming at 4K with my 3090. I am just being as fair and impartial as possible, even as someone who switched sides to nVidia 5 years ago. Funny, I got called an AMD fanboy on these boards when I was with AMD, even though I highlighted their better price/performance. The 3090 did lose to the 6900 XT in rasterization on many benchmarks. I am just being real and honest about it. And there is also the price factor. The 3090 destroyed the 6900 XT in ray tracing, but like you stated, the 6900 XT cost $500 less. So, in a sense, I can argue it was more of a draw when everything is considered.


#6 Juub1990
Member since 2013 • 12620 Posts
@Xtasy26 said:

I wasn't talking about 1080p. Why the hell would I get an RTX 3090 and game at 1080p? It means nothing to me if the 6900 XT beats my 3090 at 1080p; I was gaming at 4K with my 3090. I am just being as fair and impartial as possible, even as someone who switched sides to nVidia 5 years ago. Funny, I got called an AMD fanboy on these boards when I was with AMD, even though I highlighted their better price/performance. The 3090 did lose to the 6900 XT in rasterization on many benchmarks. I am just being real and honest about it. And there is also the price factor. The 3090 destroyed the 6900 XT in ray tracing, but like you stated, the 6900 XT cost $500 less. So, in a sense, I can argue it was more of a draw when everything is considered.

3DCenter compiled over 7000 benchmarks from across 15 review sites. The 3090 is 9.4% faster than the 6900 XT at 4K. In fact, even the 3080 Ti beats the 6900 XT at 4K.

Source

And the fact that the 6900 XT, a $1000 GPU, was unusable with RT was nothing short of a farce. NVIDIA are a bunch of assholes but as long as AMD keeps playing second fiddle, things will keep worsening.
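For anyone curious how a meta-average like 3DCenter's works: per-site relative results are usually combined with a geometric mean, so that no single outlet's test suite dominates the aggregate. A minimal sketch with made-up numbers (not 3DCenter's actual data):

```python
from math import prod

# Hypothetical per-site ratios: RTX 3090 performance relative to the
# 6900 XT at 4K. These values are invented purely for illustration.
site_ratios = [1.08, 1.11, 1.09]

def geometric_mean(ratios):
    # The geometric mean is the standard way to average relative
    # benchmark results: it treats ratios multiplicatively, so a site
    # reporting +10% and one reporting -10% cancel out exactly.
    return prod(ratios) ** (1 / len(ratios))

overall = geometric_mean(site_ratios)
print(f"aggregate: 3090 ahead by {(overall - 1) * 100:.1f}%")
```

With real data across 15 sites, the same calculation lands on one headline number like the 9.4% figure quoted.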


#7 Jag85
Member since 2005 • 19673 Posts

That's probably why GPU prices are insanely high. Healthy competition helps keep prices down. Lack of competition means Nvidia can get away with charging high prices.


#8  Edited By Juub1990
Member since 2013 • 12620 Posts

@Jag85: We can thank both NVIDIA and AMD for that.


#9 Jag85
Member since 2005 • 19673 Posts

@Juub1990: I have no idea how much AMD cards cost nowadays. They haven't been relevant since 2010... AMD have become the Sega of GPUs.


#10 Xtasy26
Member since 2008 • 5582 Posts

@Juub1990 said:
@Xtasy26 said:

I wasn't talking about 1080p. Why the hell would I get an RTX 3090 and game at 1080p? It means nothing to me if the 6900 XT beats my 3090 at 1080p; I was gaming at 4K with my 3090. I am just being as fair and impartial as possible, even as someone who switched sides to nVidia 5 years ago. Funny, I got called an AMD fanboy on these boards when I was with AMD, even though I highlighted their better price/performance. The 3090 did lose to the 6900 XT in rasterization on many benchmarks. I am just being real and honest about it. And there is also the price factor. The 3090 destroyed the 6900 XT in ray tracing, but like you stated, the 6900 XT cost $500 less. So, in a sense, I can argue it was more of a draw when everything is considered.

3DCenter compiled over 7000 benchmarks from across 15 review sites. The 3090 is 9.4% faster than the 6900 XT at 4K. In fact, even the 3080 Ti beats the 6900 XT at 4K.

Source

And the fact that the 6900 XT, a $1000 GPU, was unusable with RT was nothing short of a farce. NVIDIA are a bunch of assholes but as long as AMD keeps playing second fiddle, things will keep worsening.

How many of those benchmarks included ray tracing? I was going by Hardware Unboxed's 12-game benchmark, which focused on rasterization performance, with the 6900 XT beating the RTX 3090 (this is from several months ago with the latest drivers):

Source

Again, I am saying that the 6900 XT offers better value when rasterization and price are considered, so some would call it a draw. To be clear, I think the 3090 is the better product, because it offers value to me that others don't consider.

@blaznwiipspman1 said:

@Xtasy26: meh, it's nothing to do with AMD being worse. Even 10 years ago with the 7970, Nvidia took close to 80% market share. People are sheep that fall for marketing, so I'm not surprised here either. Personally, the way I see it, AMD has to compete with a lower price, which is good for consumers. As long as they profit, that's fine.

As for Nvidia, they need to be more careful of Intel, because if Intel actually decides to stick around and keep investing in GPUs, I don't see Nvidia dominating any more. Intel just has too many resources, and their marketing is the best in the industry.

It wasn't quite that high. It was more like 60% when the 7970 GHz Edition was released.

But I would have to agree. I don't see how AMD can overtake nVidia; the only thing they can do is compete on price. nVidia is too far ahead in terms of marketing and mindshare for AMD to catch up. Having them stay profitable and continue the fight is the best they can hope for.

As for Intel, it would take them several years. I don't see them as a challenger anytime soon, maybe in 5 years. They will be stuck competing in the low end. From what the rumours say, their next GPU, Battlemage, will compete with lower-end parts from AMD and nVidia, not even the mid-range.


#11 pelvist
Member since 2010 • 9001 Posts

The last AMD/ATI card I owned was the Radeon 9600. I switched to Nvidia because the graphics driver issues I was having back then were pissing me off. I've nothing against AMD cards; I've just never had a reason to ditch Nvidia, as I've always been happy with them. For the past 5-8 years though, I do a lot of work that is sped up by CUDA or relies on it for AI neural networks; couple that with the piss-poor performance in RT games in comparison, and it's very, very easily worth the extra cost to go Nvidia.


#12 deactivated-642321fb121ca
Member since 2013 • 7142 Posts

The 6950 XT is a great card. I have always said that AMD can have the best card (they have in the past), and buyers will still prefer Nvidia.

The XTX? Well, they should not have bullshitted the public with cherry-picked slides based on a comparison between a new DDR5 rig and a DDR4 rig.


#13 rmpumper
Member since 2016 • 2149 Posts

It was the quarter before the new GPU releases, so I wouldn't read too much into it for now; gotta see the Q4 numbers first. It makes no sense either way, seeing as the 3000 series is still sold above MSRP outside the US, while 6000 series prices are way down compared to release.


#14  Edited By rmpumper
Member since 2016 • 2149 Posts
@Random_Matt said:

buyers will still prefer nvidia.

As the saying goes, people want AMD to have good GPUs just to be able to buy nvidia at a lower price.


#15 Juub1990
Member since 2013 • 12620 Posts

@Xtasy26: None of the benchmarks in the average I posted use RT. There is a separate chart for 4K/RT. Hardware Unboxed is also included among the 15 review sites' data.


#16 Litchie
Member since 2003 • 34731 Posts

Makes sense. All they're doing is releasing worse GPUs for the same price as Nvidia's. Not the smartest company in the world.


#17 sakaiXx
Member since 2013 • 15984 Posts

Nvidia's range is mostly superb. Their xx50 and xx60 ranges hit the sweet spot for budget PC users, while their laptop offerings were better than AMD's for years, until just 2 years ago.


#18 osan0
Member since 2004 • 17864 Posts

Not good, but AMD are not exactly helping themselves either. Nvidia left the door wide open this gen, and yet AMD found a way to run into the door :S (I say this as a 5800X/6900 XT owner).

RDNA 3 is a fine gen-on-gen improvement over RDNA 2 overall. There are some teething issues, and it would have been nice to see more of a boost to RT (though it's certainly usable on RDNA 3). But the pricing is laughable (that's the story of this new gen of GPUs, really).

RTG's marketing department also needs a slap. They keep putting out unrealistic numbers, and it gives the media easy ammunition to build hype, followed by the inevitable disappointment when the products don't meet it.

Then you have all the hype about the chiplet design and how it will bring down the cost of manufacturing... then they still charge stupid money for their GPUs (the 7900 XT, which is really the 7800 XT, is laughable at its current price). It's a good idea and it's interesting from a technical perspective, but at the end of the day people don't actually care if you can make something cheaper if there is no benefit to their pocket.


#19  Edited By PC_Rocks
Member since 2018 • 8505 Posts

They haven't made a decent GPU in almost a decade and have completely squandered the opportunity twice now. First with the last-gen consoles launching with GCN: imagine the incompetency when you have the same architecture across all gaming machines and yet get your a$$ handed to you by Nvidia. Then the current 4000 series fiasco, and they still come up way short. Hell, Intel put out a more competent GPU, with better RT and ML performance, on their first try than AMD has.


#20 PC_Rocks
Member since 2018 • 8505 Posts
@rmpumper said:
@Random_Matt said:

buyers will still prefer nvidia.

As the saying goes, people want AMD to have good GPUs just to be able to buy nvidia at a lower price.

That's just a f**king myth AMD fanboys tell themselves to feel better; otherwise no one would have bought Ryzen over Intel, and yet here we are. And if it's just brand and mindshare, then care to tell me why AMD is also getting its a$$ whooped in the HPC/data center/AI dept.? Surely those enterprise customers don't care about the brand.

Don't fault buyers for AMD's incompetence in the GPU space.


#21  Edited By HalcyonScarlet
Member since 2011 • 13669 Posts

For me, it's because AMD doesn't get that software performance is the most important thing. Apple get it and I think Nvidia do too.

I don't mind if the competitor's GPU is more powerful. If I have a decent GPU with great software support, I'm happy. I'd consider AMD graphics cards, but I've heard too many stories about their half-arsed drivers and software.

AMD probably thinks it's about marketing and performance.

When I hear "AMD Fine Wine" I just hear that it takes these guys a long time to get the best out of their hardware.

I hear stories about people re-installing older drivers to find the best one, like their latest driver cuts performance here or there. An absolute shit show. What is that? I haven't had to do that in decades. I've only ever needed to install the latest Nvidia driver for their GPUs.


#22 HalcyonScarlet
Member since 2011 • 13669 Posts
@sakaixx said:

Nvidia's range is mostly superb. Their xx50 and xx60 ranges hit the sweet spot for budget PC users, while their laptop offerings were better than AMD's for years, until just 2 years ago.

Definitely impressive. I started out on PC with a GTX 750 Ti and upgraded to a GTX 1060 6GB. These cards (xx50 and xx60) are perfect for mainstream PC gamers.


#23 blaznwiipspman1
Member since 2007 • 16590 Posts

@Xtasy26: it was well north of 60%, my friend; they had at least 70% market share that gen, and the 7970 was the superior product: it had more memory and was cheaper. Honestly, AMD doesn't have the resources to compete against Nvidia properly; their marketing is just far worse. Nvidia was hyping up gimmicky trash like PhysX back then, and now they're hyping up other gimmicky trash like ray tracing this gen. Well, at least they know their customers. The only one that can compete in terms of marketing and break that mindshare is Intel, but they are known quitters. They quit immediately if things don't go their way. Bunch of sad sacks. But now it looks like they have no choice, with the CPU market more competitive than ever before. Intel should have entered the market back in 2005, heck even 2010, and stayed in it. These guys are a huge disgrace, that's all I can say.


#24  Edited By blaznwiipspman1
Member since 2007 • 16590 Posts

@HalcyonScarlet: this only has relevance for the PC market, and honestly the PC market is small compared to the console market. PCs are generally trash; consoles are where it's at. If you compare, AMD doesn't have "driver issues" on consoles, and things just work out of the box. In fact there are fewer driver issues than with even the best Nvidia card on PC. Let that sink in.

For me, I'm good as long as AMD keeps selling their amazing GPUs at a fantastic price to console makers. This is the only thing that matters to me. PC gaming is second-rate.

The best part of PC gaming is the Steam Deck, and heck, even it runs on AMD hardware. It runs flawlessly, btw.


#25 deactivated-642321fb121ca
Member since 2013 • 7142 Posts

@HalcyonScarlet said:
@sakaixx said:

Nvidia's range is mostly superb. Their xx50 and xx60 ranges hit the sweet spot for budget PC users, while their laptop offerings were better than AMD's for years, until just 2 years ago.

Definitely impressive. I started out on PC with a GTX 750 Ti and upgraded to a GTX 1060 6GB. These cards (xx50 and xx60) are perfect for mainstream PC gamers.

You live in la-la land if you think this gen's 50/60 cards are going to be good.


#26  Edited By PC_Rocks
Member since 2018 • 8505 Posts
@blaznwiipspman1 said:

@HalcyonScarlet: this only has relevance for the PC market, and honestly the PC market is small compared to the console market. PCs are generally trash; consoles are where it's at. If you compare, AMD doesn't have "driver issues" on consoles, and things just work out of the box. In fact there are fewer driver issues than with even the best Nvidia card on PC. Let that sink in.

For me, I'm good as long as AMD keeps selling their amazing GPUs at a fantastic price to console makers. This is the only thing that matters to me. PC gaming is second-rate.

The best part of PC gaming is the Steam Deck, and heck, even it runs on AMD hardware. It runs flawlessly, btw.

Forget Nvidia and Intel; even AMD themselves make less money from consoles than from their other segments. The only reason AMD provides chips for Xbox/PS is that they couldn't compete in the other segments and had to settle for the scraps, and also because Nvidia (and probably Intel too) declined.


#27  Edited By blaznwiipspman1
Member since 2007 • 16590 Posts

@pc_rocks: the PC is scraps; nobody cares about PC gaming except a few people who are willing to get bent over and pretend to be cool because of something that looks like an oversized brick. Too much compensation, my friend.

I honestly don't care how much money AMD makes off their GPUs; as long as they profit from it, that's good. The PC gamers can get scammed by Nvidia... the rest of us will stick to console gaming. Heck yeah!

But anyway, it's going to be interesting to see Scamvidia's revenues this year, next year, and going forward as Intel ramps things up. Fun times.


#28 PC_Rocks
Member since 2018 • 8505 Posts

@blaznwiipspman1 said:

@pc_rocks: the PC is scraps; nobody cares about PC gaming except a few people who are willing to get bent over and pretend to be cool because of something that looks like an oversized brick. Too much compensation, my friend.

I honestly don't care how much money AMD makes off their GPUs; as long as they profit from it, that's good. The PC gamers can get scammed by Nvidia... the rest of us will stick to console gaming. Heck yeah!

But anyway, it's going to be interesting to see Scamvidia's revenues this year, next year, and going forward as Intel ramps things up. Fun times.

My dear sweet summer child, console gaming exists because of PC gaming, or else you would have transitioned to mobile gaming long ago. It's the PC market that subsidizes the console market; without it there would be no investment in consumer CPUs, GPUs, power supplies, RAM, you name it. If you asked AMD today to take 10% more share of PC gaming at the cost of their console supply deals, they would accept in a heartbeat. Didn't you get it when I said that AMD makes more money in the PC space than they get from consoles? Besides, consoles are themselves dying.

Well, I'm pretty sure Nvidia's revenues will be in line with the rest of the industry's decline, nothing major like you're expecting. And so far Intel has taken share from AMD, not Nvidia, just like I predicted. Forget the GPU market; Intel and AMD CPUs would both have been in trouble by now if it weren't for MS's incompetency. They should both worry about that first.


#29 HalcyonScarlet
Member since 2011 • 13669 Posts

@blaznwiipspman1 said:

@HalcyonScarlet: this only has relevance for the PC market, and honestly the PC market is small compared to the console market. PCs are generally trash; consoles are where it's at. If you compare, AMD doesn't have "driver issues" on consoles, and things just work out of the box. In fact there are fewer driver issues than with even the best Nvidia card on PC. Let that sink in.

For me, I'm good as long as AMD keeps selling their amazing GPUs at a fantastic price to console makers. This is the only thing that matters to me. PC gaming is second-rate.

The best part of PC gaming is the Steam Deck, and heck, even it runs on AMD hardware. It runs flawlessly, btw.

That's because AMD doesn't write the drivers for consoles. Also, RDNA 2 struggles with ray tracing on consoles, so it's hardly the perfect GPU.


#30 hardwenzen
Member since 2005 • 39691 Posts

@BassMan So I ordered a 3070 (Palit GamingPro) in used condition for $418 CAD. What's your opinion on my current situation, and what would you do to improve it?


#31 04dcarraher
Member since 2004 • 23832 Posts

@pc_rocks said:

They haven't made a decent GPU in almost a decade and have completely squandered the opportunity twice now. First with the last-gen consoles launching with GCN: imagine the incompetency when you have the same architecture across all gaming machines and yet get your a$$ handed to you by Nvidia. Then the current 4000 series fiasco, and they still come up way short. Hell, Intel put out a more competent GPU, with better RT and ML performance, on their first try than AMD has.

I wouldn't go that far and say they haven't made a decent GPU in a decade. GCN, introduced with the 7000 series, brought async compute handled by an onboard hardware scheduler rather than relying on the CPU (software). It was a good feature, and one that took Nvidia until Pascal to implement. Then AMD's 290 series was another good competitor vs. Nvidia at the time, but the crypto boom killed the availability and pricing for that series, taking away another high-end gaming GPU option.

The fact of the matter is that most of AMD's issues were always on the software/driver side, which held back their hardware. It took them years to fine-tune their drivers to maximize their GCN architecture (this is where that "fine wine" term came from). It's not that they were working magic to get more performance out of nowhere; there was performance left on the table from the start, and it took time to correct it.

If a new game launched on DX11 and AMD didn't have a profile for it, the driver reverted to a single CPU core feeding the GPU (which kills performance), instead of using DX11's deferred multithreading. Nvidia typically does not have that issue.

So, to reduce the workload on a driver team that had to create or patch game profiles for every new game, AMD came up with the Mantle API to push closer-to-metal coding. But that also put most of the work onto the developers, to code the game to the driver and let the onboard hardware handle the scheduling/utilization work.

Now, because of the DX12 and Vulkan APIs, we don't really see AMD struggling on that front anymore. Now it's all about feature sets and hardware abilities, where AMD is still lagging behind.
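To make the DX11 point concrete, here's a toy model (illustrative Python, not actual driver code; every name here is invented) of the difference between one core recording a whole frame's draw calls and deferred-context-style recording split across worker threads:

```python
from concurrent.futures import ThreadPoolExecutor

def record(calls):
    # Stand-in for the CPU-side cost of recording draw calls into a
    # "command list". Real drivers do far more work per call.
    return [c * 2 for c in calls]

def submit_single_core(calls):
    # No game profile: everything funnels through one immediate context,
    # so a single CPU core pays the whole frame's recording cost.
    return record(calls)

def submit_deferred(calls, workers=4):
    # Deferred-context style: each worker records a contiguous chunk into
    # its own command list; the immediate context then just executes the
    # pre-built lists in order. (Python's GIL means this toy shows the
    # structure of the split, not a real speedup.)
    size = -(-len(calls) // workers)  # ceiling division
    chunks = [calls[i:i + size] for i in range(0, len(calls), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        command_lists = pool.map(record, chunks)
    merged = []
    for command_list in command_lists:
        merged.extend(command_list)
    return merged

frame = list(range(1000))
# Both paths produce the same frame; only where the CPU cost lands differs.
assert submit_single_core(frame) == submit_deferred(frame)
```

The GPU output is identical either way; what changes is whether one core or several share the recording work, which is exactly why the single-core fallback "kills performance" in CPU-bound DX11 games.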


#32 BenjaminBanklin
Member since 2004 • 11255 Posts

@Jag85 said:

That's probably why GPU prices are insanely high. Healthy competition helps keep prices down. Lack of competition means Nvidia can get away with charging high prices.

I don't buy AMD cards, personally, but there always need to be alternatives in the market. Unchallenged players control the industry however they see fit. 80-class cards under 1000 bucks are likely a thing of the past with Nvidia now. They don't even need to muscle out the competition when they're already strongarming their AIB partners. AMD has a bunch of other ventures they cater to; how much longer do we have before they walk away from PC GPUs due to lack of interest?


#33 BassMan
Member since 2002 • 17888 Posts

@hardwenzen said:

@BassMan So I ordered a 3070 (Palit GamingPro) in used condition for $418 CAD. What's your opinion on my current situation, and what would you do to improve it?

That is a good deal. Sounds like you are gearing up to play your favourite game.... Halo Infinite.

I recommend upgrading your CPU as well if you want to be hitting higher frame rates.


#34  Edited By hardwenzen
Member since 2005 • 39691 Posts
@BassMan said:
@hardwenzen said:

@BassMan So I ordered a 3070 (Palit GamingPro) in used condition for $418 CAD. What's your opinion on my current situation, and what would you do to improve it?

That is a good deal. Sounds like you are gearing up to play your favourite game.... Halo Infinite.

I recommend upgrading your CPU as well if you want to be hitting higher frame rates.

It's simply not worth it because of DDR5. Since it's new, the prices are insane. I am not paying $200+ before taxes for two 8GB sticks. Going by benchmarks, the 9600K is still pretty good when paired with a 3070, and that's at stock clocks. Mine is running at 4.8 GHz, so it should be even better. I will upgrade the CPU, mobo, and RAM next Black Friday, when the new CPUs from AMD and Intel are released and DDR5 is not a rip-off.


#35 Gatygun
Member since 2010 • 2709 Posts

Nobody wants AMD, for the following reasons:

1) Overpriced.

2) NVENC: AMD has no answer to it, which has basically made them useless for a good decade for anyone who streams or records.

3) Constant driver issues. Even if they are fixed right now, people have PTSD from the years before. Everybody gave AMD a chance at some point, but they burned their bridges hard because of shit driver support.

4) Endless hardware issues: even their 7000 series now comes with cooler issues, and before that there were endless memory problems, as they cheap out on hardware.

5) Zero support in games; half the time it takes months before a fix even comes out, and at that point nobody cares anymore.

6) They drop hardware, and when their next new hardware piece is out, they forget about your old card.

7) Way behind with DLSS. Even now, finally starting to catch up with FSR 2, they are still behind. DLSS kinda killed rasterization performance benchmarks entirely.

8) Way behind on RT: even their flagship 7900 XTX, which they try to sell for 1k euros, can't beat a last-generation 3080.

Why spend 1k when you can't even enable top-end features in PC games (again, a pricing issue)?

9) Low market share = low support in games.

Versus Nvidia: yes, more expensive, but at least your shit works, and that goes a long way toward justifying the price.

AMD's PR in the PC space is basically junk-tier on the GPU front.


#36 Xtasy26
Member since 2008 • 5582 Posts

@Juub1990 said:

@Xtasy26: None of those benchmarks I posted in the average uses RT. There is a separate chart for 4K/RT. Hardware Unboxed is also included among the 15 review sites data.

I was going by Hardware Unboxed's data. Also, mine used the newer drivers; a lot of those reviews were using older drivers. Generally, AMD/ATI drivers get better over time, along with performance.


#37 HalcyonScarlet
Member since 2011 • 13669 Posts

@Random_Matt said:
@HalcyonScarlet said:
@sakaixx said:

Nvidia's range is mostly superb. Their xx50 and xx60 ranges hit the sweet spot for budget PC users, and their laptop offerings were better than AMD's for years, until just two years ago.

Definitely impressive. I started out on PC with a GTX 750 Ti and upgraded that to a GTX 1060 6GB. These cards (xx50 and xx60) are perfect for mainstream PC gamers.

You live in La La Land if you think this gen's 50/60 cards are going to be good.

Didn't comment on that; I just said that in general these are usually good cards for mainstream gamers. :-S


#38  Edited By InEMplease
Member since 2009 • 7461 Posts

My mini RTX 2070 super saiyan can still play RollerCoaster Tycoon. Radeon sucks.


#39 Juub1990
Member since 2013 • 12620 Posts

@Xtasy26: The drivers in your benchmark are older than the ones used by 3DCenter, which are from December. The 6900 XT didn't claw back 10% performance at 4K.


#40 PC_Rocks
Member since 2018 • 8505 Posts

@04dcarraher said:
@pc_rocks said:

They haven't made a decent GPU in almost a decade and have completely squandered the opportunity twice now. First with last-gen consoles launching with GCN: imagine the incompetence when you have the same architecture across all gaming machines and still get your a$$ handed to you by Nvidia. Then the current 4000-series fiasco, and they're still coming up way short. Hell, Intel put out a more competent GPU in RT and ML performance on their first try than AMD.

I wouldn't go that far and say they haven't made a decent GPU in a decade. GCN, introduced with the 7000 series, brought async compute handled by an on-board hardware scheduler rather than relying on the CPU (software). It was a good feature that took Nvidia until Pascal to implement. AMD's 290 series was another good competitor against Nvidia at the time, but the crypto boom killed availability and pricing for that series, which would otherwise have given people another high-end gaming GPU option.

Fact of the matter is that most of AMD's issues were always on the software/driver side, which held back their hardware. It took them years to fine-tune their drivers to get the most out of the GCN architecture (this is where the "fine wine" term came from). It's not that they were working magic to pull extra performance out of nowhere; there was performance left on the table from the start, and it took time to claw it back.

If a new game launched on DX11 and AMD didn't have a profile for it, the driver reverted to a single CPU core to feed the GPU (which kills performance) instead of using DX11's deferred multithreading. Nvidia typically didn't have that issue.

So, to reduce the workload on a driver team that had to create or patch game profiles for every new game, AMD came up with the Mantle API to push closer-to-metal coding. That also put most of the work onto developers to code the game to the hardware and let the on-board hardware handle the scheduling/utilization work.

Now, because of the DX12 and Vulkan APIs, we don't really see AMD struggling on that front anymore. Now it's all about feature sets and hardware capabilities, where AMD is still lagging behind.

They introduced async compute with a hardware scheduler and Nvidia didn't, because Nvidia's scheduling was already much better. AMD was the one leaving performance on the table, not Nvidia, and benchmarks proved it. Nvidia only implemented it when it became necessary for them to. The fact is AMD continues to trail Nvidia in features with their reactionary approach. They haven't managed to do anything Nvidia hasn't already done, and done much better.
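04dcarraher's point about the DX11 single-core fallback versus DX12/Vulkan-style multithreaded command recording can be sketched with a toy cost model. All the numbers below are invented for illustration; this is not real driver code.

```python
# Toy model of why a DX11 driver falling back to one CPU core bottlenecks a
# frame, versus DX12/Vulkan-style multithreaded command-list recording.
# Costs are invented illustrative numbers, not measurements.

def frame_time_single_threaded(draw_calls: int, cost_per_call_us: float) -> float:
    # Fallback path: one core records and submits every draw call serially.
    return draw_calls * cost_per_call_us

def frame_time_multi_threaded(draw_calls: int, cost_per_call_us: float,
                              threads: int, submit_overhead_us: float) -> float:
    # Explicit-API path: worker threads record command lists in parallel,
    # then one comparatively cheap submission hands the lists to the GPU queue.
    recording = (draw_calls * cost_per_call_us) / threads
    return recording + submit_overhead_us

calls, cost = 10_000, 2.0                                   # 10k draws, 2 us each
single = frame_time_single_threaded(calls, cost)            # 20,000 us = 20 ms
multi = frame_time_multi_threaded(calls, cost, 8, 500.0)    # 3,000 us = 3 ms
print(f"serial: {single / 1000:.1f} ms, threaded: {multi / 1000:.1f} ms")
```

Under this toy model the serial path blows the 16.7 ms budget for 60 fps while the threaded path stays well under it, which is roughly the failure mode the "reverts to a single CPU core" line describes.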


#41 PC_Rocks
Member since 2018 • 8505 Posts
@hardwenzen said:
@BassMan said:
@hardwenzen said:

@BassMan so i ordered a 3070 (Palit gaming pro) in used condition for $418cad. What's your opinion about my current situation, and what will you do to improve it?

That is a good deal. Sounds like you are gearing up to play your favourite game.... Halo Infinite.

I recommend upgrading your CPU as well if you want to be hitting higher frame rates.

Its simply not worth it because of DDR5. Since its new, the prices are insane. I am not paying $200+ before taxes for two sticks of 8gb. Going by benchmarks, the 9600k is still pretty good when paired with a 3070, and that's on stock clocks. Mine is running at 4.8ghz, so it should be even better. Will upgrade the cpu, mobo and ram next blackfriday when new cpu's are released from amd and intel and DDR5 is not a ripoff.

Where the f**k are you buying 16GB DDR5 for 200 bucks? In Europe I can easily get 32GB DDR5 for 160-200, and that's including VAT. The only really expensive sh*t with Zen 4 is the motherboards.
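For what it's worth, the per-gigabyte math the two posts are arguing over looks like this. Prices are the posters' own figures and in different currencies (CAD vs EUR), so it's only a rough comparison, not like-for-like.

```python
# Per-gigabyte cost for the RAM kits quoted in the thread. The figures are
# taken from the posts above (different currencies), purely for illustration.

def price_per_gb(price: float, capacity_gb: float) -> float:
    return price / capacity_gb

cad_16gb = price_per_gb(200, 16)  # the 2x8GB DDR5 kit quoted in CAD
eur_32gb = price_per_gb(180, 32)  # the mid-range 32GB figure quoted in EUR
print(f"{cad_16gb:.2f} CAD/GB vs {eur_32gb:.2f} EUR/GB")
```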


#42 hardwenzen
Member since 2005 • 39691 Posts

@pc_rocks said:
@hardwenzen said:
@BassMan said:
@hardwenzen said:

@BassMan so i ordered a 3070 (Palit gaming pro) in used condition for $418cad. What's your opinion about my current situation, and what will you do to improve it?

That is a good deal. Sounds like you are gearing up to play your favourite game.... Halo Infinite.

I recommend upgrading your CPU as well if you want to be hitting higher frame rates.

Its simply not worth it because of DDR5. Since its new, the prices are insane. I am not paying $200+ before taxes for two sticks of 8gb. Going by benchmarks, the 9600k is still pretty good when paired with a 3070, and that's on stock clocks. Mine is running at 4.8ghz, so it should be even better. Will upgrade the cpu, mobo and ram next blackfriday when new cpu's are released from amd and intel and DDR5 is not a ripoff.

Where the f**k are you buying 16GB DDR5 for 200 bucks? In Europe I can easily get 32GB DDR5 for 160-200 and that including VAT. The only really expensive sh*t with Zen4 are the motherboards.

It's Canadian dollars, u noob. They're expensive af, and I don't see myself wasting $250 on RAM. And then there are the mobos as well. F that.


#43 PC_Rocks
Member since 2018 • 8505 Posts

@hardwenzen said:
@pc_rocks said:

Where the f**k are you buying 16GB DDR5 for 200 bucks? In Europe I can easily get 32GB DDR5 for 160-200 and that including VAT. The only really expensive sh*t with Zen4 are the motherboards.

Its Canadian dollars u noob. They're expensive af, and i don't see myself wasting $250 for ram. And then there's mobos as well. F that.

All those are 32GB. You quoted the price of 16GB.


#44 hardwenzen
Member since 2005 • 39691 Posts

@pc_rocks said:
@hardwenzen said:
@pc_rocks said:

Where the f**k are you buying 16GB DDR5 for 200 bucks? In Europe I can easily get 32GB DDR5 for 160-200 and that including VAT. The only really expensive sh*t with Zen4 are the motherboards.

Its Canadian dollars u noob. They're expensive af, and i don't see myself wasting $250 for ram. And then there's mobos as well. F that.

All those are 32GB. You quoted the price of 16GB.

There's barely any choice for 16GB, and most of the models are trash with no reviews.


#45 Xtasy26
Member since 2008 • 5582 Posts

@Juub1990 said:

@Xtasy26: The drivers in your benchmark are older than the ones used by 3DCenter which are from December.The 6900 XT didn’t claw back 10% performance at 4K.

Where exactly does it say all the compilations are from December? I don't see how each and every benchmark is from December. The ones from Hardware Unboxed are from several months ago.


#46  Edited By Xtasy26
Member since 2008 • 5582 Posts
@pelvist said:

Last AMD/ATI card I owned was the Radeon 9600. I switched to Nvidia because the graphics driver issues I was having back then were pissing me off. I've nothing against AMD cards; I've just never had a reason to ditch Nvidia, always been happy with them. For the past 5-8 years, though, I've done a lot of work that is sped up by CUDA or relies on it for AI neural networks; couple that with the piss-poor RT performance in games by comparison, and it's very, very easily worth the extra cost to go Nvidia.

A lot of applications leverage CUDA. That's part of the reason Nvidia is so successful: almost all AI and neural network work runs on Nvidia hardware, and AMD is nowhere to be seen. As for RT games, being a single-player guy and a general graphics w**re, I got to experience Cyberpunk at 4K with ray tracing at the DLSS Quality setting. That's an experience I wouldn't have had with AMD, since DLSS 2.3 did a really good job of getting almost native-quality graphics. So, in a sense, getting the 3090 was worth it. For non-RT games, getting AMD or using an older non-ray-tracing GPU like my GTX 1060 6GB makes sense. But if you want the best graphical fidelity, Nvidia is the way to go.

@blaznwiipspman1 said:

@Xtasy26: it was well north of 60% my friend, they had at least 70% market share that gen and the 7970 was the superior product, had more memory and was cheaper. Honestly, AMD doesn’t have the resources to compete against nvidia properly, their marketing is just far worse. Nvidia was hyping up gimmicky trash like physx back then and now they’re hyping up some other gimmicky trash like raytracing this gen. Well, at least they know their customers. The only one that can compete in terms of marketing and breaking that mindshare is intel, but they are known quitters. They quit immediately if things don’t go their way. Bunch of sad sacks. But now it looks like they have no choice with the cpu market being more competitive than ever before. Intel should have entered the market back in 2005, heck even 2010, and stayed in it. These guys are a huge disgrace, that’s all I can say.

From Q1 to Q4 of 2012 it was between 62-66%. I'm nitpicking, but not quite 70%. But yes, I'd argue the 7970 was the superior product because of that extra 1GB. And yes, they didn't have the resources to compete during 2012-2017; for those five years AMD was putting its R&D money into Ryzen. Hence you started to see AMD lose its grasp on the high end. The last win, ironically, came around that time (the R9 290X, which was already in development) as resources started shifting toward Ryzen and the graphics division had layoffs. You started to see AMD lose in the subsequent generations.

The R9 Fury X that came afterwards lost to the 980 Ti, which had an extra 2GB. AMD made a dumb move by putting liquid cooling on it and charging the same as the 980 Ti. I readily admit I bought one on a whim, but I got it $50 cheaper than the 980 Ti. I hardly gamed on it, though, so I wouldn't count it on my list of graphics cards I've owned, as I never played through and completed any games on it (didn't have time, and sold it). No way would I have bought it at the same price as the 980 Ti; the Fury X should have been $100 cheaper. Then you had the Vega 64, which used expensive HBM2 memory and slightly lost to the GTX 1080, as AMD was trying to use that GPU as a jack of all trades, with its derivatives going to the data center; I'm guessing because they didn't have the money to make a GPU dedicated specifically to gaming. Then you had RDNA 1, which was geared towards gaming but only competed with the RTX 2060 and 2070 and fell short of the 2080 Ti. So, another loss in the high end. That's three losses in a row after their win with the R9 290X.

Intel is disappointing on another level. They were dumb enough to quit after they launched the i740 GPU back in 1998. At least AMD was smart enough to buy ATI in 2006; their GPU division has carried them when their CPU division was struggling, and it let them win the console contracts, which provided steady income. Now Intel is trying to get into GPUs, but they've incurred something like $3 billion in losses, and the A770 barely competes with the RX 6600 XT, which has better performance and driver support. They're using a die the size of a 3070 Ti's, yet losing to a cheaper GPU that costs less to make. I know the drivers are improving, but it's safe to say they aren't making money.

You're right, though: they should have kept trying with GPUs, even in 2010. Their other failed GPU was Larrabee back in 2008; Larrabee wouldn't have been competitive had they released it then, as it would have lost to the HD 4870 and the GTX 260. But imagine if they had kept releasing GPUs, at least in the mid-to-low end, since 2010: they would have improved their drivers over the DX9/10/11 generations and gained more experience in GPU development. Instead, they're now trying to catch up after 20+ years and taking L's left and right, especially with all the money they're losing. We'll see how the Battlemage GPUs do in the next year or two. I have my doubts when the competitors have been building high-end or near-high-end GPUs for the last 20+ years, along with 20+ years of driver development.


#47 PC_Rocks
Member since 2018 • 8505 Posts

@hardwenzen said:
@pc_rocks said:

All those are 32GB. You quoted the price of 16GB.

There's barely any choice for 16gb, and most models are trash with no reviews.

I wouldn't recommend going with 16GB anyway, unless you want to change it again within a year or maybe less. Several games have started listing 16GB as a minimum requirement, IIRC; probably overshooting, but that gives you an idea.


#48 hardwenzen
Member since 2005 • 39691 Posts

@pc_rocks said:
@hardwenzen said:
@pc_rocks said:

All those are 32GB. You quoted the price of 16GB.

There's barely any choice for 16gb, and most models are trash with no reviews.

I won't recommend going with 16GB anyway unless you want to change it again within a year or may be less. Several games have started listing 16GB as min requirement IIRC, probably overshooting but that gives you an idea.

They do, but do we really need that much? I've heard those are fake-ass requirements.


#49 PC_Rocks
Member since 2018 • 8505 Posts

@hardwenzen said:
@pc_rocks said:
@hardwenzen said:
@pc_rocks said:

All those are 32GB. You quoted the price of 16GB.

There's barely any choice for 16gb, and most models are trash with no reviews.

I won't recommend going with 16GB anyway unless you want to change it again within a year or may be less. Several games have started listing 16GB as min requirement IIRC, probably overshooting but that gives you an idea.

They do,but do we really need that much? I've heard that they're fake ass requirements

Yup, but that tells you where they're going in the future (hint: sh*t optimization). It's pretty much an attempt to normalize it. Either way, you should have a bit more system RAM than your GPU has VRAM, as a buffer.


#50 hardwenzen
Member since 2005 • 39691 Posts

@pc_rocks said:
@hardwenzen said:
@pc_rocks said:
@hardwenzen said:
@pc_rocks said:

All those are 32GB. You quoted the price of 16GB.

There's barely any choice for 16gb, and most models are trash with no reviews.

I won't recommend going with 16GB anyway unless you want to change it again within a year or may be less. Several games have started listing 16GB as min requirement IIRC, probably overshooting but that gives you an idea.

They do,but do we really need that much? I've heard that they're fake ass requirements

Yup but that tells you where they are going in the future (HINT: sh*t optimization). It's pretty much an attempt to normalize it. Either way, you should have a bit of buffer above your VRAM in GPU.

Meh, I have 16GB of DDR4 and 8GB on the 3070. Should be fine for 1440p, and I don't plan on upgrading to 4K until the xx60-tier GPUs can handle it easily enough.
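For what it's worth, a 16GB RAM / 8GB VRAM split clears the "buffer above your VRAM" rule of thumb pc_rocks mentioned. As a toy check (the 1.5x ratio is a made-up illustrative heuristic, not an official requirement):

```python
# Toy heuristic for the "buffer above your VRAM" rule of thumb: have
# meaningfully more system RAM than the GPU has VRAM, since games can spill
# GPU data into system memory. The 1.5x ratio is an illustrative assumption,
# not a measured requirement.

def ram_headroom_ok(system_ram_gb: float, vram_gb: float, ratio: float = 1.5) -> bool:
    return system_ram_gb >= vram_gb * ratio

print(ram_headroom_ok(16, 8))   # 16GB DDR4 vs an 8GB RTX 3070: 16 >= 12
```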