Is 1.84 TFLOPS a lot for a GPU?

This topic is locked from further discussion.


#151 tormentos
Member since 2003 • 33784 Posts
[QUOTE="04dcarraher"] :lol: so many things wrong, go back to Sony's teat.
The only wrong thing in my post comes from the part I quoted. :lol:

#152 04dcarraher
Member since 2004 • 23832 Posts

[QUOTE="04dcarraher"] :lol: so many things wrong go back to sony's teattormentos
The only wrong thing on my post comes from the part i quote.:lol:

You wish... The OS plus the features will use more than 512MB; that's common sense. Next, the PS4's 18 CUs will normally be partitioned into 14+4, with the 4 CUs reserved for compute loads, which means for gaming you will typically see only 14 CUs for pixel pushing; the 4 CUs will likely be used for physics etc. Next, lol, on efficiency turning the tide: depending on the workload, the design is slower than a 7850 or only slightly faster. As long as you have a faster CPU than what's in the PS4 there will be no overhead to worry about. Which means that no matter what they do, the PS4's GPU will never outperform a 7870 or better on the same workloads.
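For reference, the TFLOPS numbers thrown around in this thread fall out of simple arithmetic. A minimal sketch (assuming the commonly reported 800MHz PS4 GPU clock and 64 ALUs per GCN CU, neither of which is stated in the post above):

```cpp
#include <cstdio>

// Theoretical peak = CUs * 64 ALUs per GCN CU * 2 FLOPs per clock (FMA) * clock.
double gflops(int cus, double clock_ghz) {
    return cus * 64 * 2 * clock_ghz;
}

int main() {
    printf("18 CU @ 0.8 GHz: %.1f GFLOPS\n", gflops(18, 0.8)); // 1843.2 -> the "1.84 TFLOPS" in the title
    printf("14 CU @ 0.8 GHz: %.1f GFLOPS\n", gflops(14, 0.8)); // 1433.6 -> the "~1.4 TFLOPS" figure
    printf(" 4 CU @ 0.8 GHz: %.1f GFLOPS\n", gflops( 4, 0.8)); //  409.6 -> the "~410 GFLOPS" figure
}
```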


#153 CwlHeddwyn
Member since 2005 • 5314 Posts
[QUOTE="tormentos"][QUOTE="DeX2010"] lol. Consoles are very cost-restricted, so what he is saying is that Sony could've gone with 6GB of RAM and gotten improved versions of other components. RAM isn't everything. 6GB of RAM won't leave it "RAM starved". And neither will it bottleneck the system.

Yeah, a more powerful GPU that can be built into an APU without sending the PS4 a year or more into delay. More power means more heat; more heat means a bigger design and better cooling; a bigger design means more money; better cooling means more money. More power means a higher TDP and bigger heat problems.

Let's see: the last time around, the 360 had 512MB of RAM. In late 2005 most GPUs had 256MB of memory, so the Xbox doubled them in that department. By March 2006, just 4 months after the 360 launched, look at this site's comparison of the 360 version of Oblivion vs the PC version: http://www.gamespot.com/features/the-elder-scrolls-iv-oblivion-xbox-360-versus-pc-6147028/?page=2 The PC version was already beating the Xbox 360 version on textures and sharpness, not to mention resolution, on the 7900GTX, which had 512MB of RAM. After that the PC did not look back, and with each passing year ports of games suffered more on the 360, and even more on the PS3 with its 256MB, which was already a bit tighter than the 360's. In other words, what looks like RAM overkill now will not look that way 2 years from now.

What killed the X360's graphics performance was using a 128-bit databus with the RAM. A comparable PC GPU had a 256-bit databus, so they were operating with a lot more bandwidth.
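A rough sketch of the bandwidth arithmetic behind that point. The clock figures are the commonly cited spec-sheet numbers (my assumption, not something stated in the post):

```cpp
#include <cstdio>

// Theoretical peak bandwidth: (bus width in bits / 8) bytes per transfer
// times the effective transfer rate in MT/s.
double peak_gb_per_s(int bus_bits, double effective_mt_per_s) {
    return (bus_bits / 8.0) * effective_mt_per_s / 1000.0; // GB/s
}

int main() {
    // Commonly published figures: X360 GDDR3 on a 128-bit bus at 1400 MT/s,
    // GeForce 7900 GTX GDDR3 on a 256-bit bus at 1600 MT/s.
    printf("X360:     %.1f GB/s\n", peak_gb_per_s(128, 1400)); // ~22.4 GB/s
    printf("7900 GTX: %.1f GB/s\n", peak_gb_per_s(256, 1600)); // ~51.2 GB/s
}
```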

#154 -Unreal-
Member since 2004 • 24650 Posts

Damn, you're dumb... tormentos

StarTrekFacePalm.gif.


#155 deactivated-5ec2b2cb7a41e
Member since 2008 • 2058 Posts

PC gamers do not get it... at all. They are living on their own planet.
Optimization, people. You can buy the most expensive card, but the game itself is not developed to use it as it should be. This is natural and bound to happen, as the PC is a platform with millions of configurations.
It is like wanting to kill a fly with a bazooka. You'll kill that fly, but firing the bazooka was excessive.


#156 Kinthalis
Member since 2002 • 5503 Posts

PC gamers do not get it... at all. They are living on their own planet.
Optimization, people. You can buy the most expensive card, but the game itself is not developed to use it as it should be. This is natural and bound to happen, as the PC is a platform with millions of configurations.
It is like wanting to kill a fly with a bazooka. You'll kill that fly, but firing the bazooka was excessive.

ioannisdenton

 

You obviously don't know anything about 3D rendering or hardware. So why are you trying to butt into this conversation?

OPTIMIZATIONZ! is the new cry of the ignorant console gamer.

DX11 is a MUCH more optimized API. The difference between optimization on a modern PC and on a modern console is 10-30% at most.

Now, please let the grown-ups talk.


#157 MK-Professor
Member since 2009 • 4214 Posts

PC gamers do not get it... at all. They are living on their own planet.
Optimization, people. You can buy the most expensive card, but the game itself is not developed to use it as it should be. This is natural and bound to happen, as the PC is a platform with millions of configurations.
It is like wanting to kill a fly with a bazooka. You'll kill that fly, but firing the bazooka was excessive.

ioannisdenton

Console gamers were saying the same thing over and over before the PS3: "the PS3 with optimization is going to beat the 8800GTX :cry:". And what you got is the 8800GTX playing every game with better graphics and performance than the PS3. In fact, even a prehistoric ATI X1950 Pro (which has similar power to the Xbox 360's GPU) plays games like Crysis 2 with similar graphics and performance to the consoles: link

Consoles don't run on fairy dust.


#158 stayhigh1
Member since 2008 • 724 Posts

PC gamers spend thousands of dollars on upgrades. In fact, not all PC users have the specs to run all those great-looking games... The PS4 is a high-end PC. I think PC gamers are overestimating what 8GB of GDDR5 RAM can do.


#159 tormentos
Member since 2003 • 33784 Posts
You wish... The OS plus the features will use more than 512MB; that's common sense. Next, the PS4's 18 CUs will normally be partitioned into 14+4, with the 4 CUs reserved for compute loads, which means for gaming you will typically see only 14 CUs for pixel pushing; the 4 CUs will likely be used for physics etc. Next, lol, on efficiency turning the tide: depending on the workload, the design is slower than a 7850 or only slightly faster. As long as you have a faster CPU than what's in the PS4 there will be no overhead to worry about. Which means that no matter what they do, the PS4's GPU will never outperform a 7870 or better on the same workloads. 04dcarraher
Maybe that is why I said the PS4 could use 1GB for the OS. I even went as far as to name 2GB for the OS, which I think is overkill and we all know it; the PS4 is not running Windows.

Dude, you are not READING: there is no 14+4, there are 18 unified CUs. Developers could use 6 for physics and 12 for rendering. Does that mean the 7850 will have more performance? No, because with what will the 7850 emulate that heavy physics? Oh yeah, with those same CUs.

Efficiency means that while the 7870 sits idle waiting because of high latency, the PS4's GPU will be working, because the GPU and CPU being on the same die produces lower latency. AMD equips each HSA APU with two memory controllers, one for the CPU and one for the GPU, which can both access one single DRAM controller. Latency kills performance; Sony knows this, and it is the reason the console is built that way.

#160 04dcarraher
Member since 2004 • 23832 Posts

PC gamers spend thousands of dollars on upgrades. In fact, not all PC users have the specs to run all those great-looking games... The PS4 is a high-end PC. I think PC gamers are overestimating what 8GB of GDDR5 RAM can do.

stayhigh1
lol, if only you actually knew what you were talking about. The 8GB will be broken up between the OS+features, the game, and video... Most gaming PCs already have 8GB or more of memory.

#161 LazySloth718
Member since 2011 • 2345 Posts

DX11 is a MUCH more optimized API.

Kinthalis

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1

http://www.cgarena.com/archives/interviews/john-carmack/john.php


#162 tormentos
Member since 2003 • 33784 Posts
[QUOTE="CwlHeddwyn"] What killed the X360's graphics performance was using a 128-bit databus with the RAM. A comparable PC GPU had a 256-bit databus, so they were operating with a lot more bandwidth.

It is one of the things that hurt it, but on RAM alone the 7900GTX had as much memory as the complete 360 system, and the memory type was the same; it wasn't GDDR5 vs GDDR3, which, had it been the case, would have helped the 360 with faster bandwidth.

#163 04dcarraher
Member since 2004 • 23832 Posts

[QUOTE="04dcarraher"] you wish... The OS plus the features will use more then 512mb.... thats common sense, next the PS4's 18 CU will partitioned normally into 14+4 CU the 4 CU for compute loads which means for main gaming 14CU you will only see typically for pixel pushing, likely the 4CU will be used for physics etc. Next lol, on efficiency turning the tide with a gpu depending on the load design is slower then a 7850 or only slightly faster. As long as you have a faster cpu then whats in the PS4 there will be no overhead to worry about. Which means no matter what they do with the PS4 gpu will never outperform a 7870 or better with same workloads. tormentos
Maybe that is why i say the PS4 could use 1GB for OS,even go as far as to name 2GB for OS which i think is over kill and we all know it,the PS4 is not running windows. Dude you are not READING there is no 14+4 there are 18 unified CU,developer can use 6 for Physics and 12 for rendering,does that mean the 7850 will have more performance.? No because with what the 7850 will emulate those heavy phycis.? OH yeah with those same CU.. Efficiency mean that while the 7870 still waiting because of high latency been idle,the PS4 GPU will be working because of the GPU and CPU been on the same die produce lower latency,AMD equips each HSA APU with two memory controllers, one for the CPU and one for the GPU, that both can access one single DRAM controller,latency kill performance sony knows this,is the reason the console is build that way.

The OS plus all the features they mentioned in the conference point to 2GB of OS+feature usage. Also, they're targeting a 14+4 CU standard: on the 14+4 balance, the 4 CUs (410 GFLOPS) are extra ALU as a resource for compute, i.e. physics. Devs could opt out and use all 18 CUs just for pixel pushing, but then you would not have all the features and would lack advanced physics. It's not clear whether this is a single die or a pair of dies side by side on the same chip; at this stage it's more likely to be a pair of dies side by side, with a fully integrated single-die chip coming along in later revisions once chip yields have improved.

Now, your latency example is really off: it's all about the CPU, its cache and its integrated memory controller, and memory bandwidth is more important than latency, especially for GPUs (for system use, not so much). Also, PC GPUs handle their own memory and can have much higher bandwidth, though they depend on the CPU supplying the data. Which means a 7870 will still outclass the PS4's GPU when you have a good CPU.


#164 tormentos
Member since 2003 • 33784 Posts

[QUOTE="Kinthalis"]

DX11 is a MUCH more optimized API.

LazySloth718

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1

http://www.cgarena.com/archives/interviews/john-carmack/john.php

One of the biggest problems you have on PC, and people don't get it, is that the API just sucks, and coding to the metal on PC is not an option. It's funny when you read someone from Nvidia and someone from AMD say the same thing: APIs hold back GPUs on PC. On consoles you can get way more out of your GPU by just going to the metal, which MS doesn't like.

#165 LazySloth718
Member since 2011 • 2345 Posts

[QUOTE="LazySloth718"]

[QUOTE="Kinthalis"]

DX11 is a MUCH more optimized API.

tormentos

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1

http://www.cgarena.com/archives/interviews/john-carmack/john.php

One of the biggest problems you have on PC, and people don't get it, is that the API just sucks, and coding to the metal on PC is not an option. It's funny when you read someone from Nvidia and someone from AMD say the same thing: APIs hold back GPUs on PC. On consoles you can get way more out of your GPU by just going to the metal, which MS doesn't like.

The API doesn't suck.

It necessarily has a lot of overhead because of the open platform and hardware diversity.

Anyway, PCs make up for it with raw horsepower (if you spend enough on them).


#166 04dcarraher
Member since 2004 • 23832 Posts
[QUOTE="LazySloth718"]

[QUOTE="Kinthalis"]

DX11 is a MUCH more optimized API.

tormentos

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1

http://www.cgarena.com/archives/interviews/john-carmack/john.php

One of the biggest problems you have on PC, and people don't get it, is that the API just sucks, and coding to the metal on PC is not an option. It's funny when you read someone from Nvidia and someone from AMD say the same thing: APIs hold back GPUs on PC. On consoles you can get way more out of your GPU by just going to the metal, which MS doesn't like.

lol, you're goofy and misinformed. If modern APIs suck, then why does a similar-performing GPU on PC vs the 360, for example, perform nearly equally? If to-the-metal coding is so much better, why don't you see current consoles outperforming GPUs that are much stronger on the same workloads? Fact is, modern APIs only have a 10-15% resource overhead over to-the-metal coding; big whoop. Most of the overhead is on the CPU, and that's all: get a slightly stronger CPU and your precious to-the-metal coding gets thrown out the window.

#167 PC_Otter
Member since 2010 • 1623 Posts
[QUOTE="tormentos"][QUOTE="LazySloth718"]

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1

http://www.cgarena.com/archives/interviews/john-carmack/john.php

04dcarraher
One of the biggest problems you have on PC, and people don't get it, is that the API just sucks, and coding to the metal on PC is not an option. It's funny when you read someone from Nvidia and someone from AMD say the same thing: APIs hold back GPUs on PC. On consoles you can get way more out of your GPU by just going to the metal, which MS doesn't like.

lol, you're goofy and misinformed. If modern APIs suck, then why does a similar-performing GPU on PC vs the 360, for example, perform nearly equally? If to-the-metal coding is so much better, why don't you see current consoles outperforming GPUs that are much stronger on the same workloads? Fact is, modern APIs only have a 10-15% resource overhead over to-the-metal coding; big whoop. Most of the overhead is on the CPU, and that's all: get a slightly stronger CPU and your precious to-the-metal coding gets thrown out the window.

This is what I was going to say lol. The CPU is what usually absorbs most of the overhead. Console devs get better at exploiting the hardware, but they also use more aggressive LOD schemes as the systems get older. Still shots are one thing, but a game in motion will easily show the faults of heavy LOD management.

#168 tormentos
Member since 2003 • 33784 Posts
The OS plus all the features they mentioned in the conference point to 2GB of OS+feature usage. Also, they're targeting a 14+4 CU standard: on the 14+4 balance, the 4 CUs (410 GFLOPS) are extra ALU as a resource for compute, i.e. physics. Devs could opt out and use all 18 CUs just for pixel pushing, but then you would not have all the features and would lack advanced physics. It's not clear whether this is a single die or a pair of dies side by side on the same chip; at this stage it's more likely to be a pair of dies side by side, with a fully integrated single-die chip coming along in later revisions once chip yields have improved.

Now, your latency example is really off: it's all about the CPU, its cache and its integrated memory controller, and memory bandwidth is more important than latency, especially for GPUs (for system use, not so much). Also, PC GPUs handle their own memory and can have much higher bandwidth, though they depend on the CPU supplying the data. Which means a 7870 will still outclass the PS4's GPU when you have a good CPU.

04dcarraher
No, it doesn't. The Vita has a horde of apps to run, even party chat, and it has a small OS. For the 3rd time, the PS4 is not running Windows, and several of the PS4's features are handled by separate chips, which means no wasted resources.

The PS4 is even closer to the 7870 than the 7850 is; maybe that is where your confusion lies. The PS4 has 18 CUs, the 7850 has 16 CUs and the 7870 has 20 CUs. With efficiency, and without API constraints, the PS4 should be right there with the 7870, if not over it.

"It's funny," says AMD's worldwide developer relations manager of its GPU division, Richard Huddy. "We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way." Huddy says that one of the most common requests he gets from game developers is: "Make the API go away."

"I certainly hear this in my conversations with games developers," he says, "and I guess it was actually the primary appeal of Larrabee to developers: not the hardware, which was hot and slow and unimpressive, but the software, being able to have total control over the machine, which is what the very best games developers want. By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft, no doubt at all."

"Of course, there are many definite pros to using a standard 3D API. It's likely that your game will run on a wide range of hardware, and you'll get easy access to the latest shader technologies without having to muck around with scary low-level code. However, the performance overhead of DirectX, particularly on the PC architecture, is apparently becoming a frustrating concern for games developers speaking to AMD."

"So what sort of performance overhead are we talking about here? Is DirectX really that big a barrier to high-speed PC gaming? This, of course, depends on the nature of the game you're developing. 'It can vary from almost nothing at all to a huge overhead,' says Huddy. 'If you're just rendering a screen full of pixels which are not terribly complicated, then typically a PC will do just as good a job as a console. These days we have so much horsepower on PCs that at high resolutions you see some pretty extraordinary-looking PC games, but one of the things that you don't see in PC gaming inside the software architecture is the kind of stuff that we see on consoles all the time. On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you can't typically draw more than 2-3,000 without getting into trouble with performance, and that's quite surprising: the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.'"

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1

Just read it, dude, before you reply. Two people, one from Nvidia and another from AMD, saying the same thing: APIs on PC suck, and while you can have a GPU with ten times the performance of a console's, you will not get even close to ten-times-better-looking games. Maybe now you understand why Sony spent more on RAM, which will be crucial, than on the GPU...

The performance difference between the 7850 and the 7870 is minimal, and even lower when the topic is the PS4, which sits between the two. Let's hope you get it this time.
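The draw-call point in that quote can be made concrete with a sketch. The helper functions, `ctx`, and the counts below are illustrative assumptions, but `DrawIndexed` and `DrawIndexedInstanced` are the real Direct3D 11 entry points; the idea is that batching pays the per-call API and driver overhead once instead of thousands of times per frame:

```cpp
#include <d3d11.h>

// Naive: one API round-trip per object, so per-call CPU/driver overhead
// is paid rockCount times every frame.
void DrawRocksNaive(ID3D11DeviceContext* ctx, int rockCount, UINT indexCount) {
    for (int i = 0; i < rockCount; ++i) {
        // (per-object constant-buffer update omitted for brevity)
        ctx->DrawIndexed(indexCount, 0, 0);   // e.g. ~10,000 calls per frame
    }
}

// Batched: one instanced call draws every copy of the mesh, so the
// per-call overhead is paid exactly once.
void DrawRocksInstanced(ID3D11DeviceContext* ctx, int rockCount, UINT indexCount) {
    ctx->DrawIndexedInstanced(indexCount, rockCount, 0, 0, 0); // 1 call per frame
}
```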

#169 04dcarraher
Member since 2004 • 23832 Posts

[QUOTE="04dcarraher"]OS plus all the features they mentioned in the conference point to 2gb for OS+feature usage. Also their targeting 14+4CU standard and about the 14 + 4 balance The 4 CUs (410 Gflops) extra ALU as resource for compute, ie physics, dev's could opt out and use all the 18CU just for pixel pushing but that means you will not have all the features and will have a lack of advanced physics available. It's not clear whether this is a single die, or whether it's a pair of dies side-by-side on the same chip. It's more likely to be the former pair of dies side by side. at this stage, with a fully integrated single-die chip coming along in later revisions once chip yields have improved.

Now your latency example is really off, its all about the cpu,cache and its intergrated memory controller. and with memory bandwidth is more important then latency especially for gpu's system use not so much.  Also gpu's handle their own memory can can have a much higher bandwidth which is dependant on cpu. which means a 7870 will still outclass then PS4's gpu when you have a good cpu..  

tormentos

blah blah blah I think Sony's teh best and they can do anything blah blah blah

...:roll:

Your proof about the API is wrong and has been debunked.

APIs exist for a reason. They do have overhead, but they make sure there is a common programming standard across all levels of hardware. Even if DirectX were removed, there would still be graphics drivers; if we remove those, we remove those graphics companies' source of income from blind/clueless enthusiasts: multi-GPU.

I see this as a way of trying to make game developers do the hard work for them, so that they no longer need to provide such frequent driver updates to optimize for games. AMD being lazy...

CPUs have a common instruction set: x86. Even if software programmers use assembly language to write highly optimized code, they are still issuing x86 opcodes to be decoded by the CPU's decoder. And guess what: the PS4's CPU is x86 :shock:...


#170 tormentos
Member since 2003 • 33784 Posts
[QUOTE="tormentos"][QUOTE="LazySloth718"]

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/1

http://www.cgarena.com/archives/interviews/john-carmack/john.php

04dcarraher
One of the biggest problems you have on PC, and people don't get it, is that the API just sucks, and coding to the metal on PC is not an option. It's funny when you read someone from Nvidia and someone from AMD say the same thing: APIs hold back GPUs on PC. On consoles you can get way more out of your GPU by just going to the metal, which MS doesn't like.

lol, you're goofy and misinformed. If modern APIs suck, then why does a similar-performing GPU on PC vs the 360, for example, perform nearly equally? If to-the-metal coding is so much better, why don't you see current consoles outperforming GPUs that are much stronger on the same workloads? Fact is, modern APIs only have a 10-15% resource overhead over to-the-metal coding; big whoop. Most of the overhead is on the CPU, and that's all: get a slightly stronger CPU and your precious to-the-metal coding gets thrown out the window.

Let's see: I quoted a link where someone inside AMD claims the thing game developers ask him for most is to drop the damn API, and you say I am misinformed? So you mean to tell me that AMD's worldwide developer relations manager doesn't know what the hell he is talking about, and that you, an average fanboy, know more than him? Wait, didn't you do this same thing with Timothy Lottes, the Nvidia employee who explained why GPU performance is so fu**ed up and claimed that working to the metal was better? So two different people who work for the two most recognized GPU makers are wrong about APIs crippling GPU performance, even when one says the thing developers ask him for most is to drop the damn API... You don't know what the hell you are talking about; you have been officially owned, with links, by people who work in the GPU industry.

#171 tormentos
Member since 2003 • 33784 Posts
.:roll:

Your proof about the API is wrong and has been debunked.

APIs exist for a reason. They do have overhead, but they make sure there is a common programming standard across all levels of hardware. Even if DirectX were removed, there would still be graphics drivers; if we remove those, we remove those graphics companies' source of income from blind/clueless enthusiasts: multi-GPU.

I see this as a way of trying to make game developers do the hard work for them, so that they no longer need to provide such frequent driver updates to optimize for games. AMD being lazy...

CPUs have a common instruction set: x86. Even if software programmers use assembly language to write highly optimized code, they are still issuing x86 opcodes to be decoded by the CPU's decoder. And guess what: the PS4's CPU is x86 :shock:...

04dcarraher
You know what the sad part is, troll? This also applies to MS's box; in fact there are rumors about MS forcing the API on Durango and not allowing to-the-metal programming. APIs exist to keep legacy, dude, nothing more, because we get a barrage of GPUs each year in tons of configurations, and if you code a game to the metal for one AMD card, chances are it may not even run on another AMD card from the same line; now imagine how it would run on Nvidia ones.

#172 clyde46
Member since 2005 • 49061 Posts
ITT: People arguing about things they don't understand.

#173 OneInchMan99
Member since 2012 • 1248 Posts

Given the games we've had this gen on the GPUs in these consoles, I don't think anyone needs to worry about graphics next gen.


#174 04dcarraher
Member since 2004 • 23832 Posts

[QUOTE="04dcarraher"].:roll:

your proof about API is wrong and has been debunked.

API exist for a reason, they do have overhead, but it makes sure there is a common programming standard across all level of hardware. even if DirectX is removed, there is still graphic drivers, if we remove that, we remove those graphics company's source of income from blind/clueless enthusiastic: multi-GPU.

i see this as a way of trying to make game developers do the hardwork for them, so that they no longer need to provide so frequent driver updates to optimise for games. AMD being lazy....

CPU have a common instruction set: x86. even if software programmers use assembly language to write highly optimised code, they are still issuing x86 Opcode to be decoded by the CPU decoder. And guess what the PS4 cpu is x86 :shock:...

tormentos

You know what the sad part is, troll? This also applies to MS's box; in fact there are rumors about MS forcing the API on Durango and not allowing to-the-metal programming. APIs exist to keep legacy, dude, nothing more, because we get a barrage of GPUs each year in tons of configurations, and if you code a game to the metal for one AMD card, chances are it may not even run on another AMD card from the same line; now imagine how it would run on Nvidia ones.

lol, APIs are in the PS3 and the 360 and all computers; they are not just for legacy. Your posts prove you have no idea what you're talking about and have your head so far up Sony's butt that you cannot see the light of day. As long as you have the CPU to take up the slack, the overhead on PC does not mean jack for GPU performance. Which means anyone with a GPU faster than a 7850 will be on par with, if not better than, the PS4's graphical abilities. The PS4 is using tweaked PC hardware, so whatever coding they improve will translate to AMD CPUs and GPUs.


#175 ShadowDeathX
Member since 2006 • 11698 Posts

04dcarraher does have a point with the GPGPU thing.

On PC, the video card normally does not have to do much GPGPU computing, because CPUs are powerful enough to do it themselves. On the PS4, the CPU is weak, and developers will more likely than not have to redirect some instructions to be processed by the GPU. That will take some performance away from the GPU.

Sony did put a restriction of ONLY 14 of the 18 CUs to be used for graphics processing. From what I heard, though, that restriction is gone and developers are free to allocate the CUs as they please. But again, more times than not, developers will have to allocate some CUs away from graphics processing. So graphics processing might not even be on par with a PC 7850 (even with the almighty optimizations).


#176 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="tormentos"][QUOTE="MK-Professor"]no it is not

HD7950 & HD7970 are high end(for single gpu), and the HD7850 & HD7870 are mid range and also all these models are already one year old.

Essentially when the ps4 will be released will have a gpu that it was mid range 2 years ago.

So in the end of the year new GPU's will be released making the HD7850 completely low end.

04dcarraher

There are variables in this equation. The PS4's GPU is between a 7850 and a 7870, and the GPU itself is on the same die as the CPU, which on PC is not the case unless you are talking about an APU, which gains nothing from pairing with a dedicated GPU. That alone makes the GPU in the PS4 more efficient and less prone to latency, which hurts GPUs a lot on PC.

Another thing: coders, like the guy from Nvidia who created an algorithm for AA, can code to the metal on the PS4; on PC they have to go through MS's API, which doesn't allow that and restricts what developers can do. This will let coders get more out of the PS4's GPU than they would from its PC counterpart.

Another thing here: if you remember well, when the PS3 launched, video cards coming out already had as much RAM for video as the complete PS3 system had for system and video combined; the same with the 360, and some even shipped with 1GB, far eclipsing what the PS3 and Xbox 360 had. This time it is not the same: the PS4 has 8GB of GDDR5; the very best top-of-the-line GPU now, the $1,000+ GeForce Titan, has less RAM than the PS4, and so does the 7990. Now, I know the PS4's is shared, but the PS4 OS was said to take 512MB back when the memory was rumored to be 4GB; I can't see it needing more than 1GB, since the PS4 is not running Windows. That should ensure that at least on the texture side the PS4 will not suffer much for quite some time, unlike the PS3 and 360; in fact the 360, a few months after launch, already had inferior textures to those found on PC.

I am not saying the PS4 will beat a 7970, but it should not suffer as much as the 7850, and since the PS4 is not targeting 2560x1600 with 8xMSAA and 16xAF, and is just targeting 1080p, the performance should increase. http://www.anandtech.com/bench/Product/549?vs=550 This should give you a nice idea: everything the 7950 is doing, the 7850 does as well, just a tad slower in frames per second. Look at Crysis Warhead at 2560x1600: just a 9-frame difference. Look at Metro, again at the same resolution: the difference is 10 frames. As soon as the resolution drops to 1920x1200 the 7850 gets a boost in frames, and that is still higher than 1080p.

One variable you totally forgot... Sony is said to be targeting only 14 of the 18 CUs for gaming, while the other 4 CUs will be used for GPGPU workloads, physics, etc. Which means the GPU will normally never match a 7850's performance: 14 CUs = 1.4 TFLOPs. Also, you have the idea from that guy who made FXAA totally wrong... his statement was about improving texture fetching, not the whole GPU's performance; they cannot bypass the hardware's physical processing limitations. And you totally ignored basic computer functions: the PS4 will never use 6 or 8GB for video. You have to account for the OS, all the features, the game data load, and then the video load into memory.

Also, the PS4 OS and all the features stated will use a lot more than 512MB... you will be looking at 2GB for OS+features; the game will use 4GB (with no streaming data) and video will use 2GB, since anything over 2GB is pointless at 1080p. Your example of the 7850 vs the 7950 is off too: you should only look at DirectX 11 based games, since that will be the new standard, and the 7950 at 1080p+ is still much faster than the 7850. You should also look at Dirt 3's minimum fps; that gives you the real difference in baseline performance. And let's not forget the 7950's boost-clock performance, or the 7970, or bringing CrossFire or SLI into the mix; by the end of the year the 7850 will be considered mid-range at best. Heck, AMD APUs are going to get IGPs nearly as fast as a 7750.

Games like Crysis already have some parts of their physics running on the GPU (1).

1. http://http.developer.nvidia.com/GPUGems3/gpugems3_ch16.html

Vegetation in games has always been mainly static, with some sort of simple bending to give the illusion of wind. Our game scenes can have thousands of different vegetations, but still we pushed the envelope further by making vegetation react to global and local wind sources, and we bend not only the vegetation but also the leaves, in detail, with all computations procedurally and efficiently done on the GPU.
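A minimal sketch of the idea in that GPU Gems chapter, written as plain C++ for readability rather than as Crytek's actual shader code (the constants and function names here are illustrative assumptions): per vertex, bend the plant by an amount that grows with height above the root, driven by an oscillating wind term, so the whole computation can stay on the GPU.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Simplified per-vertex "main bending" in the spirit of GPU Gems 3, ch. 16:
// displacement scales with the vertex's height above the plant root, so the
// base stays planted while the top sways. In Crysis this kind of math runs
// in the vertex shader; plain C++ here just to show the computation.
Vec3 BendVertex(Vec3 v, Vec3 root, Vec3 windDir, float windStrength, float time) {
    float height = v.z - root.z;                       // height above the root
    float phase  = root.x + root.y;                    // per-plant phase offset
    float sway   = std::sin(time * 2.0f + phase);      // oscillating wind term
    float bend   = windStrength * sway * height * height; // stiffer near the base
    v.x += windDir.x * bend;
    v.y += windDir.y * bend;
    return v;
}
```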


#177 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="ioannisdenton"]

PC gamers do not get it... at all. They are living on their own planet.
Optimization, people. You can buy the most expensive card, but the game itself is not developed to use it as it should be. This is natural and bound to happen, as the PC is a platform with millions of configurations.
It is like wanting to kill a fly with a bazooka. You'll kill that fly, but firing the bazooka was excessive.

Kinthalis

You obviously don't know anything about 3D rendering or hardware. So why are you trying to butt into this conversation?

OPTIMIZATIONZ! is the new cry of the ignorant console gamer.

DX11 is a MUCH more optimized API. The difference between optimization on a modern PC and on a modern console is 10-30% at most.

Now, please let the grown-ups talk.

While DX11 has improved its overhead issues, it wouldn't match AMD's HSA driver stack.

http://hsafoundation.com/f-a-q/

Q: Will HSA support Microsoft C++ AMP?

A: HSA will support Microsoft C++ AMP

http://blogs.amd.com/developer/2012/07/10/hsa-%E2%80%93-a-boon-for-opencl%E2%84%A2-and-heterogeneous-compute-in-general/

This HSA driver model runs in Windows.

[image: HSA OpenCL driver stack diagram]

MS is aware of the problem and is working with AMD to fix it, i.e. HSA-enabled MS C++ AMP. There are more changes coming after Windows 8 RTM and DirectX 11.1 feature level 11_1.

Like the AMD64 ISA and the Windows x64 edition, Microsoft has basically handed over the future of Windows to AMD (or the HSA Foundation). The template for the Windows core's changes would be coming from Xbox Next, i.e. an AMD HSA powered box.

Note that Windows RT (the ARM edition) would support HSA, i.e. most ARM vendors** are on board with HSA. **Minus NVIDIA; NVIDIA is a small fish when it comes to ARM powered devices.

The AMD HSA driver stack would be released around the same time as Xbox Next's and the PS4's release period, i.e. Q4 2013.
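For context, here is what a minimal C++ AMP kernel looks like; this is the standard Microsoft C++ AMP API that the HSA stack is meant to sit underneath (the function and buffer names below are illustrative):

```cpp
#include <amp.h>
#include <vector>
using namespace concurrency;

// Element-wise vector add dispatched to whatever accelerator the runtime
// picks (a GPU via DirectCompute today; an HSA device under the future stack).
void VectorAdd(const std::vector<float>& a, const std::vector<float>& b,
               std::vector<float>& out) {
    const int n = static_cast<int>(out.size());
    array_view<const float, 1> av(n, a);
    array_view<const float, 1> bv(n, b);
    array_view<float, 1> ov(n, out);
    ov.discard_data();                       // no need to copy 'out' to the device
    parallel_for_each(ov.extent, [=](index<1> i) restrict(amp) {
        ov[i] = av[i] + bv[i];
    });
    ov.synchronize();                        // copy the results back to 'out'
}
```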


#178 ronvalencia
Member since 2008 • 29612 Posts

[QUOTE="04dcarraher"].:roll:

your proof about API is wrong and has been debunked.

API exist for a reason, they do have overhead, but it makes sure there is a common programming standard across all level of hardware. even if DirectX is removed, there is still graphic drivers, if we remove that, we remove those graphics company's source of income from blind/clueless enthusiastic: multi-GPU.

i see this as a way of trying to make game developers do the hardwork for them, so that they no longer need to provide so frequent driver updates to optimise for games. AMD being lazy....

CPU have a common instruction set: x86. even if software programmers use assembly language to write highly optimised code, they are still issuing x86 Opcode to be decoded by the CPU decoder. And guess what the PS4 cpu is x86 :shock:...

tormentos

You know what the sad part is, troll? This also applies to MS's box; in fact there are rumors about MS forcing the API on Durango and not allowing to-the-metal programming. APIs exist to keep legacy, dude, nothing more, because we get a barrage of GPUs each year in tons of configurations, and if you code a game to the metal for one AMD card, chances are it may not even run on another AMD card from the same line; now imagine how it would run on Nvidia ones.

A game coded for HSA would run across HSA-compatible accelerators.

[image: Lines-of-Code and Performance comparison chart]

PS: current C++ AMP runs on top of DirectCompute; the future C++ AMP would run on top of the HSA stack.


#179 ronvalencia
Member since 2008 • 29612 Posts

04dcarraher does have a point with the GPGPU thing.

On PC, the video card normally does not have to do much GPGPU computing, because CPUs are powerful enough to do it themselves. On the PS4, the CPU is weak, and developers will more likely than not have to redirect some instructions to be processed by the GPU. That will take some performance away from the GPU.

Sony did put a restriction of ONLY 14 of the 18 CUs to be used for graphics processing. From what I heard, though, that restriction is gone and developers are free to allocate the CUs as they please. But again, more times than not, developers will have to allocate some CUs away from graphics processing. So graphics processing might not even be on par with a PC 7850 (even with the almighty optimizations).

ShadowDeathX

Besides CPU-side overheads, another problem with DirectX 11.1 feature level 11_1 is that it doesn't expose GCN's non-Direct3D features, e.g. x86-64 pointer sharing between GCN and the CPU.

The PS4's APIs are not restricted by Microsoft's DirectX 11.1 feature level 11_1 standard.


#180 savagetwinkie
Member since 2008 • 7981 Posts
[QUOTE="04dcarraher"] you wish... The OS plus the features will use more then 512mb.... thats common sense, next the PS4's 18 CU will partitioned normally into 14+4 CU the 4 CU for compute loads which means for main gaming 14CU you will only see typically for pixel pushing, likely the 4CU will be used for physics etc. Next lol, on efficiency turning the tide with a gpu depending on the load design is slower then a 7850 or only slightly faster. As long as you have a faster cpu then whats in the PS4 there will be no overhead to worry about. Which means no matter what they do with the PS4 gpu will never outperform a 7870 or better with same workloads. tormentos
Maybe that is why i say the PS4 could use 1GB for OS,even go as far as to name 2GB for OS which i think is over kill and we all know it,the PS4 is not running windows. Dude you are not READING there is no 14+4 there are 18 unified CU,developer can use 6 for Physics and 12 for rendering,does that mean the 7850 will have more performance.? No because with what the 7850 will emulate those heavy phycis.? OH yeah with those same CU.. Efficiency mean that while the 7870 still waiting because of high latency been idle,the PS4 GPU will be working because of the GPU and CPU been on the same die produce lower latency,AMD equips each HSA APU with two memory controllers, one for the CPU and one for the GPU, that both can access one single DRAM controller,latency kill performance sony knows this,is the reason the console is build that way.

Also, there are 8 cores that can be used for some physics and 18 CUs for games. Software physics still works great for most games; I only see the 4 CUs being used by games that really want to go ape sh!t with the physics.

#181 ronvalencia
Member since 2008 • 29612 Posts

04dcarraher does have a point with the GPGPU thing.

On PC, the video card normally does not have to do much GPGPU computing, because CPUs are powerful enough to do it themselves. On the PS4, the CPU is weak, and developers will more likely than not have to redirect some instructions to be processed by the GPU. That will take some performance away from the GPU.

Sony did put a restriction of ONLY 14 of the 18 CUs to be used for graphics processing. From what I heard, though, that restriction is gone and developers are free to allocate the CUs as they please. But again, more times than not, developers will have to allocate some CUs away from graphics processing. So graphics processing might not even be on par with a PC 7850 (even with the almighty optimizations).

ShadowDeathX

On the Cinebench 11.5 benchmark, an AMD Jaguar "Temash" @ 1.4GHz (QC/4T) ~= an Intel Core i3-2367M (17 watts) "Sandy Bridge" @ 1.4GHz (DC/4T).

The AMD "Temash" was running on laptop DDR3 memory, while the PS4 will be running on faster GDDR5 memory (~176GB/s).

When running Cinebench 11.5, the AMD A6-1450 "Jaguar/Temash" @ 1.4GHz was faster than the AMD A8-4555 "Piledriver/Trinity" (QC/4T) @ 1.6GHz.

The AMD A6-1450 "Jaguar/Temash" @ 1.4GHz has a ~5 watt TDP, while the A8-4555 has a 17 watt TDP.

Since some ultra-thin laptops have surface-mounted memory (following Apple's example), AMD could have a laptop design with surface-mounted GDDR5 memory.


#182 tormentos
Member since 2003 • 33784 Posts

04dcarraher does have a point with the GPGPU thing.

On PC, the video card normally does not have to do much GPGPU computing, because CPUs are powerful enough to do it themselves. On the PS4, the CPU is weak, and developers will more likely than not have to redirect some instructions to be processed by the GPU. That will take some performance away from the GPU.

Sony did put a restriction of ONLY 14 of the 18 CUs to be used for graphics processing. From what I heard, though, that restriction is gone and developers are free to allocate the CUs as they please. But again, more times than not, developers will have to allocate some CUs away from graphics processing. So graphics processing might not even be on par with a PC 7850 (even with the almighty optimizations).

ShadowDeathX
No, he doesn't. Not all CPUs are equally good; remember, tests are done on ultra-expensive CPUs, and not everyone has one, and when it comes to coding for multi-core CPUs, consoles >> PC. But let me tell you this: the PS4 has a chip to run the OS, a low-power ARM-like one, which means all 8 cores of the Jaguar can be used for tasks such as physics without touching the GPU. Most games on PC don't use more than 3 cores and very few even hit 4, yeah, thanks to legacy, which consoles don't have to follow. And no, Sony doesn't have a restriction of 14 CUs + 4 CUs for physics; that was debunked when the specs were announced. It's 18 unified CUs that can be used as developers see fit. By the way, take away 2 CUs for physics from the PS4 and it is still basically a 7850: the 7850 has 16 CUs, the PS4's GPU has 18. So are 8 cores + 2 CUs for physics and animation enough for you, or do you need more?

#183 04dcarraher
Member since 2004 • 23832 Posts

[QUOTE="ShadowDeathX"]

04dcarraher does have a point with the GPGPU thing.

On PC, the video card normally does not have to do much GPGPU computing, because CPUs are powerful enough to do it themselves. On the PS4, the CPU is weak, and developers will more likely than not have to redirect some instructions to be processed by the GPU. That will take some performance away from the GPU.

Sony did put a restriction of ONLY 14 of the 18 CUs to be used for graphics processing. From what I heard, though, that restriction is gone and developers are free to allocate the CUs as they please. But again, more times than not, developers will have to allocate some CUs away from graphics processing. So graphics processing might not even be on par with a PC 7850 (even with the almighty optimizations).

tormentos

No, he doesn't. Not all CPUs are equally good; remember, tests are done on ultra-expensive CPUs, and not everyone has one, and when it comes to coding for multi-core CPUs, consoles >> PC. But let me tell you this: the PS4 has a chip to run the OS, a low-power ARM-like one, which means all 8 cores of the Jaguar can be used for tasks such as physics without touching the GPU. Most games on PC don't use more than 3 cores and very few even hit 4, yeah, thanks to legacy, which consoles don't have to follow. And no, Sony doesn't have a restriction of 14 CUs + 4 CUs for physics; that was debunked when the specs were announced. It's 18 unified CUs that can be used as developers see fit. By the way, take away 2 CUs for physics from the PS4 and it is still basically a 7850: the 7850 has 16 CUs, the PS4's GPU has 18. So are 8 cores + 2 CUs for physics and animation enough for you, or do you need more?

Wrong again lol.gif. Give it up: a 1.6GHz 8-core Jaguar is not going to beat an Intel quad core or any other AMD 6- or 8-core CPU clocked above 2.4GHz. You're just blind and in denial of the facts: the PS4 is not going to be on par with a modern PC with a modern DirectX 11 based GPU.


#184 RyviusARC
Member since 2011 • 5708 Posts

[QUOTE="ShadowDeathX"]

04dcarraher does have a point with the GPGPU thing.

On PC, the video card normally does not have to do much GPGPU computing, because CPUs are powerful enough to do it themselves. On the PS4, the CPU is weak, and developers will more likely than not have to redirect some instructions to be processed by the GPU. That will take some performance away from the GPU.

Sony did put a restriction of ONLY 14 of the 18 CUs to be used for graphics processing. From what I heard, though, that restriction is gone and developers are free to allocate the CUs as they please. But again, more times than not, developers will have to allocate some CUs away from graphics processing. So graphics processing might not even be on par with a PC 7850 (even with the almighty optimizations).

tormentos

No, he doesn't. Not all CPUs are equally good; remember, tests are done on ultra-expensive CPUs, and not everyone has one, and when it comes to coding for multi-core CPUs, consoles >> PC. But let me tell you this: the PS4 has a chip to run the OS, a low-power ARM-like one, which means all 8 cores of the Jaguar can be used for tasks such as physics without touching the GPU. Most games on PC don't use more than 3 cores and very few even hit 4, yeah, thanks to legacy, which consoles don't have to follow. And no, Sony doesn't have a restriction of 14 CUs + 4 CUs for physics; that was debunked when the specs were announced. It's 18 unified CUs that can be used as developers see fit. By the way, take away 2 CUs for physics from the PS4 and it is still basically a 7850: the 7850 has 16 CUs, the PS4's GPU has 18. So are 8 cores + 2 CUs for physics and animation enough for you, or do you need more?

 

All I know is that the PS4's GPU was based on the 7970M, as a slightly gimped version of it.

The 7970M performs more closely to the desktop 7850 than to the desktop 7870.

I'd say the PS4 should be pretty close to the 7850; the CU count is not a direct indication of performance.

The PS4's CPU is not exactly a powerhouse, even compared to the years-old i5 2500K.

But the PS4 doesn't have the API overhead like the PC does, and that overhead is experienced on the CPU side.

So you will need a PC CPU more powerful than the PS4's to run the same games, but an i5 2500K will easily be enough for that.

If you have a high-end 2010 PC with a higher-clocked i5 CPU and a well-OCed GTX 480 (the 3GB version would be better), then you are set to play next-gen console multiplats.


#185 04dcarraher
Member since 2004 • 23832 Posts

[QUOTE="tormentos"][QUOTE="ShadowDeathX"]

04dcarraher does have a point with the GPGPU thing.

On PC, the video card normally does not have to do much GPGPU computing, because CPUs are powerful enough to do it themselves. On the PS4, the CPU is weak, and developers will more likely than not have to redirect some instructions to be processed by the GPU. That will take some performance away from the GPU.

Sony did put a restriction of ONLY 14 of the 18 CUs to be used for graphics processing. From what I heard, though, that restriction is gone and developers are free to allocate the CUs as they please. But again, more times than not, developers will have to allocate some CUs away from graphics processing. So graphics processing might not even be on par with a PC 7850 (even with the almighty optimizations).

RyviusARC

No, he doesn't. Not all CPUs are equally good; remember, tests are done on ultra-expensive CPUs, and not everyone has one, and when it comes to coding for multi-core CPUs, consoles >> PC. But let me tell you this: the PS4 has a chip to run the OS, a low-power ARM-like one, which means all 8 cores of the Jaguar can be used for tasks such as physics without touching the GPU. Most games on PC don't use more than 3 cores and very few even hit 4, yeah, thanks to legacy, which consoles don't have to follow. And no, Sony doesn't have a restriction of 14 CUs + 4 CUs for physics; that was debunked when the specs were announced. It's 18 unified CUs that can be used as developers see fit. By the way, take away 2 CUs for physics from the PS4 and it is still basically a 7850: the 7850 has 16 CUs, the PS4's GPU has 18. So are 8 cores + 2 CUs for physics and animation enough for you, or do you need more?

 

All I know is that the PS4's GPU was based on the 7970M, as a slightly gimped version of it.

The 7970M performs more closely to the desktop 7850 than to the desktop 7870.

I'd say the PS4 should be pretty close to the 7850; the CU count is not a direct indication of performance.

The PS4's CPU is not exactly a powerhouse, even compared to the years-old i5 2500K.

But the PS4 doesn't have the API overhead like the PC does, and that overhead is experienced on the CPU side.

So you will need a PC CPU more powerful than the PS4's to run the same games, but an i5 2500K will easily be enough for that.

If you have a high-end 2010 PC with a higher-clocked i5 CPU and a well-OCed GTX 480 (the 3GB version would be better), then you are set to play next-gen console multiplats.

Heck, you don't even need an Intel Core; any AMD 6 or 8 core above 2.6GHz will perform better than the console's 8-core Jaguar-based AMD CPU.

#186 BlbecekBobecek
Member since 2006 • 2949 Posts

 

But the PS4 doesn't have the API overhead like the PC does, and that overhead is experienced on the CPU side.

 

RyviusARC

How is the Direct3D API overhead experienced on the CPU side? Do you have any source to prove that?

 

If you have a high-end 2010 PC with a higher-clocked i5 CPU and a well-OCed GTX 480 (the 3GB version would be better), then you are set to play next-gen console multiplats.

 

 

RyviusARC

I seriously doubt that. A high-end 2006 PC is not set to play this (now last) gen's console multiplats, after all...


#187 RyviusARC
Member since 2011 • 5708 Posts

[QUOTE="RyviusARC"]

 

But the PS4 doesn't have the API overhead like the PC does, and that overhead is experienced on the CPU side.

 

BlbecekBobecek

How is the Direct3D API overhead experienced on the CPU side? Do you have any source to prove that?

 

If you have a high-end 2010 PC with a higher-clocked i5 CPU and a well-OCed GTX 480 (the 3GB version would be better), then you are set to play next-gen console multiplats.

 

 

RyviusARC

I seriously doubt that. A high-end 2006 PC is not set to play this (now last) gen's console multiplats, after all...

 

Just compare an 8600GT to the 360 in games and you will see they perform similarly, with the 8600GT sometimes even coming out ahead.

On the other hand, you need a more powerful CPU to run the same games, because of the API overhead.

Also, what happened last gen doesn't dictate this gen; that would be a fallacy.

The 360 had unified shader tech before PC GPUs did; without that, the 360 would have been really weak.

Unified shaders help in shader-intensive games and provide a hefty performance boost.

This gen has nothing special like unified shaders.

Also, in 2006 there was the 8800GTX, which was more than 3x the power of the consoles.


#188 04dcarraher
Member since 2004 • 23832 Posts

[QUOTE="RyviusARC"]

 

But the PS4 doesn't have the API overhead like the PC does, and that overhead is experienced on the CPU side.

 

BlbecekBobecek

How is the Direct3D API overhead experienced on the CPU side? Do you have any source to prove that?

 

If you have a high-end 2010 PC with a higher-clocked i5 CPU and a well-OCed GTX 480 (the 3GB version would be better), then you are set to play next-gen console multiplats.

 

 

RyviusARC

I seriously doubt that. A high-end 2006 PC is not set to play this (now last) gen's console multiplats, after all...

The CPU is what processes everything needed; the overhead is in the time it takes the CPU to process the needed data. For example, a modern API like DX11 has very little overhead compared to DX9: you're looking at about 15% vs "to the metal", and all you need to overcome that 15% is a slightly faster CPU or a bit higher clock rate.

Also, you have to be kidding me doubting that a high-end PC from 2006 can play current multiplats... An Athlon X2 3GHz or a C2D 2.4GHz, 4GB of DDR2 and an 8800GTX: done, there you go, you're playing games at 2x the graphical ability of the current HD consoles.
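Taking the claimed 15% figure at face value (it is the poster's number, not a measurement), the clock-rate compensation is simple arithmetic, assuming equal per-clock performance:

```cpp
#include <cstdio>

// If API overhead eats ~15% of the render thread's CPU time, a PC CPU needs
// roughly 1 / (1 - 0.15) ~= 1.18x the console's per-core speed to break even.
int main() {
    const double overhead = 0.15;          // claimed DX11 overhead vs "to the metal"
    const double console_clock_ghz = 1.6;  // PS4 Jaguar core clock, per the thread
    double pc_equiv = console_clock_ghz / (1.0 - overhead);
    printf("Break-even PC clock at equal IPC: %.2f GHz\n", pc_equiv); // ~1.88 GHz
}
```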


#189 painguy1
Member since 2007 • 8686 Posts

It's an AMD chip, so to get a rough estimate of how it compares to an Nvidia chip, take 1.8/2. So no, it's not even close to a GTX 580 or anything else you listed.


#190 tormentos
Member since 2003 • 33784 Posts
Wrong again lol.gif. Give it up: a 1.6GHz 8-core Jaguar is not going to beat an Intel quad core or any other AMD 6- or 8-core CPU clocked above 2.6GHz. No, you're just blind and in denial of the fact that they can take those 18 CUs and set them up to use only a 14/4 split for certain uses. 04dcarraher
""Previous information had suggested that Sony would be splitting GPU resources between rendering and compute functions (VGLeaks suggesting a 14/4 compute unit split between them in its SDK document leak) but the official spec talks of a unified 18 CUs, which "can freely be applied to graphics, simulation tasks, or some mixture of the two". The divide appears to be gone, and devs can apply available power as they see fit."" http://www.eurogamer.net/articles/df-hardware-spec-analysis-playstation-4 STFU there is not 14+4 sh** is 18 Unified CU you have been owned. And second most game are not CPU bound are GPU bound which mean the PS4 will be good,funny last gen console don't have problems with physic and today's port and some how the PS4 will.:lol: Maybe you should stop comparing clock speed is not the 1999 any more,and is not off the chart that a CPU with lower clock with beat one with higher clock speed.
#191 BlbecekBobecek
Member since 2006 • 2949 Posts


 

[QUOTE="RyviusARC"]Just compare an 8600gt to the 360 in games and you will see they perform similarly, with the 8600gt sometimes even coming out ahead. [...]

RyviusARC

 

Those 8 gigs of GDDR5 seem pretty special to me, even compared to today's high-end PCs.

#192 tormentos
Member since 2003 • 33784 Posts
[QUOTE="04dcarraher"]The CPU is what processes everything needed... all you need to overcome that 15% is a slightly faster CPU or a bit higher clock rate. [...]04dcarraher
I already quoted two people who owned your a$$ on that subject; both work for the top two GPU makers and both claim the same thing: API overhead kills performance big time. One of them named some pretty big examples of it, and you ignore them because it serves you best.
#193 faizan_faizan
Member since 2009 • 7869 Posts

[QUOTE="RyviusARC"]


 

Just compare an 8600gt to the 360 in games and you will see they perform similar with the 8600gt sometimes even coming out above.

On the other hand you need a more powerful CPU to run the same games because of this.

 

Also what happened in last gen doesn't influence this gen, that would be a fallacy.

The 360 had unified shader tech before PC GPUs did, without that the 360 would have been really weak.

Unified shaders help in shader intensive games and provide a hefty performance boost.

This gen has nothing special like unified shaders.

Also in 2006 there was the 8800gtx which was more than 3x the power of the consoles.

BlbecekBobecek

 

Those 8 gigs of GDDR5 seem pretty special to me, even compared to today's high-end PCs.

VRAM isn't everything. And the PS4's memory is shared, unlike a PC GPU's VRAM, which isn't shared but dedicated to games; on PC it's the main RAM that gets shared with everything else.
#194 Tessellation
Member since 2009 • 9297 Posts
PC GPUs already do over 3.0 TFLOPS and cows are impressed with 1.84 TFLOPS :lol:? The AMD 7990 does 3,788.80 GFLOPS x2 :cool:
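Those peak figures come straight from the shader math; a minimal sketch, assuming the commonly reported shader counts and clocks:

# Peak single-precision throughput for a GCN GPU:
# shaders x clock (GHz) x 2 ops per cycle (fused multiply-add) = GFLOPS.
# Shader counts and clocks below are the commonly reported figures;
# treat them as assumptions rather than confirmed specs.
def peak_gflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2.0

print(f"{peak_gflops(1152, 0.800):.1f}")  # PS4 GPU: 1843.2 GFLOPS, i.e. ~1.84 TFLOPS
print(f"{peak_gflops(2048, 0.925):.1f}")  # one 7970-class GPU: 3788.8 GFLOPS (x2 on a 7990)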
#195 tormentos
Member since 2003 • 33784 Posts
Those 8 gigs of GDDR5 seem pretty special to me, even compared to today's high-end PCs.BlbecekBobecek
They ignore whatever suits them. The PS4 has more memory than a GeForce Titan, which costs a damn $1,000+; hell, the 7990 is $900 and also has 6GB. In fact, out of nowhere they try to bloat the PS4 OS to 2GB, as if the PS4 were running Windows, talking crap about its features needing that much RAM without knowing, and ignoring that the PS4 has separate hardware to cope with several of those features.

So while a PC CPU has to handle background tasks while running a game and shift resources to do it, the PS4 is not that way: the PS4 has an ARM chip for that, so the Jaguar isn't touched at all. But somehow they don't see that either.
#196 faizan_faizan
Member since 2009 • 7869 Posts
PC GPUs already do over 3.0 TFLOPS and cows are impressed with 1.84 TFLOPS :lol:? The AMD 7990 does 3,788.80 GFLOPS x2 :cool:Tessellation
And quad-SLI GTX 690s = 8GB of GDDR5 VRAM.
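One caveat worth sketching: SLI mirrors assets across GPUs, so VRAM doesn't pool (the 2GB-per-GPU layout is the GTX 690's known configuration):

# SLI mirrors textures and framebuffers across GPUs, so the usable
# pool is the per-GPU amount, not the total printed on the box.
per_gpu_gb = 2            # each of the GTX 690's two GPUs has 2GB
gpus = 4                  # quad SLI: two 690 cards, four GPUs
print(per_gpu_gb * gpus)  # 8GB of physical GDDR5 across the cards...
print(per_gpu_gb)         # ...but only ~2GB of it effectively usable per frame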
#197 tormentos
Member since 2003 • 33784 Posts
PC GPUs already do over 3.0 TFLOPS and cows are impressed with 1.84 TFLOPS :lol:? The AMD 7990 does 3,788.80 GFLOPS x2 :cool:Tessellation
Oh, that GPU will deliver much better performance than the PS4; I never debated that. By the way, for the price of that 7990 I could buy a PS4, a Durango, Xbox Live for a year, and probably a PS4 game.
#198 Tessellation
Member since 2009 • 9297 Posts
[QUOTE="BlbecekBobecek"]

[QUOTE="RyviusARC"]


 

Just compare an 8600gt to the 360 in games and you will see they perform similar with the 8600gt sometimes even coming out above.

On the other hand you need a more powerful CPU to run the same games because of this.

 

Also what happened in last gen doesn't influence this gen, that would be a fallacy.

The 360 had unified shader tech before PC GPUs did, without that the 360 would have been really weak.

Unified shaders help in shader intensive games and provide a hefty performance boost.

This gen has nothing special like unified shaders.

Also in 2006 there was the 8800gtx which was more than 3x the power of the consoles.

faizan_faizan

 

Those 8 gigs of GDDR5 seem pretty special to me, even compared to today's high-end PCs.

VRAM isn't everything. And the PS4's memory is shared, unlike a PC GPU's VRAM, which isn't shared but dedicated to games; on PC it's the main RAM that gets shared with everything else.

Even with more RAM, the UE4 demo on PS4 was downgraded compared to the PC version, and the PC's GPU was a GTX 680 with 2GB of RAM :cool:
#199 BlbecekBobecek
Member since 2006 • 2949 Posts

[QUOTE="BlbecekBobecek"]Those 8 gigs of GDDR5 seem pretty special to me even compared to today's high end pcs.tormentos
They ignore whatever suits them. The PS4 has more memory than a GeForce Titan, which costs a damn $1,000+; hell, the 7990 is $900 and also has 6GB. [...]

 

They don't want to see it. They're the hermit master race, after all.

Well, I've been a PC gamer for at least the past 20 years, so playing console ports with slightly sharper textures doesn't exactly feel like "ownage" to me. PC gaming had its better times long, long ago, and those will most likely never return.

#200 tormentos
Member since 2003 • 33784 Posts
[QUOTE="faizan_faizan"] VRAM isn't everything, And it's shared unlike PC GPU's VRAM which isn't shared it's dedicated for games, The MAIN RAM is.

Yes it is the PS4 OS is rumored to be 512MB...that is the only rumor about the OS,to what is that.? 7.5 GB for video.? I say at worst case would be 2GB,but i don't think it will be even close to that,it would probably not pass 1GB.
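The arithmetic behind those guesses; the 512MB figure is the rumor cited above, and the 1GB/2GB figures are the poster's bounds, all unconfirmed at this point:

# PS4 RAM left for games under the OS-reserve guesses in this thread.
# 512MB is the cited rumor; 1GB and 2GB are the poster's bounds.
# None of these are confirmed specs.
total_gb = 8.0
for os_gb in (0.5, 1.0, 2.0):
    print(f"OS reserve {os_gb}GB -> {total_gb - os_gb}GB left for games")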