New leak about PS4: 14 CUs is the recommended balance, not 18 CUs

This topic is locked from further discussion.


#1 XBOunity
Member since 2013 • 3837 Posts

Looks like the system is balanced for 14 CUs. Kinda kills Tormentos, since he kept touting 18 CUs all for rendering, but this slide shows otherwise. Dat Jaguar bottleneck is real, I guess. Here is the link:

http://www.vgleaks.com/playstation-4-balanced-or-unbalanced/


#2 Nengo_Flow
Member since 2011 • 10644 Posts
Why are you so obsessed with PlayStation?

#3 XBOunity
Member since 2013 • 3837 Posts

4 CUs equate to 410 GFLOPS according to the slide. tormentos.jpg
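A quick sanity check on that number, assuming the PS4's rumored 800MHz GPU clock and the standard GCN peak-FLOPS arithmetic (`gcn_gflops` is just an illustrative helper, not anything from the slide):

```python
def gcn_gflops(cus: int, clock_ghz: float) -> float:
    """Peak single-precision GFLOPS for an AMD GCN GPU:
    64 shader lanes per CU, 2 FLOPs per lane per cycle (fused multiply-add)."""
    return cus * 64 * 2 * clock_ghz

print(gcn_gflops(4, 0.8))   # 4 CUs at 800 MHz -> 409.6, i.e. the slide's ~410 GFLOPS
print(gcn_gflops(18, 0.8))  # all 18 CUs at 800 MHz
```

So the "410" figure is consistent with 4 CUs at 800MHz; all 18 CUs at that clock would be roughly 1.84 TFLOPS.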


#4 XBOunity
Member since 2013 • 3837 Posts

Why are you so obsessed with PlayStation? Nengo_Flow

This is a pretty substantial leak. Read the slide: all that talk from Tormentos about 18 CUs used for rendering and how much power you would get has been put into question. After 14 CUs you get minor improvements; the proof is in the pudding. The diminishing returns that MSFT said are real were laughed off. Now this is a SONY slide basically pointing to the same thing. Sorry, cows will be DCing for sure.


#5 RR360DD
Member since 2011 • 14099 Posts

Don't know what this means, but looking forward to all the lies being squashed in the lead-up to launch.

Never trust Sony.


#6 Shensolidus
Member since 2003 • 931 Posts

First, understand that the slide you posted (which this topic wasn't TOTALLY just ripped from NeoGAF, btw) has been available for months from VG Leaks, and was written back in 2012, when the PS4 still contained 4GB of GDDR5. I think we can all agree that something has changed since then. ;)


#7 Netherscourge
Member since 2003 • 16364 Posts

...dated 2012?


#8 StrongBlackVine
Member since 2012 • 13262 Posts

Didn't read, but still considerably more powerful than Xflop.


#9 SaltyMeatballs
Member since 2009 • 25165 Posts

Kinda kills Tormentos since he kept touting 18 cu's all for rendering, but this slide shows otherwise. XBOunity

18 can be used.


#10 granddogg
Member since 2006 • 731 Posts
Yep was just about to drop this...some good info

#11 XBOunity
Member since 2013 • 3837 Posts

First, understand that the slide you posted (which this topic wasn't TOTALLY just ripped from NeoGAF, btw) has been available for months from VG Leaks, and was written during an era when the PS4 actually contained 4gb of GDDR5; 2012. I think we can all say that something changed since then. ;)

Shensolidus

NeoGAF is a pro-Sony site, and you are a pro-Sony fan. This slide says it clear as day: after 14 CUs you get minor improvement. It's the same diminishing returns that Microsoft has talked about when they weighed the boost in GPU clock against enabling more CUs; they saw more of an improvement from the clock, even though the raw FLOPS numbers would point to the CUs. Try and spin it all you want.


#12 moistsandwich
Member since 2009 • 25 Posts

[QUOTE="Nengo_Flow"]why are you so obsess with Playstation so much? XBOunity

This is a pretty substantial leak. Read the slide, all that talk about 18 cu's used for rendering and how much power you would get by Tormentos has been put into question. After 14 cu's you get minor improvements, proof is in the pudding. diminishing returns which MSFT has said are actually real were laughed off. Now this is a SONY slide basically pointing to the same thing. Sorry, cows will be dcing for sure.

That's a bit disappointing.

Did you hear the news about the XB1? There are no hard numbers yet, but people working with the system have confirmed that different processes run in the background, reducing the resources available to run games. Who knows how substantial it'll be, but it's still not good news.


#13 Chutebox
Member since 2007 • 50664 Posts

First, understand that the slide you posted (which this topic wasn't TOTALLY just ripped from NeoGAF, btw) has been available for months from VG Leaks, and was written during an era when the PS4 actually contained 4gb of GDDR5; 2012. I think we can all say that something changed since then. ;)

Shensolidus
Boom goes the dynamite. OP fails again

#14 XBOunity
Member since 2013 • 3837 Posts

Yep was just about to drop this...some good info granddogg

Yeah, I think if anything it just adds to the talk out there that after 14 CUs you don't get the same performance. The CPU is bottlenecking it, I guess. I'm sure the other 4 CUs will be used for other tasks like audio, etc.


#15 XBOunity
Member since 2013 • 3837 Posts

[QUOTE="Shensolidus"]

First, understand that the slide you posted (which this topic wasn't TOTALLY just ripped from NeoGAF, btw) has been available for months from VG Leaks, and was written during an era when the PS4 actually contained 4gb of GDDR5; 2012. I think we can all say that something changed since then. ;)

Chutebox

Boom goes the dynamite. OP fails again

If you only knew what you were talking about, lol. The slides say it all. Try reading the article instead of invading the thread. ((((Chute))))))


#16 Chutebox
Member since 2007 • 50664 Posts

[QUOTE="Chutebox"][QUOTE="Shensolidus"]

First, understand that the slide you posted (which this topic wasn't TOTALLY just ripped from NeoGAF, btw) has been available for months from VG Leaks, and was written during an era when the PS4 actually contained 4gb of GDDR5; 2012. I think we can all say that something changed since then. ;)

XBOunity

Boom goes the dynamite. OP fails again

if you only knew what you were talking about. lol the slides say it all. try reading the article instead of invading the thread. ((((Chute))))))

It's OK to be upset. I would be too if I'd invested so much time and effort into a piece of plastic.


#17 Shensolidus
Member since 2003 • 931 Posts

[QUOTE="Shensolidus"]

First, understand that the slide you posted (which this topic wasn't TOTALLY just ripped from NeoGAF, btw) has been available for months from VG Leaks, and was written during an era when the PS4 actually contained 4gb of GDDR5; 2012. I think we can all say that something changed since then. ;)

XBOunity

neogaf is a pro sony site. And you are a pro sony fan. This slide says it as clear as day. after 14 cu's you get minor improvement. Its the same diminishing returns that Microsoft has talked about, and that was mentioned for either the boost in the GPU clock or adding the 14 cu's where they saw more of a improvement on the clock even tho the numbers for flops would point to the cu's. Try and spin it all you want.

I'm Sony leaning, but I understand hardware when I see it. 14 CUs is recommended as a 'bottleneck' of sorts due to OTHER hardware limitations. If you move those limitations further along, you alleviate the bottleneck and move it to a new point. More fast RAM can alleviate the bottleneck for 14 CUs, sorry.

#18 XBOunity
Member since 2013 • 3837 Posts

[QUOTE="XBOunity"]

[QUOTE="Nengo_Flow"]why are you so obsess with Playstation so much? moistsandwich

This is a pretty substantial leak. Read the slide, all that talk about 18 cu's used for rendering and how much power you would get by Tormentos has been put into question. After 14 cu's you get minor improvements, proof is in the pudding. diminishing returns which MSFT has said are actually real were laughed off. Now this is a SONY slide basically pointing to the same thing. Sorry, cows will be dcing for sure.

Thats a bit disappointing

Did you hear the news about the XB1? There are no hard numbers for now... but people working with the system have confirmed that different processes are working in the background thus reducing the amount of resources to run games. Who knows how substantial it'll be, but still not good news.

I'm not saying that the PS4 isn't more powerful than the XBO. It just seems that cows are unrealistic when it comes to the PS4: anything that suggests less than full potential, full usage of the hardware is met with absolute craze saying otherwise. Just look at NeoGAF right now, trying sooooo hard they are.


#19 NEWMAHAY
Member since 2012 • 3824 Posts

Lol at the 2012 date of the slide

and

"We understand that Sony found that using 4 CUs in GPGPU tasks can be more efficient and with better overall results than using all of them for rendering tasks, but they dont stop developers to use them only for rendering or GPGPU, as they wish depending on the game demands. Its not mandatory at all. Sony is only providing more resources to the developers."

from the article.


#20 Deevoshun
Member since 2003 • 868 Posts

Jaguar is crap. Even MS knows this, which is probably why they raised the clock on it.


#21 MlauTheDaft
Member since 2011 • 5189 Posts

The "balanced" part does say 14 CUs + 4 for compute.


#22 Gue1
Member since 2004 • 12171 Posts

Dude, they are just comparing Sony's decision to add 4 more CUs with MS disabling some of theirs to increase the MHz. You gotta love this self-ownage. :lol:
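For what it's worth, the trade-off being argued can be put in rough numbers, using the same back-of-the-envelope GCN peak-FLOPS formula; the clocks below are the figures that were floating around at the time (853MHz XB1 after the bump, 800MHz otherwise), not confirmed specs, and `gcn_tflops` is an illustrative helper:

```python
def gcn_tflops(cus: int, clock_ghz: float) -> float:
    # Peak FP32: 64 lanes per CU x 2 FLOPs per cycle (fused multiply-add)
    return cus * 64 * 2 * clock_ghz / 1000.0

xb1_boosted  = gcn_tflops(12, 0.853)  # Xbox One: 12 CUs after the clock bump (~1.31 TFLOPS)
ps4_balanced = gcn_tflops(14, 0.800)  # the slide's "14 for rendering" split (~1.43 TFLOPS)
ps4_all      = gcn_tflops(18, 0.800)  # all 18 PS4 CUs on rendering (~1.84 TFLOPS)
```

On paper, even the 14-CU "balanced" configuration comes out ahead of the upclocked 12-CU part; the whole argument is about whether the paper numbers survive the rest of the system.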


#23 XBOunity
Member since 2013 • 3837 Posts

Jaguar is crap even MS knows this and is probably why they highered the clock on it.

Deevoshun

I don't get why they both went with this CPU, to be honest.


#24 Netherscourge
Member since 2003 • 16364 Posts

Jaguar is crap even MS knows this and is probably why they highered the clock on it.

Deevoshun

You don't need a dedicated CPU for console gaming - the GPU does all the work.

All the CPU will do is load the game into the shared RAM for the GPU shell to handle.

On these consoles, the CPU is used more for UI and non-gaming stuff like streaming, DVR, decompression, etc... the actual game processing will be done with the GPUs.


#25 Chutebox
Member since 2007 • 50664 Posts

Lol at the 2012 date of the slide

and

"We understand that Sony found that using 4 CUs in GPGPU tasks can be more efficient and with better overall results than using all of them for rendering tasks, but they dont stop developers to use them only for rendering or GPGPU, as they wish depending on the game demands. Its not mandatory at all. Sony is only providing more resources to the developers."

from the article.

NEWMAHAY
Ouch, another hit for PNF.

#26 Shewgenja
Member since 2009 • 21456 Posts

This would have been before the GPU had 8GB to address.


#27 XBOunity
Member since 2013 • 3837 Posts

The "balanced" part does say 14 CUs + 4 for compute.

MlauTheDaft

Yes, and it also says minor improvements after 14 CUs, probably due to the CPU bottleneck. Basically you can still use 18 CUs for rendering, but you are gonna see a drop-off after 14, which is what Microsoft has suggested. This slide shows it clear as day. Now you will have cows try to muddy the waters and stir up a lot of confusion, but this is SONY's slide. NeoGAF is in a state of panic right now and it's laughable, calling ekim all types of names and insinuating all kinds of things. Proof's in the pudding.


#28 XBOunity
Member since 2013 • 3837 Posts

This would have been before the GPU had 8GBs to address.

Shewgenja

Lol, muddy the waters and try-hard. I think that's a stretch on your part. I do believe in that CPU being a bottleneck; they do exist, ya know.


#29 Chutebox
Member since 2007 • 50664 Posts

[QUOTE="MlauTheDaft"]

The "balanced" part does say 14 CUs + 4 for compute.

XBOunity

yes also says minor improvements after 14 cu's, probably due to the bottleneck of CPU, will see a minor improvement. Basically you can still use 18 cu's for rendering, but you are gonna see a drop after 14 cu's, which is what microsoft has suggested. This slide just shows it clear as day. Now you will have cows try and muddy the waters, and try to be alot of confusion, but this is SONYS slide. Neogaf is in a state of panic right now and its laughable, calling ekim all types of names and insinuating all kinds of things. proofs in the pudding.

Bud, you're ignoring literally everyone who proved you wrong. Just abandon now.

#30 XBOunity
Member since 2013 • 3837 Posts

[QUOTE="NEWMAHAY"]

Lol at the 2012 date of the slide

and

"We understand that Sony found that using 4 CUs in GPGPU tasks can be more efficient and with better overall results than using all of them for rendering tasks, but they dont stop developers to use them only for rendering or GPGPU, as they wish depending on the game demands. Its not mandatory at all. Sony is only providing more resources to the developers."

from the article.

Chutebox

Ouch, another hit for PNF.

After 14 CUs you see a drop, just like Microsoft suggested. Ouch, Chute, try and know what you are even debating. ((((((chute))))))


#31 Deevoshun
Member since 2003 • 868 Posts

[QUOTE="Deevoshun"]

Jaguar is crap even MS knows this and is probably why they highered the clock on it.

Netherscourge

You don't need a dedicated CPU for console gaming - the GPU does all the work.

All the CPU will do is load the game into the shared RAM for the GPU shell to handle.

AI, physics, background tasks, and more will have to be handled by the CPU. These GPUs are low- to mid-range; they can't do all that much more without taking a serious hit. The CPU will then become a bottleneck.

#32 XBOunity
Member since 2013 • 3837 Posts

[QUOTE="XBOunity"]

[QUOTE="MlauTheDaft"]

The "balanced" part does say 14 CUs + 4 for compute.

Chutebox

yes also says minor improvements after 14 cu's, probably due to the bottleneck of CPU, will see a minor improvement. Basically you can still use 18 cu's for rendering, but you are gonna see a drop after 14 cu's, which is what microsoft has suggested. This slide just shows it clear as day. Now you will have cows try and muddy the waters, and try to be alot of confusion, but this is SONYS slide. Neogaf is in a state of panic right now and its laughable, calling ekim all types of names and insinuating all kinds of things. proofs in the pudding.

Bud, you're ignoring literally everyone who proved you wrong. Just abandon now.

Proof is in the slides, buddy. I see people stretching for the 8 gigs of RAM even though that was never the bottleneck. Minor improvements after 14 CUs... what did the Killzone dev diary show? 14 CUs being used?


#33 XBOunity
Member since 2013 • 3837 Posts

[QUOTE="Netherscourge"]

[QUOTE="Deevoshun"]

Jaguar is crap even MS knows this and is probably why they highered the clock on it.

Deevoshun

You don't need a dedicated CPU for console gaming - the GPU does all the work.

All the CPU will do is load the game into the shared RAM for the GPU shell to handle.

AI, Physics, background tasks and more, will have to be handled by the CPU. These GPUs are low to mid range GPUs they can't do all that much without taking a serious hit. The CPU will then become a bottleneck.

At least you get it, and actually believe this weak CPU can be a bottleneck. I wish I could throw these theories at my rig when I don't get the performance I want because of bottlenecks, even though I have a 400-dollar GPU. lol


#34 Chutebox
Member since 2007 • 50664 Posts

[QUOTE="Chutebox"][QUOTE="NEWMAHAY"]

Lol at the 2012 date of the slide

and

"We understand that Sony found that using 4 CUs in GPGPU tasks can be more efficient and with better overall results than using all of them for rendering tasks, but they dont stop developers to use them only for rendering or GPGPU, as they wish depending on the game demands. Its not mandatory at all. Sony is only providing more resources to the developers."

from the article.

XBOunity

Ouch, another hit for PNF.

after 14 cu's you see a drop, just like microsoft suggested, ouch, chute try and know what you are even debating. [((((((chute))))))))

So now that you were proven wrong, you're just saying it's diminishing returns. That's fine, but you were wrong. And posting a slide from a year ago? You're better than that.

#36 XBOunity
Member since 2013 • 3837 Posts

[QUOTE="XBOunity"]

[QUOTE="Chutebox"] Ouch, another hit for PNF.Chutebox

after 14 cu's you see a drop, just like microsoft suggested, ouch, chute try and know what you are even debating. [((((((chute))))))))

So now that you were proven wrong you're just saying it's diminishing returns. That's fine, but you were wrong. And posting a slide from a year ago? you're better than that.

Chute, you don't even know what you are arguing, lol. You don't even know what a bottleneck is, judging by these replies.


#37 Jakandsigz
Member since 2013 • 6341 Posts

[QUOTE="Chutebox"][QUOTE="NEWMAHAY"]

Lol at the 2012 date of the slide

and

"We understand that Sony found that using 4 CUs in GPGPU tasks can be more efficient and with better overall results than using all of them for rendering tasks, but they dont stop developers to use them only for rendering or GPGPU, as they wish depending on the game demands. Its not mandatory at all. Sony is only providing more resources to the developers."

from the article.

XBOunity

Ouch, another hit for PNF.

after 14 cu's you see a drop, just like microsoft suggested, ouch, chute try and know what you are even debating. [((((((chute))))))))

2012???????????

#38 Netherscourge
Member since 2003 • 16364 Posts

[QUOTE="Shewgenja"]

This would have been before the GPU had 8GBs to address.

XBOunity

lol muddy the waters and try hard. i think thats a stretch on your part. I do believe in that CPU being a bottleneck. they do exist ya know.

The CPU is not a bottleneck on a PS4, since the PS4 runs games on a GPU shell and allows low-level API access, similar to what AMD is implementing in PC GPUs with their Mantle initiative: skipping right over OpenGL/DirectX and going straight to the GPU chipset itself.

If anything, the XB1's CPU will be a bottleneck, since it's running on a Windows 8 kernel and will require DirectX API access to the GPU for game processing.

I don't think many people have realized this yet. Microsoft has, which is why they tried to boost their Jaguar CPU clock a little to compensate. And then there's all that Kinect stuff that's gotta be processed, eating up CPU cycles.

Bottom line: the XB1 is more dependent on its CPU than the PS4 is.

But hey, keep ignoring the facts and dig up more slides from 2012 if it makes you happy.

;)


#39 Chutebox
Member since 2007 • 50664 Posts

[QUOTE="Chutebox"][QUOTE="XBOunity"]

after 14 cu's you see a drop, just like microsoft suggested, ouch, chute try and know what you are even debating. [((((((chute))))))))

XBOunity

So now that you were proven wrong you're just saying it's diminishing returns. That's fine, but you were wrong. And posting a slide from a year ago? you're better than that.

chute you dont even know what you are arguing. lol. you dont even know what a bottleneck is judging by these replies.

Riiggghhhtttt. At least you admitted to being wrong. That's a start.

#40 Chutebox
Member since 2007 • 50664 Posts

[QUOTE="XBOunity"]

[QUOTE="Shewgenja"]

This would have been before the GPU had 8GBs to address.

Netherscourge

lol muddy the waters and try hard. i think thats a stretch on your part. I do believe in that CPU being a bottleneck. they do exist ya know.

The CPU is not a bottleneck on a PS4, since the PS4 runs games on a GPU shell and allows low-level API access; similar to what AMD is implementing in PC GPUs in their Mantel initiative. Skipping right over OpenGL/DirectX and going straight into the GPU chipset itself.

If anything, the XB1's CPU will be a bottleneck since it's running on a Windows 8 Kernel and will require DirectX API access to the GPU for game processing.

I don't think many people have realized this yet. Microsoft has, which is why they tried to boost their Jaguar's CPU clock a little to compensate. And then there's all that Kinect stuff that's gotta be processed eating up CPU cycles.

Bottom line: The XB1 is more dependant on it's CPU than the PS4 is.

But hey, keep ignoring the facts and dig up more slides from 2012 if it makes you happy.

;)

It's what he does.


#41 XBOunity
Member since 2013 • 3837 Posts

[QUOTE="XBOunity"]

[QUOTE="Chutebox"] Ouch, another hit for PNF. Jakandsigz

after 14 cu's you see a drop, just like microsoft suggested, ouch, chute try and know what you are even debating. [((((((chute))))))))

2012???????????

Yep. Don't think much has changed; this is a slide from Sony. 14 CUs is recommended for rendering, not 18 like Tormentos was saying. lol, wonder where that guy is; that unlimited tap of powah shot to death by Sony themselves. I believe this slide is new; that's why the article exists and that's why NeoGAF is crashing with DC. A lot of egg on a lot of faces over there. I guess Microsoft does know what they are doing after all, and their claim that you don't get much performance gain from 12 CUs to 14 CUs is real as well. On paper, the 2 extra CUs for rendering should do a lot better than a 53MHz GPU boost; the tests showed otherwise. It's dat CPU bottleneck at work. These things are real, my friend. Untapped powah of PS4 :lol:


#42 XBOunity
Member since 2013 • 3837 Posts

[QUOTE="XBOunity"]

[QUOTE="Shewgenja"]

This would have been before the GPU had 8GBs to address.

Netherscourge

lol muddy the waters and try hard. i think thats a stretch on your part. I do believe in that CPU being a bottleneck. they do exist ya know.

The CPU is not a bottleneck on a PS4, since the PS4 runs games on a GPU shell and allows low-level API access; similar to what AMD is implementing in PC GPUs in their Mantel initiative. Skipping right over OpenGL/DirectX and going straight into the GPU chipset itself.

If anything, the XB1's CPU will be a bottleneck since it's running on a Windows 8 Kernel and will require DirectX API access to the GPU for game processing.

I don't think many people have realized this yet. Microsoft has, which is why they tried to boost their Jaguar's CPU clock a little to compensate. And then there's all that Kinect stuff that's gotta be processed eating up CPU cycles.

Bottom line: The XB1 is more dependant on it's CPU than the PS4 is.

But hey, keep ignoring the facts and dig up more slides from 2012 if it makes you happy.

;)

Oh, you have done testing on this? I would agree with you if the new VGLeaks article didn't say otherwise. Also, didn't the Killzone: Shadow Fall dev diary show only 14 CUs in use, as suggested by Digital Foundry? Or are they fanboys too? Just wondering.


#43 Human-after-all
Member since 2009 • 2972 Posts

looks like the system is balanced for 14 cu's. Kinda kills Tormentos since he kept touting 18 cu's all for rendering, but this slide shows otherwise. Dat Jaguar bottleneck is real I guess. Here is the link

http://www.vgleaks.com/playstation-4-balanced-or-unbalanced/

XBOunity
You can still use 18; it's a developer's choice.

#44 timbers_WSU
Member since 2012 • 6076 Posts

First, understand that the slide you posted (which this topic wasn't TOTALLY just ripped from NeoGAF, btw) has been available for months from VG Leaks, and was written during an era when the PS4 actually contained 4gb of GDDR5; 2012. I think we can all say that something changed since then. ;)

Shensolidus

Oh look, it is the so-called game developer!!!! This is typical for Sony. Everyone knows this. Sony never delivers on the final hardware. That does not mean it isn't better than the XB1 when it comes to its power; it is just Sony's way of doing things.....over-promising.


#45 Ribnarak
Member since 2008 • 2299 Posts
Yikes, self-ownage for the TC. At least you tried :lol:

#46 XBOunity
Member since 2013 • 3837 Posts

[QUOTE="XBOunity"]

looks like the system is balanced for 14 cu's. Kinda kills Tormentos since he kept touting 18 cu's all for rendering, but this slide shows otherwise. Dat Jaguar bottleneck is real I guess. Here is the link

http://www.vgleaks.com/playstation-4-balanced-or-unbalanced/

Human-after-all

You can still use 18, it's a developers choice.

Yes, agreed, but after 14 the performance drops, just like Microsoft suggested recently and just like the slide recommends. It cites minor improvements for rendering after 14 CUs. Also looks like 14 CUs for rendering is about 80 percent...

http://www.eurogamer.net/articles/digitalfoundry-inside-killzone-shadow-fall
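The "80 percent" figure does roughly match simple arithmetic on the CU split being argued (just an illustration, not anything from the linked article):

```python
# 14 of the 18 CUs on rendering as a share of total ALU resources
rendering_share = 14 / 18
print(f"{rendering_share:.0%}")  # prints "78%", i.e. roughly the "80 percent" quoted
```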


#47 timbers_WSU
Member since 2012 • 6076 Posts

[QUOTE="XBOunity"]

[QUOTE="Chutebox"] Boom goes the dynamite. OP fails againChutebox

if you only knew what you were talking about. lol the slides say it all. try reading the article instead of invading the thread. ((((Chute))))))

Is Ok to be upset. I would too if I invested so much time and effort into a piece of plastic

Your time is spent annoying others about their plastic preference, so how does that make you any better? Talk about games for a change.

#48 XBOunity
Member since 2013 • 3837 Posts

[QUOTE="Shensolidus"]

First, understand that the slide you posted (which this topic wasn't TOTALLY just ripped from NeoGAF, btw) has been available for months from VG Leaks, and was written during an era when the PS4 actually contained 4gb of GDDR5; 2012. I think we can all say that something changed since then. ;)

timbers_WSU

Oh look, it is the so-called game developer!!!! This is typical for Sony. Everyone knows this. Sony never delivers on the final hardware. That does not mean it isn't better than the XB1 when it comes to it's power, it is just Sony's way of doing things.....over promising.

Yes, the guy who said the UI of the Xbox One is not doing all the things it is intended to do. Like he has gotten a chance to see all the new features and the UI? Don't trust him as a source; the guy is a full-fledged cow, like all the other people trying to dispute a SLIDE that SONY created.


#49 Netherscourge
Member since 2003 • 16364 Posts

[QUOTE="Netherscourge"]

[QUOTE="XBOunity"]

lol muddy the waters and try hard. i think thats a stretch on your part. I do believe in that CPU being a bottleneck. they do exist ya know.

XBOunity

The CPU is not a bottleneck on a PS4, since the PS4 runs games on a GPU shell and allows low-level API access; similar to what AMD is implementing in PC GPUs in their Mantel initiative. Skipping right over OpenGL/DirectX and going straight into the GPU chipset itself.

If anything, the XB1's CPU will be a bottleneck since it's running on a Windows 8 Kernel and will require DirectX API access to the GPU for game processing.

I don't think many people have realized this yet. Microsoft has, which is why they tried to boost their Jaguar's CPU clock a little to compensate. And then there's all that Kinect stuff that's gotta be processed eating up CPU cycles.

Bottom line: The XB1 is more dependant on it's CPU than the PS4 is.

But hey, keep ignoring the facts and dig up more slides from 2012 if it makes you happy.

;)

oh you have done testing on this? would agree with you if the new vgleak didnt say otherwise. also didnt killzone dev diary for shadowfall show only 14 cu's in use suggested by Digital foundry? or they are fanboys too ? Just wondering.

I have no idea how many CUs any console is using. All I know is that the XB1 will be bottlenecked more by its CPU than the PS4 will, because of how it's designed to use its Win8 OS and DX API.


#50 GamingGod999
Member since 2011 • 3135 Posts

That's from 2012...