mastershake575's forum posts

Avatar image for mastershake575
mastershake575

8574

Forum Posts

0

Wiki Points

24

Followers

Reviews: 0

User Lists: 0

#1 mastershake575
Member since 2007 • 8574 Posts

@BassMan said:

2070 Super is good as it stays around the same price as the regular 2070 and the performance boost is decent.

Big fan of the 2070. The review listed it as only 7% slower than the 2080 while costing $200-250 less.

I like how AMD will deliver just enough pressure that the 2070 should stay near MSRP.

#2 mastershake575
Member since 2007 • 8574 Posts

@sakaixx said:

If leaks are true ain't no way PS5 and shitbox is mid tier pc. Its pretty high end stuff for price of sub $500

Incorrect. If it had released in early 2018, then maybe you could have argued for the higher end, but the reality is that the new mid-tier hardware coming out this year will be faster than it, and then you have a FRESH batch of even faster mid-tier hardware coming out next year before the console release...

@goldenelementxl said:

The 2060 Super is a good lower mid-range GPU. The rumors swirling around now state the PS5 and Xbox whatever are going to hit the GTX 1080 range. The 2060 Super smokes the GTX 1080. Benchmarks

So the RTX 2060 Super, 2070, 2070 Super, 2080, 2080 Super, 2080TI all beat the rumored next gen GPU specs... So will next gen even really be "mid tier" competitive?

Agreed. Not only that, but the consoles aren't coming out till the end of 2019, so you'll have even faster mid-tier hardware coming out next year as well from both AMD and Nvidia, plus the consoles will struggle to keep up with even a modern mid-tier CPU due to the significantly lower clock rates and possible lack of HT.

#3 mastershake575
Member since 2007 • 8574 Posts

@crimson_v said:

Benchmarks don't lie. Maybe what he meant was that when their games are compiled with their MT Framework engine for PC, they need a Pentium D 840 to run equivalently to a Xenon; if not, he was simply wrong. The benchmarks are what they are.

Like I said, I am going to take the word of a senior engineer over literally anybody on this board. There is literally no proof that the Xenon was anything below mid-tier in 2005, and the Xenos was borderline high-end at the time. This will NEVER happen again in a console.

#4  Edited By mastershake575
Member since 2007 • 8574 Posts

@crimson_v said:

If you want accurate information on tech, listen to tech journalists, not game developers.

Sorry, but I'm not gonna argue with someone who literally programs on the Xenon for a living. I'll try to find the exact presentation, but the presentation in question was their head engineer presenting their MT Framework engine. His exact words were that it's comparable to a Pentium D 840, which was a good CPU in Fall 2005 (hell, in 2009, 4 years after the Xbox 360 launch, my Pentium D was still running most games on high settings with a mid-tier 9600GT, which was basically a rebrand of the 2007 mid-tier 8600GTS).

#5 mastershake575
Member since 2007 • 8574 Posts

@crimson_v said:

1. Capcom is not a tech journalism outlet; I don't really value their opinion on this subject.

2. The 7850 was an OK GPU in 2013; I wouldn't call it trash. The 7790 might have been too low-end for when the Xbox launched.

3. "An 8 thread disadvantage"? Do you reckon they'll go with 6 cores? Just curious.

1. Capcom is a tier 1 developer. I would take their word over literally anybody on this entire website. One of their developers even went onto Beyond3D and broke it down, going as far as to say it equals a single core of an i7 920 at the same clock (a single i7 920 core at a similar clock surprisingly puts it at Pentium D 840 performance, surpassing any CPU below $200 at the time). The Xenon was above decent; for its time it was superior to both the gen before it and the gen after it.

2. In relation to the previous two gens it was mediocre at best and kind of a letdown. The 7850 was at very best okay, and the 7790 was 100% low-end.

3. They're gonna focus on upscaled 1440p and 4K, so the CPU won't be nearly as big an issue. Most developers don't necessarily like the Jaguar CPU, but they like the core ratio (1-2 cores dedicated to the OS/background apps, and the remaining cores focused solely on gaming). I could 100% see them doing this again; not only do developers like it, but it makes 100% sense with the lower CPU overhead needed and a monster increase in IPC from last gen.

#6 mastershake575
Member since 2007 • 8574 Posts

@crimson_v said:

1. And I understand why you'd think it was a decent CPU for its time.

2. If I remember correctly, the PS4's GPU performed between a 7850 and a 7870, which cost about $150 around the PS4's launch, so it was a good enough GPU in my opinion, definitely not trash. (The Xbox One was obviously slightly worse than this, I'm aware.)

3. My main point was that the CPUs won't be that far behind this gen. I mean, the Bobcat CPUs in the consoles were MANY TIMES slower than the then-current Haswell CPUs. Sure, in 1-2 years mid/high-end CPUs might be 1.4-1.8 times better (in both quad-core and multi-core performance), but the console CPUs won't be as far behind as in the previous two gens.

1. Not to sound arrogant, but that statement wasn't me pulling random garbage out of a hat. I was following hardware extensively at the time. Even Capcom themselves said it was comparable to a high-clocked Pentium D 840. All the high-clocked Pentium Ds were over $200 at the time, the mainstream Athlon X2s weren't even out yet (the cheapest one available was the X2 3800+, which was over $350), and the single-core Athlons that could beat it (3500 series and above) were all above $250 at the time...

Calling the Xenon decent for its time is an understatement if anything.

2. The PS4 was a slightly downclocked 7850, and the Xbox One was comparable to a budget/entry-level 7790. If the consoles had come out in 2012 that would have been decent, but releasing in 2013 really put a damper on it. In 2013 the new mid-tier cards (280, 280X, GTX 760) tanked the value of the GTX 660 Ti, 7950, and 7870 XT.

It basically made all three of those cards (all of which are 2x faster than the consoles) lower mid-tier pricing-wise (getting smoked by lower mid-tier hardware is semi-embarrassing, especially in comparison to the previous gen of consoles).

3. I agree that it won't be as far behind (the current-gen consoles are freaking slower clock for clock than the Phenom series from 2009), but it's still going to be at very best lower mid-tier by 2018 standards. The reality is it's gonna be at least a 1GHz and 8 thread disadvantage versus a 2017 1st-gen Ryzen CPU, which I can literally go across the street and buy for lower mid-tier pricing right now ($129 at my local Microcenter).

That's with two more generations of CPUs being released between now and then as well...

#7 mastershake575
Member since 2007 • 8574 Posts
@crimson_v said:

We haven't had a console release that could match mid-tier PC hardware since the original Xbox. The PS4 and Xbox One had decent GPUs but shitty CPUs from the Bobcat family, with half the clock speed and IPC of CPUs of that time period, resulting in them not being able to push 60fps in most games regardless of settings. The same thing happened with the PS3 and Xbox 360: their CPUs were a hot garbage fire with in-order execution and long pipelines. Both Cell and Xenon sacrificed IPC for clock speed, causing them to run hot while not delivering much performance; it was basically a repeat of Intel's NetBurst but worse. Cell even sacrificed die space for SPEs, which couldn't handle the tasks they were supposed to on their own, and they still had to use a dGPU. So despite these consoles having great GPUs (the Xbox 360 at the time of release had one of the best GPUs out there), they struggled to push 60fps in any game.

On the other hand, both the PS5 and Xbox Scarlett will have 6-8 Zen 2 cores with great IPC running at 2.3-3.0 GHz, which will definitely be able to finally push 60fps. On the CPU side, the only bottleneck besides the clock speed will be the cost-cutting measure of using VRAM as system RAM, which adds some hefty latency, but despite all that the CPU will manage to push 60fps in a large number of demanding games. The GPU will be decent no doubt, and there's even talk of them finally using SSDs.

In my opinion the only real worry is the price. In terms of quality, this will be the best time period to game on a console since the OG Xbox. Your thoughts?

Interesting thread. Just a few things I wanted to note:

1. The Xbox 360 at launch was actually pretty decent for its time. The GPU was equivalent to a $400 card at the time, and it had unified shaders, which PC didn't even get until the next year. The Xenon wasn't high-end, but the reality was that in Fall 2005 there wasn't a desktop CPU faster than it for under $200. You could actually argue that the 360 era will be the very LAST time we ever see hardware that high-end in a console.

2. I agree about the PS4/Xbox One using shitty CPUs, but I disagree about them using decent GPUs (their GPUs were actually trash compared to the desktop lineup at the time). Literally the week the consoles came out I bought the Tahiti version of the 7870 (the 7870 XT) for $135, and on stock volts it offered 2x the performance of the consoles' GPU. Last generation the 360 offered near-$400 GPU performance; this generation the Xbox One/PS4 literally offered half the performance of a $135 card... (that's embarrassing).

3. The reality is the PS5 and Xbox Scarlett will be at very best mid-tier by 2018 standards. They won't come out till almost 2020, so A LOT is going to change. Both AMD and Nvidia will have two more generations of mid-tier cards by then. CPU-wise, I can go to my local Microcenter and buy an 8-core, 16-thread, 3.8GHz 1700X for lower mid-tier prices TODAY, and that CPU will be faster than the PS5's. When the consoles come out in 1.5 years, their CPU will most likely be closer to "low-end" than "mid-tier".

#8 mastershake575
Member since 2007 • 8574 Posts

Have to wait for reviews, but I thought the all-core out-of-the-box speeds would be a little faster.

Not a super big deal, since I didn't think the IPC increase would be that large, plus it will still undercut Intel prices significantly.

#9  Edited By mastershake575
Member since 2007 • 8574 Posts

@horgen said:

I'm hoping for 4.5GHz. I don't know who started the rumour about hitting 5GHz. I will be very surprised if it does.

A lot of these rumors have been getting ahead of themselves.

The reality is that if these chips hit an all-core boost of 4.4-4.6GHz with a 5-7% increase in IPC and undercut Intel prices (which the previous two lines did), then they will be best sellers, especially since the rumor is that each product line is getting at least a 2-core increase over the previous gen.

#10  Edited By mastershake575
Member since 2007 • 8574 Posts

Clock rates, core counts, and prices are incredible (the rumored prices seem too good to be true, to be honest).

Add on top of that at least a 5% increase in IPC, and you have a must-buy product.