Do you think next gen will look like this?

#51 Posted by rzxv04 (237 posts) -

@pc_rocks said:
@vfighter said:

@HalcyonScarlet: Yeah...you're kinda way off.

How?

PS4 - the most powerful console in 2013 - was the equivalent of a 570 from 2010

PS4 Pro - released in 2016 - was still weaker than a 970 from 2014

X1X - released in 2017 - was barely faster than a 970 from 2014 and weaker than a 1060

Any console coming in 2020 will match a 1070 Ti at best, regardless of how many of AMD's TFLOPS it has, and history will again repeat itself.

On topic, not good. I liked it when character graphics were 'gamey'; the more realistic they try to get, the worse it looks and feels, maybe because of the uncanny valley.

Reminds me of how ugly I found 3D PS One games even when I was younger. Loved sprite-based games like Megaman X4, MM8, Puzzle Bobble, Pocket Fighter, MVC.

I think Omega Boost and MML2 were the only fully 3D PS games that looked good to me when I was younger.

PS2 got me pretty content with 3d models though.

Yeah, probably that uncanny valley. Weird though, I think I got it out of my system once I appreciated how hard real-time rendering is. I only notice it again when people bring it up.

#52 Edited by PAL360 (29286 posts) -

@PSP107 said:

@PAL360: "Nostalgia, the usual suspect."

Nostalgia? I think not. Games back then concentrated more on gameplay design. These days it seems publishers/developers are competing for best motion picture.

But if you're going to bring up nostalgia, I'll argue you're just caught in the moment. I mean, this comment made no sense:

"Nothing better than play games at 6fps, like Perfect Dark, Donkey Kong Country "

Personally, I'll still take those games over today's games, but you're judging old games based on current standards. How did you feel when you 1st played those games?

GTA San Andreas (2004) was a classic back then but wouldn't get high praise if reviewed today.

Of course I loved those games back in the day... and still do. In fact, some of my favourite games of all time are from the 16bit era. The thing is that I'm aware that nostalgia, and my personal experience as a kid and young teen, have a lot to do with it.

'Games back then concentrated more on gameplay design' Hmmm, no. Many praised games from the past wouldn't even pass as final products now, mostly because of dated design formulas. Do you think that a game like Virtua Racing, with one car and one track for 100 dollars, would be accepted by current standards? Or the PS2 version of Shadow of the Colossus running at 15fps? Or Fatal Fury, a fighting game with 3 playable characters? Or PS2 GTA games with their lock-on gunplay and terrible hit detection? Or the whole 'lives and credits' formula (created for arcades to eat coins) that attempted to make 8 and 16bit games look longer and more difficult?

There were always good and bad games, but the overall quality is much higher now. Games are more polished and better designed now, it's not even close!

#53 Posted by GarGx1 (10410 posts) -
@HalcyonScarlet said:

@blueberry_bandit said:
@HalcyonScarlet said:

No, it'll probably look a bit better than we have now. If the Xbone is like 1060, maybe next gen will be like a 1070.

No chance. Sony / Microsoft will not release a console with that small of a difference.

Yeah, I'm just not holding my breath any more. If you think it's going to be massive, you may be in for a surprise. Especially since they'll be aiming for around £350.

After 8 years of the Xbox 360 and PS3, the Xbone and PS4 are what we got, with the PS4 basically matching the GTX 750 Ti.

We'll see who's right, hopefully you guys are. The XXbone was a mid gen upgrade and MS could afford to sell that at £430+. But they launched the original Xbone at £430 and got spanked, so neither MS nor Sony will go there.

AMD's Navi GPU is expected to be approximately GTX 1080 levels at the top end. This is also the most likely chipset for the next console release. Whether they have the top end of Navi (hmmm at what cost?) or something less of course remains to be seen.

#54 Posted by PC_Rocks (1305 posts) -

@uninspiredcup said:

Don't want photo-realistic faces, can just look out the window. Games are more comparable to cartoons.

Look at the differences between the classic DMC games and that monstrosity Capcom revealed.

Real people are ugly bastards, playing video games gets you away from being forced to look at them.

That too!

#55 Posted by PSP107 (17257 posts) -

@PAL360: "Do you think that a game like Virtua Racing, with one car and one track for 100 dollars, would be accepted by current standards?"

lol, why are you judging based on current standards? Anyway, it's funny you went that route even though we have been in the dlc/mt era.

"Or the PS2 version of Shadow of the Colossus running at 15fps? Or Fatal Fury, a fighting game with 3 playable characters? Or PS2 GTA games with their lock-on gunplay and terrible hit detection?"

lol, you just literally judged old games based on current standards.

#56 Edited by PAL360 (29286 posts) -

@PSP107: Of course we do, and should, compare games by the same standards. I could try to rate 8th gen games by 1995 standards, but I would be forced to give many modern games scores of 50/10.

Like you said, standards are higher now, which is why games in general are much more polished.

'Anyway, it's funny you went that route even though we have been in the dlc/mt era.'

Well, that's because you probably think mainstream AAA games are all we have available. Missing the best this gen has to offer is entirely your fault, buddy.

#57 Posted by PSP107 (17257 posts) -

@PAL360: "Of course we do, and should, compare games by the same standards."

Your original 2004 opinion of GTA San Andreas should be your actual final judgement.

MGS1/2/3 were highly praised back in the day, but their camera/combat would be ripped apart if judged today.

#58 Posted by Jag85 (12224 posts) -

Is that real-time or pre-rendered?

#59 Posted by AJStyles (167 posts) -

@PSP107:

MGS1: Yes. It deserves criticism for the controls/camera. It's still playable, but aged. And no matter what haters say, The Twin Snakes is superior.

MGS2 is still playable and the controls are fine. Kojima had smart design for those games: MGS2 is spent mostly in tight corridors with a fixed camera, the levels/areas are small, and the controls are tight.

MGS3's levels are a bit more open, but again, it controls fine, especially the Subsistence version.

I know this because I re-beat these games every year. I don't find them jarring to play, unlike PS2 GTA games, which are unplayable to me now.

I downloaded the GTA collection on PS4 and I can’t believe how awful the controls and camera are. I don’t understand how I liked it 14+ years ago.

#60 Posted by PSP107 (17257 posts) -

@ajstyles: "I don’t understand how I liked it 14+ years ago."

Your original thoughts should be your actual judgment.

#61 Edited by PAL360 (29286 posts) -

@ajstyles: 'I downloaded the GTA collection on PS4 and I can’t believe how awful the controls and camera are. I don’t understand how I liked it 14+ years ago.'

Because standards change over time. I find it hard to enjoy most 3D games I used to love (2D pixel art, on the other hand, ages much better), and it's not because of graphics, that wouldn't be fair, but because of terrible gameplay mechanics.

That's why I can't take seriously people who say stuff like 'old GTA games are better than new ones'. That is blind nostalgia at its best.

#62 Posted by XVision84 (14256 posts) -

I'm guessing it'll be 2 gens after, but only time will tell. It's hard to say where technology will lead since the future is filled with surprises.

The Last of Us 2 footage blew me away in terms of its animation and graphical fidelity. I did not think we would reach this point in the PS4/Xbox One era.

#63 Edited by robert_sparkes (1499 posts) -

I remember GTA 3 on PS2 and being mesmerized by it. Playing it back then versus now is light years apart.

#64 Edited by Guy_Brohski (1856 posts) -

Expect that level of graphics in 2030 or 2035. The current trend in games isn't even about graphics, it's about competitive MP that runs very smoothly. PC gamers turn down settings so they can target 200+ fps.

#65 Posted by JohnWill03 (14 posts) -

lol

#66 Posted by ronvalencia (26338 posts) -

@warmblur: The next Dead or Alive game. :p

#67 Edited by ronvalencia (26338 posts) -

@pc_rocks said:
@vfighter said:

@HalcyonScarlet: Yeah...you're kinda way off.

How?

PS4 - the most powerful console in 2013 - was the equivalent of a 570 from 2010

PS4 Pro - released in 2016 - was still weaker than a 970 from 2014

X1X - released in 2017 - was barely faster than a 970 from 2014 and weaker than a 1060

Any console coming in 2020 will match a 1070 Ti at best, regardless of how many of AMD's TFLOPS it has, and history will again repeat itself.

On topic, not good. I liked it when character graphics were 'gamey'; the more realistic they try to get, the worse it looks and feels, maybe because of the uncanny valley.

At 4K, X1X beat both RX-580 and GTX 1060 in Far Cry 5

At 4K, X1X beat both RX-580 and GTX 1060 in Forza Motorsport 7

At 4K, X1X beat GTX 1060 and rivaled GTX 980 Ti / GTX 1070 in Killer Instinct Season 3

At 4K, X1X rivaled GTX 980 Ti / GTX 1070 in Gears of War 4

For raster graphics, TFLOPS is nothing without rasterization, ROPS read/write and ROPS fixed-function hardware.

[Embedded video: Vega 56 OC vs Strix Vega 64 OC benchmark]

The above example shows the lower-TFLOPS Vega 56 (12.2 TFLOPS at a 1710 MHz clock) beating the Strix Vega 64 OC (13.03 TFLOPS at 1590 MHz).

Pure compute workloads don't use rasterization, ROPS read/write or ROPS fixed-function hardware, hence the TMU read/write units are able to expose AMD's TFLOPS at levels that rival NVIDIA's. Both Vega 64 and the GTX 1080 Ti have similar TMU counts and TFLOPS, hence similar cryptocurrency results.

AMD is pushing the async compute/TMU read/write path instead, but ROPS cannot be avoided.

Vega 56 at 1710 MHz has higher rasterization, ROPS read/write and ROPS fixed-function performance than the Strix Vega 64 OC at 1590 MHz.

In Forza Horizon 4, Vega 64 LC at 1660 MHz rivaled the GTX 1080 Ti at 1440p.

From http://www.techspot.com/review/1716-forza-horizon-4-gpu-benchmarks/

Vega 64 hits memory bandwidth limits at 4K sooner than the GTX 1080 Ti's results suggest. There's a high probability Microsoft was heavily using the TMU read/write path and software tiled rendering via ROPS (manual micro-tile rendering bounded by L2 cache size, a programming optimization) for Forza Horizon 4.

A Vega MI60 at 1800 MHz would have nearly rivaled the RTX 2080 Ti in Forza Horizon 4.

AMD's TFLOPS are not the problem, i.e. the real problem is the ROPS read/write path's inability to expose the high TFLOPS.

TFLOPS arguments don't account for read/write units.

I would like to see Navi 12 XT with 40 CUs at 1900 MHz, since at that clock speed AMD's 64 ROPS would rival the RTX 2070's 64 ROPS. Higher clock speed = lower latency to completion.
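The TFLOPS vs ROPS point above is just unit count times clock on both sides, which is why a higher-clocked Vega 56 can out-fill a wider Vega 64. A quick sketch (the shader and ROP counts are the public specs for these cards, not figures from this post):

```python
# Back-of-envelope GPU throughput math.
# FP32 TFLOPS = 2 ops/cycle (FMA) * shader count * clock
# Pixel fillrate (Gpix/s) = ROPS * clock

def tflops(shaders, clock_mhz):
    return 2 * shaders * clock_mhz * 1e6 / 1e12

def fillrate_gpixels(rops, clock_mhz):
    return rops * clock_mhz / 1000

# Vega 56 OC: 3584 shaders, 64 ROPS at 1710 MHz
# Strix Vega 64 OC: 4096 shaders, 64 ROPS at 1590 MHz
print(round(tflops(3584, 1710), 2))   # ~12.26 TFLOPS (loses on compute)
print(round(tflops(4096, 1590), 2))   # ~13.03 TFLOPS
print(fillrate_gpixels(64, 1710))     # 109.44 Gpix/s (wins on fillrate)
print(fillrate_gpixels(64, 1590))     # 101.76 Gpix/s
```

Both cards have 64 ROPS, so at a ROPS-bound workload the only thing that matters is clock, and 1710 MHz beats 1590 MHz despite the TFLOPS deficit.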

#68 Posted by Steve5XG (85 posts) -

It looks interesting, but it still doesn't look human. I mean, that girl has to blink her eyes a little bit more. And what's more important: those graphics are only available to the few really big companies that can pay for them. I believe what developers want to create is video games based on fun gameplay.

#69 Posted by HalcyonScarlet (8208 posts) -

@ronvalencia said:

At 4K, X1X beat both RX-580 and GTX 1060 in Far Cry 5

At 4K, X1X beat both RX-580 and GTX 1060 in Forza Motorsport 7

At 4K, X1X beat GTX 1060 and rivaled GTX 980 Ti / GTX 1070 in Killer Instinct Season 3

At 4K, X1X rivaled GTX 980 Ti / GTX 1070 in Gears of War 4

I don't exactly trust MS's coding. Plus I heard some weird CPU choices were made in FM7: http://www.dsogaming.com/news/turn10-confirms-that-forza-motorsport-7-intentionally-maxes-out-only-one-cpu-core/

The game stuttered a lot too.

Plus what settings were they on? And was the XXbone native 4K?

#70 Posted by PC_Rocks (1305 posts) -

@HalcyonScarlet said:

I don't exactly trust MS's coding. Plus I heard some weird CPU choices were made in FM7: http://www.dsogaming.com/news/turn10-confirms-that-forza-motorsport-7-intentionally-maxes-out-only-one-cpu-core/

The game stuttered a lot too.

Plus what settings were they on? And was the XXbone native 4K?

Yup, and for Forza Horizon 4, DF even said the 1060 is more than enough to rival X1X across all modes.

#71 Edited by ronvalencia (26338 posts) -

@HalcyonScarlet said:
@ronvalencia said:
@pc_rocks said:
@vfighter said:

@HalcyonScarlet: Yeah...you're kinda way off.

How?

PS4 - the most powerful console in 2013 was an equivalent of 570 from 2010

PS4 Pro - released in 2016 was still weaker than 970 from 2014

X1X - released in 2017 was barely faster than 970 from 2014 and weaker than 1060

Any console coming in 2020 will match 1070Ti at best regardless of how many AMD's TFLOPS it has and the history will again repeat it self.

On topic, not good. I liked when character graphics were 'gamey' the more realistic they are trying to get the more 'bad' it looks and feel may be because of the uncanny valley.

At 4K, X1X beaten both RX-580 and GTX 1060 in Far Cry 5

At 4K, X1X beaten both RX-580 and GTX 1060 in Forza Motorsport 7

At 4K, X1X beaten GTX 1060 and rivaled GTX 980 Ti / GTX 1070 in Killer Instinct Season 3

At 4K, X1X rivaled GTX 980 Ti / GTX 1070 in Gears of War 4

I don't exactly trust MS's coding. Plus I heard some weird CPU choices were made in FM7: http://www.dsogaming.com/news/turn10-confirms-that-forza-motorsport-7-intentionally-maxes-out-only-one-cpu-core/

The game stuttered a lot too.

Plus what settings were they on? And was the XXbone native 4K?

At 4K, it's mostly GPU bound.

It's not the first time that the R9-390X (5.9 TFLOPS) has almost rivaled the GTX 1070 (~6.5 TFLOPS) and matched the GTX 980 Ti reference.

It's well known that DICE uses significant software tiled rendering via the compute/TMU path, and if the programmer is smart, the software tile size is adapted to the R9-390X's TMU read/write linked to its 1MB L2 cache.

X1X's version has a larger 2MB L2 cache and 2MB render cache, giving the programmer larger high-speed working boundaries before hitting a large performance drop (high external memory hit rates).

Killer Instinct Season 3 and Resident Evil 7 are other strong games for the R9-390X, i.e. it rivals the GTX 980 Ti and GTX 1070.

For Battlefield V, the R9-390X has a higher frame rate vs TFLOPS ratio compared to Vega 56 and Vega 64, i.e. 12 TFLOPS needs the 128 ROPS range instead of a 64 ROPS bottleneck.

Incidentally, 7nm Navi 12 has 40 CUs, which is close to the R9-390X's 44 CU config.

Scale from the R9-390X's 1050 MHz to 1800 MHz (e.g. the 7nm Radeon Instinct MI60's clock speed), add 256-bit GDDR6-12000 plus Vega 64's ROPS with 4MB L2 cache, and Navi 12 could land in RTX 2070 range. The R9-390X beat the unbalanced R9 Fury X.

Scale from the Fury X's 1050 MHz to Vega 64's 1536 MHz and you land on 44 fps, which is very close to the real Vega 56's 47 fps result. Something strange with HBM...

An R9-390X with the full Vega updates would render Vega 56 useless, e.g. R9-390X at 1536 MHz, Vega 64's ROPS with 4MB L2 cache, Polaris DCC, etc., but Navi 12 seems to be an update of Hawaii GCN.
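The "scale from 1050 MHz to 1536 MHz" estimates above are plain linear clock scaling. As a sketch (the 30 fps base figure is hypothetical, chosen only to illustrate the method; real results won't scale perfectly linearly):

```python
# Naive linear clock-scaling estimate: assumes frame rate scales 1:1 with
# core clock and nothing else (memory bandwidth, CPU) becomes the bottleneck,
# so it's an upper bound rather than a prediction.
def scaled_fps(base_fps, base_mhz, target_mhz):
    return base_fps * target_mhz / base_mhz

# Hypothetical 30 fps at the Fury X's 1050 MHz, projected to Vega 64's 1536 MHz
print(round(scaled_fps(30.0, 1050, 1536), 1))  # ~43.9 fps
```

When the projected number undershoots the real measured result (44 projected vs a real 47 fps on Vega 56 here), something other than core clock, such as the HBM behavior mentioned above, is contributing.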

#72 Edited by ronvalencia (26338 posts) -

@pc_rocks said:

Yup, and for Forza Horizon 4, DF even said the 1060 is more than enough to rival X1X across all modes.

FH4 toned down the excessive alpha effects compared to FM7. X1X FH4 at 4K doesn't have a floating frame rate beyond 30 fps.