Anyone concerned that 2GB of VRAM will become obsolete once the PS4 and Xbox One launch?

I'll probably upgrade to a 770 or 760 Ti in the next couple of months, but I'm starting to worry that 2GB of VRAM might become extremely limiting for next-gen games.

The PS4's GPU will have access to up to 7GB of GDDR5, and the Xbox One's will have access to 5GB of DDR3. Of course the GPU won't use all of that RAM, but who's to say that new games won't start using 3-4GB of VRAM?

I might wait and see if there will be a 4GB version of the 770.

*Stares at his CrossFire HD 7970s* Nope :)

While it's true the next-gen consoles are using more RAM/VRAM, it will be a few years before RAM/VRAM/GCN impacts us.

Speaking as a 2560x1600 gamer, the only game that maxed out my VRAM was Skyrim with mods. The only unmodified game that came close was Battlefield 3 on a 64-player map, at 2.24GB of VRAM. I just want to see someone with a Titan max out Skyrim with all the mods to see if he/she can surpass 4GB of VRAM :)

But don't worry about it. Nvidia's Maxwell is around the corner, and when it arrives, AMD and Nvidia will definitely have an answer for the PS4 and Xbox One.
 
I am another in the same camp as Parja.

I am confused by this whole question; since when does VRAM directly result in massive performance gains? Generally there are very meager gains going from 2GB to 4GB of VRAM (excluding the crazy custom texture mods for certain games), and assuming either the XBone or PS4 will target high-res gaming (not just 4K video playback) ignores the fact that they are effectively ~7850-level hardware in terms of GCN compute-unit counts. While it is true the consoles benefit from low-level hardware access, a lot of that benefit comes from lower latencies on texture accesses, meaning you can actually get by with LESS memory because the penalty for updating the texture buffer more often is reduced.

There is no magic that will work around the architecture to provide 4K gaming, and nothing currently available suggests these "next gen" consoles will surpass top PCs, so why worry about future-proofing against something already far below a GTX Titan?
 
I would assume the XBone will have to use its 32MB of ESRAM for GPU buffers if they want any performance at all, so that's quite limited memory there. The DDR3 is a bit faster than the average PC's, I guess, but...
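To put a rough number on how limited 32MB is (my own back-of-envelope assumptions, not official figures), a single 1080p RGBA8 render target is about 8MB, so a typical deferred setup barely squeezes in, if at all:

```python
# Back-of-envelope: does a 1080p deferred G-buffer fit in 32MB of ESRAM?
# Assumptions (mine): RGBA8 targets (4 bytes/pixel) and a 32-bit depth buffer.
MB = 1024 * 1024
target_mb = 1920 * 1080 * 4 / MB  # ~7.9MB per full-screen target

for num_targets in (3, 4, 5):          # G-buffer color targets; +1 for depth
    total = target_mb * (num_targets + 1)
    verdict = "fits" if total <= 32 else "does NOT fit"
    print(f"{num_targets} targets + depth: {total:.1f}MB -> {verdict}")
# 3 targets + depth: 31.6MB -> fits (barely)
# 4 targets + depth: 39.6MB -> does NOT fit
```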

The PS4 is different, but for games reaching the PC, the lowest-common-denominator rule says the XBone will probably be that LCD in most cases.

Don't get me wrong, as I said earlier, I'm all for more memory. I want 4GB in my next card.

Remember, the days of using video memory only for textures and the frame buffer are behind us, so trying to deduce usage from resolution alone is misguided, IMO.
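A quick sketch of what I mean (rough assumptions on my part: 32-bit color, 32-bit depth, double buffering):

```python
# The "classic" buffers (swap chain + depth) are tiny next to 2GB of VRAM.
MB = 1024 * 1024

def classic_buffers_mb(width, height, surfaces=3):  # 2 color buffers + depth
    return width * height * 4 * surfaces / MB

for w, h in ((1920, 1080), (2560, 1600), (3840, 2160)):
    print(f"{w}x{h}: ~{classic_buffers_mb(w, h):.0f}MB")
# 1920x1080: ~24MB, 2560x1600: ~47MB, 3840x2160: ~95MB.
# Everything else (textures, geometry, shadow maps, streaming caches)
# is what actually eats VRAM, and it doesn't scale with screen resolution.
```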
 
It's not the VRAM you should be worried about, it's the system RAM; those who only have 2-4GB of system RAM will need at least 8GB when next-gen console ports start pouring in.


I don't see PS4 and Xdone games using a lot of VRAM, maybe 2GB at most, but they will be filled to the max with other game data.
 
It's not the VRAM you should be worried about, it's the system RAM; those who only have 2-4GB of system RAM will need at least 8GB when next-gen console ports start pouring in.


I don't see PS4 and Xdone games using a lot of VRAM, maybe 2GB at most, but they will be filled to the max with other game data.

System RAM shouldn't be a worry at all; it's easily and cheaply upgradable should you need more. VRAM, on the other hand, is not so cheap.
 
Anyone concerned that 2GB of VRAM will become obsolete once the PS4 and Xbox One launch?

Not when you consider that the RAM in the Xbox and the PS4 is the entire system RAM, while PCs have had 8-16GB of system RAM for years now.

Why does everyone assume so much of the RAM is going to the GPU?
 
Consoles are going to be locked at 1080p for a few more years, because of TVs.

The only real way for developers to make their games better in that situation is to keep upping the texture quality, and since they have a lot of RAM to work with, it'll most probably happen.

So there's a chance consoles will surpass the PC (for a short time), since the average PC doesn't have much more than 1GB of video RAM. Of course that all depends on the consoles being able to handle the much higher texture detail, processing-wise.
 
Isn't the PS4/Xbox One console memory, which is shared between CPU and GPU, GDDR5 memory?
 
I'm not worried. Remember, the memory in the consoles is shared, so a lot of it is going to be running the OS, that Share service on the PS4, the Kinect, parties, voice, etc. It's not ALL going into graphics.

Bingo. This.


You'll be fine for a while.
 
Isn't the PS4/Xbox One console memory, which is shared between CPU and GPU, GDDR5 memory?

Not sure how much is rumor and how much is genuine, but the Xbox One will use 8 gigs of DDR3, while the PS4 gets 8 gigs of GDDR5. Current console ports (which, let's be honest, are most of the big titles right now) are designed to run well on the least-common-denominator hardware: X1950-level graphics with 512 megabytes of shared memory. Now we have roughly 7790/7850-ish graphics hardware with 16x the memory, so yeah, assuming a game is designed for a console with those specs, a 2-3 gig card is gonna be holding things back one way or the other.
 
To use more than 2GB of memory, you would need to be using ridiculously high-resolution textures (~4K+) and high levels of anti-aliasing (like supersampling AA).

Or many more textures
Or higher resolution geometry
Or higher resolution shadow maps
Or larger game worlds
Or a huge reduction of load times as you can cache a lot of things
Or better animations
etc.
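To put rough numbers on just the texture side (assumptions mine, not from the thread: RGBA8 base data, a DXT-style 4:1 block-compressed format, full mip chain):

```python
# Rough texture-memory math.
MB = 1024 * 1024

def texture_mb(size, bytes_per_pixel=4, compression=1, mips=True):
    base = size * size * bytes_per_pixel / compression
    return base * (4 / 3 if mips else 1) / MB   # full mip chain adds ~33%

for size in (1024, 2048, 4096):
    raw = texture_mb(size)
    dxt = texture_mb(size, compression=4)       # DXT5-style 4:1
    print(f"{size}x{size}: ~{raw:.0f}MB raw, ~{dxt:.0f}MB compressed")
# 4096x4096: ~85MB raw, ~21MB compressed. Even compressed, a couple hundred
# unique 2K textures resident at once is already north of 1GB.
```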
 
Both consoles are supposed to ship with 8GB.
The PS4 has 7 free.
The Xbox One has 5 free.

Yes, I expect that by the end of 2014 or mid-2015, more VRAM will be desperately needed on the PC side.
 
XBone GPU: 12 GCN compute units for a total of 768 shaders, 800MHz, ~1.2 teraflops

[image: Alfred E. Neuman, "What, me worry?"]
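For what it's worth, that ~1.2 teraflops figure is just the standard GCN arithmetic (assuming 64 ALUs per compute unit and 2 FLOPs per ALU per clock for fused multiply-add):

```python
# Where the ~1.2 TFLOPS figure comes from, assuming standard GCN counts.
compute_units = 12
alus = compute_units * 64              # 768 shader ALUs
clock_ghz = 0.8                        # 800MHz
tflops = alus * 2 * clock_ghz / 1000   # 2 FLOPs per ALU per clock (FMA)
print(f"{tflops:.2f} TFLOPS")          # ~1.23
```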
 
As soon as you hit the max on your VRAM, you will stutter and lag beyond playability. It won't be enjoyable. People act like games are still going to be made for old hardware. The hardware of the new consoles is years ahead of the old consoles, and running a 7850 video card in a console is not the same as running it in a PC. They are going to use the extra memory if the console has it; caching and other things will fill it up. People with 2GB cards will be obsolete as soon as these new games come out. 3GB might be almost maxed or already maxed. 4GB might be the norm for a while until they really crank it up, probably to around 5-6GB or even 7GB on the PS4 later on. Consoles right now are maxed out; developers are maxing out the hardware on those older consoles.

If they are maxing out the consoles right now, just think what happens when the new consoles are maxed out. You will need similar VRAM capacity.
 
As soon as you hit the max on your VRAM, you will stutter and lag beyond playability. It won't be enjoyable. People act like games are still going to be made for old hardware. The hardware of the new consoles is years ahead of the old consoles, and running a 7850 video card in a console is not the same as running it in a PC. They are going to use the extra memory if the console has it; caching and other things will fill it up. People with 2GB cards will be obsolete as soon as these new games come out. 3GB might be almost maxed or already maxed. 4GB might be the norm for a while until they really crank it up, probably to around 5-6GB or even 7GB on the PS4 later on. Consoles right now are maxed out; developers are maxing out the hardware on those older consoles. [EDIT] This goes hand-in-hand with PCs being vastly more powerful than current consoles, yet games still running like shit because of a lack of optimization, not inferior hardware. [/EDIT]

If they are maxing out the consoles right now, just think what happens when the new consoles are maxed out. You will need similar VRAM capacity.


Except games that take FULL advantage of the hardware are not going to start coming out for at least the first year, maybe even two, which gives people until roughly 2015. By then Nvidia's Volta stacked-DRAM GPUs will start making their rounds, promising insanely high memory bandwidth and higher capacities than we can even imagine now. People don't even realize how the DDR4 standard plans to push capacities to levels we haven't seen even today. People forget Gears of War stressed what everyone thought were the limits of the 360 when it dropped, yet with each version they managed to squeeze out more and make it look vastly more incredible than the last. Optimization is key, not raw power. That is what makes consoles just as decent as some PCs with less hardware.

This is the nature of how things work. When the Xbox 360/PS3 generation came out with unified shaders, the entire PC world shook in its boots; then came the 8800 series and the 9800 series, and by the GTX 200 series we were yawning. That all happened roughly 3 years after the launch of those consoles. As it is right now, 2GB is becoming the norm, and that will only get bigger. As far as RAM usage goes, do people not realize games have settings, lol? They won't drop a console game and then insta-port it to the PC without making some changes to the textures if consoles do start shipping with higher standards.

The fear-mongering is absurd. You'd think it was Y2K again. These specs were pretty much confirmed nearly a year ago, and people are just now starting to freak out?
 
As soon as you hit the max on your VRAM, you will stutter and lag beyond playability. It won't be enjoyable. People act like games are still going to be made for old hardware. The hardware of the new consoles is years ahead of the old consoles, and running a 7850 video card in a console is not the same as running it in a PC. They are going to use the extra memory if the console has it; caching and other things will fill it up. People with 2GB cards will be obsolete as soon as these new games come out. 3GB might be almost maxed or already maxed. 4GB might be the norm for a while until they really crank it up, probably to around 5-6GB or even 7GB on the PS4 later on. Consoles right now are maxed out; developers are maxing out the hardware on those older consoles.

If they are maxing out the consoles right now, just think what happens when the new consoles are maxed out. You will need similar VRAM capacity.

Considering they didn't fully exploit the PS3 until Uncharted 2 came out, it's not likely to be an issue for a while. On top of that, the first couple of batches of games will still have the PS3 and 360 in mind, and before we even take that timeline into consideration, it's widely speculated the consoles themselves aren't coming until the holiday season.

Most PC gamers won't still be running the hardware they have today by the time 2GB of VRAM becomes a limiting factor @ 1080p. In other words, much ado about nothing.

Oh, and you're dreaming, or just don't understand what RAM is, if you think games will ever use 7GB of VRAM on the PS4. It's as if people see GDDR5 and forget about other things. The PS4 has 8GB of unified RAM, not VRAM. There's the OS, and the game itself is going to need RAM in addition to graphical data. If you think the combination of the OS and game is going to use a mere 1GB and leave 7GB of RAM for the frame buffer, prepare to be disappointed.
 
2GB of VRAM is already borderline; just tone down the AA & texture quality... not very [H]ard, but still workable.
 
2GB of VRAM is already borderline; just tone down the AA & texture quality... not very [H]ard, but still workable.
As of right now it's not borderline at all for the cards that have it, since they aren't playable at the settings that would need that much VRAM anyway.
 
Wait and see. It will be interesting to see what kind of "ports" we will be getting. You would think we would benefit from the new consoles using mid-range PC parts.
 
The developers of Killzone: Shadow Fall (a PS4 launch title) released a post-mortem of the demo; it used 4.5GB of RAM, ~1.3GB of that on textures alone.
 
The developers of Killzone: Shadow Fall (a PS4 launch title) released a post-mortem of the demo; it used 4.5GB of RAM, ~1.3GB of that on textures alone.



Again, though, that harkens back to what I've already mentioned: console-exclusive titles will probably be the only thing pushing the boundaries in the immediate future, because they only have to worry about one specific platform and thus one set of hardware. Killzone: Shadow Fall will have no effect whatsoever on other games for the PS4/Xbox One or PC.

Most developers already have to balance their titles between the PS4 and Xbox One. As it stands right now, I think their biggest concern is the roughly 33% shader deficit between the two platforms. That's probably going to lead to crappier cross-platform titles than would otherwise be possible, or some really badass console exclusives.
 
Considering they didn't fully exploit the PS3 until Uncharted 2 came out, it's not likely to be an issue for a while.

Except that now they are working with x86, not Cell. There will be almost zero ramp-up time to spectacular games. I'm getting a PS4. MS really dropped the ball.
 
I'll wait until the E3 show next week to decide which console to get.
 
Good article on Eurogamer about future-proofing for the PS4:

http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen

In terms of pure processing power, the chances are that we now have the horsepower to exceed the first and second generation games seen on next-gen consoles. But what still isn't addressed to a satisfying degree is the question of on-board video RAM. Both the Microsoft and Sony machines use 8GB of RAM with fast access to the GPU. We're currently living in a world where even a £400 GeForce GTX 680 only ships with 2GB - and that's a worry.

"I think we can assume that most games will use a majority of the 8GB for graphics resources, so I'd go for as much GDDR5 on the GPU as possible," says Avalanche's Linus Blomberg.

"For the CPU I'd say at least 8GB DDR3, depending on how much stuff you'll have running in the background. But this is a tricky one! In Avalanche Studios' upcoming titles we'll use a lot of tricks that take advantage of the unified memory layout. But on high-end GPUs there will be ways of compensating for that, to some extent at least."

The quote is from a dev at Avalanche Studios, the developers of Just Cause 2.

And again, similar to the CPU recommendations, we see consensus from all of our sources on how to best future-proof your PC in this respect - buy a graphics card "with as much memory as you can afford".
 
BF4 and CoD will still be coming to PC, but expect CoD to be only on Xbox One for a while; it might be some time before it reaches the PS4 and PC. MS is planning to spend $1 billion on timed exclusives and buying out some developers.

A lot of games might not come to PC at all, so yeah, M$ is one money-hungry band of criminals.

M$ lacks first-party developers, so this is their strategy: to monopolize gaming.

It makes me sick to my stomach, actually.

I don't really think this is true. The beating MS is taking in the desktop OS world from Apple, and their failure to get into mobile, suggest they will be doing more to encourage people to use the OS; trying to block games from making it to the PC would be suicide. A ton of PC users ONLY use Windows because of games, and if not for games they would skip to Macs or Linux.

As to the OP: it could be an issue, but if you are really worried about optimized games, why aren't you just plainly asking whether you should stay away from Nvidia? For a while now AMD has offered 50% more VRAM on lots of high-end GPUs, and on top of that it seems likely that future games will be optimized for those GPUs and not Nvidia's, right? And none of that will matter if lots of ports are just crappy anyway; it might not matter how much VRAM you have or which camp you are in.
 
I don't really think this is true. The beating MS is taking in the desktop OS world from Apple, and their failure to get into mobile, suggest they will be doing more to encourage people to use the OS; trying to block games from making it to the PC would be suicide. A ton of PC users ONLY use Windows because of games, and if not for games they would skip to Macs or Linux.

I'd be willing to bet the number of PC users in this category is small enough for MS not to even care about, actually.
 
Are you kidding me? MS might actually try to block a lot of console games from running on Windows? That would be an all-time low for them. If they did, I guess their thinking would be "let's try to force PC gamers to buy an Xbox One." I for one would go PS4 exclusively if that happened.

Figures keep floating in, though, that Microsoft is spending $1 billion (yes, billion with a B) on 15 exclusives for the Xbox One launch or near-launch lineup (7 of which are new franchises). If they hope to recoup that billion in sales, they will need to move an awful lot of units, and they will have a lot more trouble convincing people to buy an Xbox One if all the Xbox One 'exclusives' are available on PC.

I do expect a lot of paid exclusives to be Xbox One-only, not 'console-only' but excluding the PC as well. I also expect the response from Razor1911 and whoever else will be to 'crack' Xbox One games, which are designed to run on off-the-shelf x64 hardware, to work on PCs.
 
I really, really doubt the PS4 or Xbox One will use all its RAM as VRAM. 1080p simply isn't that demanding, not to mention their memory is shared. Computers have upgradable memory; I'm rocking 16GB on my rig... granted it's system memory and only DDR3, but it won't hold anything back.
 
I'd be willing to bet the number of PC users in this category is small enough for MS not to even care about, actually.

I think that number is higher than you think. There are a lot of PC gamers who use Windows just to game on.

Guess we'll find out for sure once Valve releases their Linux-powered hardware. ;)
 
I believe 1GB will become obsolete when cross-platform next-gen titles start arriving.

And anyone spending more than £150 (AMD 7870/Nvidia 660) ought to be considering a 3GB card if they are buying this summer.
 
MS really dropped the ball.

I hear people say this all the time and think 'how?'. They have already announced spending $1 billion on 15 exclusives, including 7 new franchises, and then a bunch of commenters come out with 'XBOX DOESN'T CARE ABOUT GAMES', 'Xbox dropped the ball on games', etc.

I am still a bit unsure about hardware, though. I do believe the PS4's hardware has an edge, but the Xbox 360's hardware (graphics card) had an edge over the PS3, giving most multiplatform titles slightly better graphics on the Xbox 360, with more AA, clearer textures, faster media read speeds, etc.

That didn't stop most hardcore PS3 fans from refusing to buy an Xbox 360 because of the 'exclusives' they felt were 'magical', a la 'Uncharted! R:FoM! FF and RPG stuff!'. Not to mention, part of the PS3's goal was to push Blu-ray as the next 'DVD replacement' format over Toshiba's HD-DVD. Interestingly, nobody in the PS3 camp really harped on Sony's Blu-ray drive in the PS3 being slower than the DVD drive in the Xbox 360, which forced data redundancy on discs in order to speed up read times (i.e. halving the effective size or worse). People seem to forget that the PS3 had a HUGE media focus.

It seems that Microsoft and Sony have almost reversed circumstances, with Microsoft going for slightly weaker hardware but far more exclusive game content and a bit of multimedia focus. Strangely, that focus last generation didn't kill the PS3 at all, and sales-wise both consoles were quite competitive. I don't expect the Xbox One to crash and burn. I do think the Xbox One reveal should have had much more of a games focus rather than leaving everything for E3, or the reveal should have been delayed a month, closer to E3. Once gamers see a focus on games from both consoles, they'll probably flock to whichever console seems to have the most games, or the most games fitting their play style.
 
I think that number is higher than you think. There are a lot of PC gamers who use Windows just to game on.

Guess we'll find out for sure once Valve releases their Linux-powered hardware. ;)

I'd say just about every PC gamer uses Windows; that wasn't even my argument. I'm saying the number of users who would move to Linux if not for gaming is quite small.
 
"And again, similar to the CPU recommendations, we see consensus from all of our sources on how to best future-proof your PC in this respect - buy a graphics card "with as much memory as you can afford".

Except that's forgetting the fact that future-proofing is a stupid waste of money. If you want to know how to do something stupid, listen to this guy. But it's better to avoid doing something stupid in the first place.

Someone who can afford a $500 video card today, in order to get that precious 4GB of VRAM instead of the standard 2GB, can more easily afford a $400 video card with 2GB of VRAM today plus another $400 video card in 2 years, when games that can actually use more than 2GB of VRAM exist. In 2 years the standard video card will come with 4-6GB of VRAM anyway.

On the other hand, are you seriously going to recommend that someone who can't afford it get "as much memory as you can afford" when the most affordable option with 4GB of VRAM is over $400? It's just fucking retarded.

The guy you quoted is giving idiotically poor advice. Everyone here would be better off getting whatever 2GB video card they can afford now, and then whatever standard-memory video card they can afford when they upgrade again in 2 years. The standard memory size will be enough. Trying to future-proof your video memory now, as if you're going to keep the same video card for 5 more years, is a stupid waste of money, because you're not going to keep the same video card for 5 more years.
 
...you're not going to have the same video card for 5 more years.

Interesting. Most of last month was spent sending defective GTX 460 2Wins back to EVGA (they finally got me a working 660 Ti), but I was still able to game at reasonably high settings at 1920x1200 with my trusty *gasp* GTX 280 (2008).
 
Interesting. Most of last month was spent sending defective GTX 460 2Wins back to EVGA (they finally got me a working 660 Ti), but I was still able to game at reasonably high settings at 1920x1200 with my trusty *gasp* GTX 280 (2008).

You proved his point. You used the card in a pinch; you haven't used it as your primary gaming card for 5 years. He didn't say most people can't save old hardware for 5 years. I have an HD 4850 that can also run games at 1200p, but that doesn't mean I use it as my primary card.
 