Possible RTX 3080 Ti FE?

I haven't been able to verify this particular tweet, so please don't take it as gospel, but it would be interesting if this is indeed true:

RTX 3080 Ti FE:
PG133-SKU15,
GA102-250-KD-A1,
20GB GD6X,
the same FP32 count as 3090 (10496 FP32),
the same MEM speed and TGP as 3080,
no NVLINK.

Source:

https://twitter.com/kopite7kimi/status/1323785556417863680

Also mentioned on several tech websites, but still referencing the above tweet:

https://www.techradar.com/news/nvid...gpu-with-20gb-of-vram-to-take-on-amd-big-navi

https://wccftech.com/nvidia-geforce-rtx-3080-ti-20-gb-graphics-card-specs-leak/
 
It doesn't make any sense for Nvidia to release another card when the GPU performance difference between the 3080 and the 3090 is only 10-15% already. Anything above the 3080 would pretty much negate the reason for the 3090 to exist, outside of the extra VRAM for non-gaming workloads.
 
In most games it's been under 10% on average. It'd be a really tight spot to fill, and there's a big price gap between the two.
 

Still doesn't make any sense to me, at least anytime soon. Maybe early next year they can do another "Super" refresh and double the VRAM on current models for a modest price increase.
 
They must be panicking with the RDNA2 launch. If AMD's slides are to be believed, the 6800 XT/6900 XT are great competitors. If a 6900 XT matches a 3090, Nvidia only has two choices: release a 3080 Ti with more memory, or cut the price of the 3090. I see the former being more realistic.
 

I guess... but the other issue is that they'll just be releasing another card that no one can buy for the foreseeable future. I'm still looking to replace my 2080, but I'm not going to bother using a bot to do it, or care enough to pay over MSRP for a card.
 

A year from now seems much more likely to me, but since I don't have a 3080 - and don't anticipate having one anytime soon - I'm not going to lose sleep over it.

Previous rounds of the rumor mill were that multiple AIBs were making 20GB 3080 cards, followed by a rumor that Nvidia had vetoed releasing them.

What's effectively a 20GB 3090 seems even less likely unless they're having a lot of memory controller failures.
 

And yet, that's what you're going to get: a card that performs similarly to, or slightly slower than, a 3090, with less memory, for $500 less. It's basically how every x080 Ti release has gone.
 

That's not just the rumor mill... there were actual marketing slides out in the wild. I think Nvidia canceled them because they don't want to invest in Samsung GPUs at this point, so they are going to respin Ampere on TSMC with higher clocks and double the memory as soon as feasible (next summer?).

So they roll with the punches until then, come back with higher-clocked cards with double the memory, and then see how AMD responds.
 
Nvidia can leak and release as many different variations as they want. Won't mean a damn thing if they can't even release the cards to the public, since for some reason they can't produce enough cards.

All they are trying to do is stop AMD's thunder with the incoming launches.
 

Too late, AMD is here to stay!
 
I think NV is reacting to the new AMD cards that all have 16GB - even with slightly lower performance, the larger space for textures, especially 4K textures, is going to be a big deal for people. This makes me suspect it is more about getting out a card with more RAM than anything.

Same price point as the 6900, right?
 
Not possible - probable. Nvidia got their rep smeared by AMD and by games using more than 10GB at 4K. The only questions now are: how deep is your pocket? And how much are you willing to sacrifice to the Green Eyed Monster?
 
Which games require more than 10GB at 4K? (and not just allocation, I'm talking about crashing out / huge performance dips while the pagefile is used for texture loading). Justice League with aftermarket texture pack and...?
 
Godfall's developers just recently announced that the game will use 12GB at 4K.
 

I'm a 4K gamer. Godfall is the game I've been anticipating the most this year. They say 12GB is potentially needed at the highest texture setting. Sucks too, because there's no way in hell I can land an AMD card in the next week. :(
 
Oh God! NO! Not TWO games! (one of which was coded with AMD's help and isn't released yet and hasn't been tested)

Man, setting textures down a notch with (likely) no discernible difference in quality is gonna REALLY suck for the plebs with 11GB of VRAM or less for those (maybe) two games!
 

Confused on what point you're trying to make here.
 
The point is that there is 1 badly optimized game with an aftermarket texture pack that uses more than 10GB of VRAM, and reportedly another that was made with input from AMD that *might* need more than 10 as well with everything maxed at 4K.

And that's with a setting that very likely doesn't noticeably impact visual quality when you take it down a notch (as seen over and over in in-depth game reviews, the most recent being Watch Dogs Legions).

However, I don't really care - I just laugh as games allocate (not need) more, and more, and more VRAM on my 3090 while people piss and moan, worrying about VRAM allocation (and not actual in-use VRAM).
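
For anyone who wants to check this on their own machine, here's a minimal logging sketch, assuming the pynvml bindings (the nvidia-ml-py package) are installed. Note that NVML reports framebuffer memory the driver has allocated, not the game's actual working set, so a full-looking readout on its own doesn't settle the allocation-vs-in-use question.

import time
import pynvml

# Minimal sketch: log GPU memory once per second while a game runs.
# NVML's "used" figure is allocated framebuffer memory, not what is
# actively being touched each frame.
pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"{time.strftime('%H:%M:%S')}  "
              f"{mem.used / 2**30:.2f} GB / {mem.total / 2**30:.2f} GB allocated")
        time.sleep(1.0)
except KeyboardInterrupt:
    pass
finally:
    pynvml.nvmlShutdown()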
 

I haven't seen anyone pissing and moaning... just wondering at the decision not to future-proof their cards. Especially since we have an alternative that is substantially to drastically lower in price with near-equal performance. Plus, people look for whatever reason they can to justify/compare their purchases with video cards - from power draw to VRAM to physical size.
 
Oh man, have I seen it a LOT. But I troll (in the good way) EVGA forums, and to a lesser extent the GPU forums here.

You wouldn't believe the amount of speculative bitching that people spout (and it's where I found out about that one game with the aftermarket texture pack that runs like shit at 4K with a 10 gig VRAM card). Or how people continually don't understand VRAM allocation vs. in-use.
 

You would have to be the ultimate fanboy to spend $699 on an Nvidia card with 10GB only to set textures down a notch when you can get a 16GB AMD card for $50 less and maximize textures. I’m sorry, but if I’m going to shell out $700, I expect to max everything out on everything I want to play, end of story, at least within the first few months for heaven’s sake.
 
The point is that there is 1 badly optimized game

Who says it’s badly optimized? Just because it requires more memory than Nvidia decided you’d need? You are aware that quite a few developers have come out and said VRAM requirements are moving higher, right?
 
Microsoft Flight Simulator: >10GB -> you start seeing hitching when VRAM is not enough; I've seen over 14GB used
Crysis Remastered: >10GB -> stalls, stops, reduced frame rate for a period of time then back to normal -> extreme draw distances with subsequent objects/textures/shaders needing to be in VRAM
Doom Eternal: >8GB
Watch Dogs Legions: >8GB without the texture pack
Godfall: >12GB?

So with the RTX 3080, expect not to run max settings in a number of games. Hell, playing GTA V now at 1440p it is almost at 10GB as it is, and that is a very old game. Some games do allocate more than needed; others allocate what is needed and suffer if VRAM is insufficient. RT adds an additional 1.5-2GB+ of VRAM requirements, and we have games already pushing 8GB+ without RT. HDR adds about 0.5GB in my experience. If you want something that exceeds the capability of console graphics, which actually have access to more than 10GB of VRAM, then you will need more on your GPU to allow higher-resolution textures, shaders, higher level of detail, more RT, etc. 12GB would be the minimum for a high-end card, and it will most likely still have issues with VRAM down the line. 16GB seems about right; 24GB, while overkill, is even better.
 
Crysis remastered is a steaming pile of what not to do for a remake. Even the best computers in the world have problems with that piece of crap (mainly since it uses exactly 2 CPU threads, and bottlenecks the living shit out of powerful GPUs).
MSFS? That's the first I've heard of that, especially since I watched several reviewers use it with a 3080 and/or 3090 who said nothing of the sort. You sure that it's not the streaming of textures from the internet that is causing that?
Doom Eternal? Why bring that up? 8GB? We are talking about a 10GB card.... right? (of note, I have played it at 4K max settings on a 2080 super 8GB (highly overclocked and watercooled), and the only slowdown in FPS that I ever saw was that kinda-cutscene with the giant dude walking past (to around 40 IIRC), and only when it was walking past - everywhere else was butter smooth).
Watch Dogs Legions? It requires 8.73GB at 4K per the settings with HD texture pack, and used up to 11.7GB during the benchmark on my 3090 (allocation vs in-use). 3080s use all 10GB, but aren't dropping to single digit FPS while textures are loaded from the pagefile (allocation vs in-use).
Godfall? Not out yet, but we'll see. I will be surprised if the visual quality changes enough to notice from the max setting (12GB "required") to the next setting down (which may, or may not, require 10GB or less). This is based on in-depth reviews showing that texture quality from the highest setting to the next highest doesn't really produce a reduction in quality in recent games even when you smash your face next to the monitor to get your eyes really close (watch dogs legions being the latest example that I've seen).
GTA V? I've not tried to load that game up in... 5 years? I'd bet you good money that you're talking about VRAM allocation vs in-use again. I'll see if I can load it up and tell you what it goes up to for me at 4K max settings. (edit - couldn't, it went unresponsive when I changed from 1080P to 4K... I deleted it and redownloaded, but it freezes while loading the benchmark. I did, however, notice a setting that upscales the image under advanced graphics settings - I could use up to 15GB the game said... lol. However, that's rendering the image at a much, much higher resolution and then downsampling to match your screen res, and not natively rendering).

To the 1.5 to 2GB needed for RT - I don't think so. At least Legions shows 700MB at 4K between off and on (any setting), and Metro Exodus uses almost exactly 1GB more at 4K with ultra RT. But you could be right on a few games, I've not seen it though.

As to HDR using any memory at all - ha! No. I just tried turning it on and looking at VRAM usage, then turning it off and looking again, in the desert area of Metro Exodus. Guess how much VRAM usage changed? 0MB. Same for Horizon Zero Dawn (this game allocates more VRAM, so you can't look around when switching it on and off to check, as it will load and offload textures that the game thinks might be needed / not needed).
Who says it’s badly optimized? Just because it requires more memory than Nvidia decided you’d need? You are aware that quite a few developers have come out and said VRAM requirements are moving higher, right?
Sorry, any console port to me is automatically a bad console port until I see otherwise - and for the record, all superhero games are blah to me, and always will be. I've read that it only slows to a crawl when you apply that aftermarket texture pack.

An example of a good console port that turned to shit, for example, is Horizon Zero Dawn. It used to be a good port until this last patch that caused increased instant travel loading times, game crashes, and of all fucking things texture pop in. The previous patch worked fine, so now it's a shitty console port (that used almost 14GB of VRAM at 4K for allocation, not all in-use). It grinds my gears because I played the game for two weeks and I was at the very end of the game when this patch came along and turned it into shit.

As to games needing more VRAM in the future - color me shocked. I thought that my old 64MB Voodoo 5 5500 AGP could do it all! (that's sarcasm)
You would have to be the ultimate fanboy to spend $699 on an Nvidia card with 10GB only to set textures down a notch when you can get a 16GB AMD card for $50 less and maximize textures. I’m sorry, but if I’m going to shell out $700, I expect to max everything out on everything I want to play, end of story, at least within the first few months for heaven’s sake.
Some folks prefer the software that Nvidia offers (ShadowPlay, that green screen streaming thing, etc). Personally I use Nvidia because AMD has been absolute shit the few times I've tried them since the heady days of my old ATI 4870X2. I tried two 6990s, but they didn't work with my high-end Dell DVI monitors. I later got a couple of 7970s and one immediately, and I mean immediately, died. No, AMD has to make up for a lot of shit before I go whole hog with them (and I'm not saying that it won't happen, they just have to PROVE it before I jump over for high-end gaming again). Edit - I forgot about a laptop that I had a 7970M 2GB in - it was actually pretty darn good for a laptop at the time.

On top of that, you have to be an ultimate fanboi to call other people ultimate fanbois without actually waiting for reviews.

Again, this is why I've said that people erroneously freak out over 8GB not being enough. 10GB not being enough. Soon it will be 12, etc. VRAM allocation vs in-use is really a thing, and people scream, brow-beat, and yammer about it all the time. Doesn't make them right*

*except in very, very, very few situations that usually revolve around aftermarket texture packs
 
To find out if a game is just allocating more than it needs or is actually running out of VRAM, frame time data is very useful and revealing: rapid swings over short periods of time, compared to being smooth at lower resolutions or settings. I don't see many in the review community doing this. (See the quick sketch at the end of this post.)

When MSFS 2020 launched it had erratic frame times on cards with 8GB or less, even with modest settings. Of course, multiple updates have occurred which optimize this. I had MSFS up past 13GB on a Vega FE, and while slow, the frame times were smooth. With lesser settings, pushing 8GB on the 5700 XT, the frame times were erratic until settings were greatly reduced. The Nvidia 1080 Ti had no issues, so is this just drivers/game software, or VRAM, or all of it combined? I concluded VRAM had a significant bearing, since the same drivers were used for the 5700 XT and the Vega FE.

As for GTA V, when the allocated amount goes above your VRAM, it runs like shit with stutters, so I think its numbers are relatively accurate.

As for Doom Eternal, reviewers have noticed that using max textures at 4K can give 8GB cards problems, even without everything else on max settings.

As for increased VRAM requirements when using ray tracing, games are pushing over 8GB without even using ray tracing now:

https://developer.nvidia.com/blog/rtx-best-practices/
Q. How much extra VRAM does a typical ray-tracing implementation consume?
A. Today, games implementing ray-tracing are typically using around 1 to 2 GB extra memory. The main contributing factors are acceleration structure resources, ray tracing specific screen-sized buffers (extended g-buffer data), and driver-internal allocations (mainly the shader stack).

As for HDR, do you have 10-bit turned on in your drivers, or 8-bit, or are you using dithering? Shadow of the Tomb Raider is where I got those numbers. I may have to recheck. Nvidia does some funky stuff with HDR, while AMD just does it right. In addition, what is frustrating with Nvidia is that when using HDR, G-Sync on a FreeSync monitor is not available -> AMD has no problem running both HDR and FreeSync at the same time. HDR-capable monitors with a G-Sync module don't have this problem.

So you based your opinion that AMD is shit on your experience with a card launched 8 years ago or longer - gotcha. Well, I could say the same about the 8800 GTX in the initial stages of Vista - pure shit, crash after crash -> all of which has zero bearing on current drivers and hardware, and we don't yet know about the next generation from AMD. Then you said something about fanboys. I just recommend looking at reviews, people's experiences, etc. for any product before buying, whoever made it.
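
Here is one way that frame-time check could look in practice - a minimal sketch, assuming a PresentMon-style CSV capture with an MsBetweenPresents column (the column name and the capture.csv path are assumptions; adjust them to whatever your capture tool exports). Repeated spikes of several times the median frame time are the signature of VRAM thrashing, as opposed to mere over-allocation.

import csv
import statistics

def analyze(path, column="MsBetweenPresents", spike_factor=3.0):
    # Read per-frame times (in milliseconds) from a PresentMon-style capture.
    with open(path, newline="") as f:
        frames = [float(row[column]) for row in csv.DictReader(f)]
    median = statistics.median(frames)
    worst_1pct = sorted(frames)[int(len(frames) * 0.99)]  # roughly the 1% worst frame
    spikes = sum(1 for ft in frames if ft > spike_factor * median)
    print(f"frames: {len(frames)}, avg {statistics.mean(frames):.2f} ms, "
          f"median {median:.2f} ms, ~1% worst {worst_1pct:.2f} ms, "
          f"spikes over {spike_factor}x median: {spikes}")

analyze("capture.csv")  # hypothetical path to an exported capture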
 
Well, since you wrote all that and didn't mention anything about the two tools that might just tell you in-use vs allocation, I'll leave you to your own devices.

Good luck to you in the future, and good bye.
 

Have you considered that DLSS 2.0 (I suspect) lowers the VRAM needs of games that support it?

The smart thing to do is wait for reputable reviewers to cover that game with both brands of GPU before coming to a conclusion.

As far as a 20GB 3080 Ti goes, I think it was known before launch that one would be coming. They needed something to fit in between the 3080 and the Titan (3090) pricing. How soon we will get it, who knows - Ti cards have released alongside the x80 (2080) or come 10 months later (1080). If there is good competition, sooner rather than later would be a good guess.
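
Rough back-of-the-envelope support for the DLSS point: only the screen-sized render targets scale with the internal resolution, while textures and geometry don't, so any VRAM saving is partial. The bytes-per-pixel figure below is an illustrative assumption, not a measured number.

# Rough sketch: shrink of screen-sized buffers when rendering internally at
# 1440p (DLSS "Quality" for a 4K output) versus native 4K. The 150 bytes/pixel
# for combined G-buffer/post-processing targets is an assumed, illustrative figure.
native_px = 3840 * 2160
internal_px = 2560 * 1440
bytes_per_pixel = 150

native_mb = native_px * bytes_per_pixel / 2**20
internal_mb = internal_px * bytes_per_pixel / 2**20
print(f"native 4K buffers: ~{native_mb:.0f} MB")
print(f"1440p internal:    ~{internal_mb:.0f} MB")
print(f"difference:        ~{native_mb - internal_mb:.0f} MB "
      f"(texture memory is unaffected)")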
 

Have YOU considered that hardly any games support DLSS 2.0? DLSS isn't a feature. It's a perk in the handful of games that support it.
 
DLSS is buffoonery and trickery.

Voodoo magic.

It's like claiming one of those engines that stops and recranks at every red light saves gas.
 
This thread seems to have derailed quite a bit, but to the OP and the actual thread topic: I think if the 3080 Ti comes to fruition it will either be a shame to the Ti variants OR another scenario like Kepler. Here's why:

With the 3080 and 3090 already set in stone, yet the performance gap so narrow between the two, there is hardly any breathing room for a 3080 Ti to make any kind of sense. UNLESS it's simply a 20GB 3080 with maybe an overclock, OR it has increased VRAM and shaders, making it faster than even the 3090. Sounds a lot like the 780 vs 780 Ti vs Titan mess. Nvidia pissed off a lot of customers who bought the Titan only to get a slap in the face with the 780 Ti.
 

Or Nvidia could have just used more than 10GB of VRAM. I'm not sure why you're on such a mission to convince everyone it's perfectly fine when there are several titles now that have demonstrably shown 10GB is the bare minimum for high-resolution gaming, and insufficient in some cases. Meanwhile, AMD shows up with 16GB. It really is OK to admit Nvidia screwed up here. Don't worry, I won't tell Jensen.
 
I see you didn't bother to read the article either.

Good luck to you in the future, and good bye.
 