GeForce GTX 1080 Ti Evaluation Thoughts

The 1080 Ti will be a hair faster than the Titan XP, and the Titan XP is a whopping 80% faster than the 980 Ti, so I'm not sure why some of you are acting like it's a measly 40-50% difference. Even when both the 980 Ti and the 1080 Ti are overclocked, the 1080 Ti should be around 70-75% faster.

https://www.techpowerup.com/reviews/NVIDIA/Titan_X_Pascal/24.html
It does seem so; my binary compute module suffered a cosmic-ray inversion.
I only expect around a 150-200 MHz overclock on the 1080 Ti, though; its starting clock will be quite high.
The much lower-clocked (standard) 980 Ti has a lot more headroom.
I expect the 980 Ti will claw back around 15% once overclocked.

I don't mind being wrong, 'cos I'm getting one :)
 
Yes, for God's sake, don't run Ryzen for the 1080 Ti evaluation. We want to see a real man's gaming results. :D


clint-eastwood-nodding.gif
 
The 1080 Ti will be a hair faster than the Titan XP, and the Titan XP is a whopping 80% faster than the 980 Ti, so I'm not sure why some of you are acting like it's a measly 40-50% difference. Even when both the 980 Ti and the 1080 Ti are overclocked, the 1080 Ti should be around 70-75% faster.

https://www.techpowerup.com/reviews/NVIDIA/Titan_X_Pascal/24.html

I really don't know what you've been reading, but the review you linked states that the Titan XP is around 40-45% faster than the 980 Ti at 1440p and 4K, and even less at lower resolutions. That is probably compared to a reference 980 Ti, which is maybe 10-20% slower than a well-overclocked one. The 980 Ti was so great specifically because it overclocked well. The Pascals seem to overclock worse by comparison, or maybe they are already clocked less conservatively than the 980 Ti was.

Now, 40-45% is of course a pretty big improvement: enough to push 4K over 60 fps with conservative AA options, and it will run 1440p at max with very high framerates all day. If the 1080 Ti manages to be 50% faster than the 980 Ti, that is pretty awesome too.

I am a bit worried about European pricing. The over-800€ launch prices for the 1080 were hard to swallow for what was a slightly-over-20% speed bump over an overclocked 980 Ti. Even now they seem to sell for around 600-800€, with the top end reserved for well-known vendors like Gigabyte, Asus, and MSI. That might be without any price drops in effect. If the GTX 1080 Ti ends up costing over 1000 euros, I might pass and wait for Volta instead.
 
I really don't know what you've been reading, but the review you linked states that the Titan XP is around 40-45% faster than the 980 Ti at 1440p and 4K, and even less at lower resolutions. That is probably compared to a reference 980 Ti, which is maybe 10-20% slower than a well-overclocked one. The 980 Ti was so great specifically because it overclocked well. The Pascals seem to overclock worse by comparison, or maybe they are already clocked less conservatively than the 980 Ti was.

Now, 40-45% is of course a pretty big improvement: enough to push 4K over 60 fps with conservative AA options, and it will run 1440p at max with very high framerates all day. If the 1080 Ti manages to be 50% faster than the 980 Ti, that is pretty awesome too.

I am a bit worried about European pricing. The over-800€ launch prices for the 1080 were hard to swallow for what was a slightly-over-20% speed bump over an overclocked 980 Ti. Even now they seem to sell for around 600-800€, with the top end reserved for well-known vendors like Gigabyte, Asus, and MSI. That might be without any price drops in effect. If the GTX 1080 Ti ends up costing over 1000 euros, I might pass and wait for Volta instead.
You are not reading the charts correctly. For example, the chart shows the 980 Ti delivering 56% of the Titan X's performance at 4K, which makes the Titan X roughly 80% faster at that resolution (1 / 0.56 ≈ 1.79).
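Since this mix-up keeps coming back, here's a quick sanity check of the chart math (plain Python; the 0.56 is the 4K figure from the TechPowerUp summary page linked above):

```python
# TPU's summary charts normalize every card to the fastest one (100%).
# "980 Ti shows 56%" means it delivers 56% of the Titan X's performance,
# so the Titan X is 1/0.56 - 1 ~= 79% faster -- not 44% faster.

def percent_faster(relative_share: float) -> float:
    """Convert 'B is X% of A' into 'A is Y% faster than B'."""
    return (1.0 / relative_share - 1.0) * 100.0

print(round(percent_faster(0.56), 1))  # 78.6 -> the ~80% quoted above
```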
 
Please, for the love of god, when you do your 4K benchmarks, DO NOT RUN AA. We want to see 4K ultra results without AA, because NO ONE runs AA at 4K. So many reviewers include AA in their benches, which doesn't give accurate results.
 
Please, for the love of god, when you do your 4K benchmarks, DO NOT RUN AA. We want to see 4K ultra results without AA, because NO ONE runs AA at 4K. So many reviewers include AA in their benches, which doesn't give accurate results.

I tend to agree there. I think you could show a cutoff with a color gradient or other demarcation on the same graphed bar to show where the AA limit is. I'd actually like it to go further than that: in one of my replies here I said I'd rather see separate bars graphed for a 100 fps-Hz average target for high-Hz monitor use, with the actual graphics settings being the graphed variable bars/sliders, and also a 60 fps-Hz average target for people interested in that.
An example is below; not actual values, just thrown together as an example.

o6MkNGL.png
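Something like this rough matplotlib sketch, maybe (hypothetical games and made-up settings tiers, just like the mock-up image):

```python
# Sketch of the "settings as the variable" idea: for each game, plot the
# highest settings tier (0 = Low ... 4 = Ultra) that sustains a given
# fps-Hz average target. All values are invented for illustration.
import matplotlib.pyplot as plt
import numpy as np

games = ["Witcher 3", "GTA V", "Fallout 4"]  # hypothetical entries
tier_at_100fps = [2, 3, 3]                   # tier sustaining ~100 fps-Hz avg
tier_at_60fps = [4, 4, 4]                    # tier sustaining ~60 fps-Hz avg

y = np.arange(len(games))
plt.barh(y - 0.2, tier_at_100fps, height=0.4, label="100 fps-Hz target")
plt.barh(y + 0.2, tier_at_60fps, height=0.4, label="60 fps-Hz target")
plt.yticks(y, games)
plt.xticks(range(5), ["Low", "Med", "High", "V.High", "Ultra"])
plt.xlabel("Max settings tier sustaining the target")
plt.legend()
plt.show()
```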
 
Please, for the love of god, when you do your 4K benchmarks, DO NOT RUN AA. We want to see 4K ultra results without AA, because NO ONE runs AA at 4K. So many reviewers include AA in their benches, which doesn't give accurate results.

I run max AA at 4k XD
 
How would you feel about a mixed review, some 1440p, some 4K?

I feel that even if we do include some 1440p, 4K really needs to be represented to challenge the video card graphically. Some games aren't much of a challenge until you try to run them at 4K.

I would still love to see 1440p. There are games out there, like Watch Dogs 2, where even on a 1080 you would have to turn down graphical features to hit a smooth frame rate at 1440p. I'm really curious to see whether the Ti lets you crank all the settings up in most modern games and still maintain 60+ fps.
 
I really don't know what you've been reading, but the review you linked states that the Titan XP is around 40-45% faster than the 980 Ti at 1440p and 4K, and even less at lower resolutions. That is probably compared to a reference 980 Ti, which is maybe 10-20% slower than a well-overclocked one. The 980 Ti was so great specifically because it overclocked well. The Pascals seem to overclock worse by comparison, or maybe they are already clocked less conservatively than the 980 Ti was.

Now, 40-45% is of course a pretty big improvement: enough to push 4K over 60 fps with conservative AA options, and it will run 1440p at max with very high framerates all day. If the 1080 Ti manages to be 50% faster than the 980 Ti, that is pretty awesome too.

I am a bit worried about European pricing. The over-800€ launch prices for the 1080 were hard to swallow for what was a slightly-over-20% speed bump over an overclocked 980 Ti. Even now they seem to sell for around 600-800€, with the top end reserved for well-known vendors like Gigabyte, Asus, and MSI. That might be without any price drops in effect. If the GTX 1080 Ti ends up costing over 1000 euros, I might pass and wait for Volta instead.

EU pricing? We just have prices that always include VAT of around 20% (depending on the country), unlike US prices (they also get taxed, but it works very differently). I pre-ordered one from Nvidia for 824€. Without VAT that's... 687€. Not much more than $699 (660.5€).
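For anyone who wants to check the VAT math, a quick sketch (assuming a flat 20% rate; the actual VAT varies by country):

```python
# Back out the pre-tax price from an EU retail price that includes VAT.
eu_price = 824.00  # EUR, VAT included (the pre-order price above)
vat_rate = 0.20    # assumed flat 20%; actual VAT varies by EU country

ex_vat = eu_price / (1 + vat_rate)
print(f"ex-VAT: {ex_vat:.2f} EUR")  # 686.67 -> the ~687 EUR above,
# vs. the $699 US MSRP (~660.5 EUR at the exchange rate in the post)
```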

Also, the improvements over a 980 Ti seem bigger in many cases than what you are saying. In this review https://www.techpowerup.com/reviews/NVIDIA/Titan_X_Pascal/ look closely at Witcher 3, RotR, and GTA V, for example.

At 1440p or 4K, the Titan X is frequently almost twice as fast as the stock 980 Ti. That is insanely good. In the same way, you can find instances where a 1080 is around 50% faster than a 980 Ti rather than just the average 20-30%. Architecture improvements, software, whatever. No matter how much you can overclock a 980 Ti, it's older tech ;)
 
As I guess 1600p would be out of the question, I would like to see a 180-degree wraparound of three (curved?) 3440x1440 screens in landscape, and three 4K screens in portrait.
 
I just need a card to run my triple S2417 monitors at 80-100 fps. Am I asking too much of the 1080 Ti?

I have one S2417, and I'm probably getting 2 more to rebuild my NV Surround system.

I'm torn between getting 2 GPUs, or 1 GPU and a new CPU/motherboard.

It's very difficult to find info on running 3x 1440p monitors, but I found a guy on YouTube who has done testing with 980 Tis, 1080s, and a Titan XP (single and SLI). I figure the 1080 Ti will be within 5-10% of the Titan X; here is his single-card vs. SLI bench video:

1 card does surprisingly well, and you can always run a single screen for demanding games.

 
Please include the 1070, as I feel many 1070 owners (myself included) are on the fence about the upgrade.

Also 3440x1440 would be sweeeeet
 
I tend to agree there. I think you could show a cutoff with a color gradient or other demarcation on the same graphed bar to show where the AA limit is. I'd actually like it to go further than that: in one of my replies here I said I'd rather see separate bars graphed for a 100 fps-Hz average target for high-Hz monitor use, with the actual graphics settings being the graphed variable bars/sliders, and also a 60 fps-Hz average target for people interested in that.
An example is below; not actual values, just thrown together as an example.

o6MkNGL.png
This would be perfect.
 
Been gaming at 4K for the last 2 years, albeit with varying degrees of happiness depending on the game.

While I am an AMD fanboy, if RX Vega doesn't deliver at 4K, then I'm going back to Nvidia after a 10-year break (the GeForce 6800 was my last Nvidia card).
 
EU pricing? We just have prices that always include VAT of around 20% (depending on the country), unlike US prices (they also get taxed, but it works very differently). I pre-ordered one from Nvidia for 824€. Without VAT that's... 687€. Not much more than $699 (660.5€).

Also, the improvements over a 980 Ti seem bigger in many cases than what you are saying. In this review https://www.techpowerup.com/reviews/NVIDIA/Titan_X_Pascal/ look closely at Witcher 3, RotR, and GTA V, for example.

At 1440p or 4K, the Titan X is frequently almost twice as fast as the stock 980 Ti. That is insanely good. In the same way, you can find instances where a 1080 is around 50% faster than a 980 Ti rather than just the average 20-30%. Architecture improvements, software, whatever. No matter how much you can overclock a 980 Ti, it's older tech ;)
In my own experience, the (Pascal) Titan X is around 70% faster than the (Maxwell) GTX Titan X in all the games I have played at 2560x1440. The performance gap between the 980 Ti and the 1080 Ti should be about the same.
 
Consider that NV is claiming this video card is faster than a TITAN X Pascal, and having used a TITAN X Pascal before, I can say it kicked the crap out of 1440p; only 4K challenged it.

All the more reason to test the claim. I personally just bumped up to 1440p a few months ago. I am more interested in 1440p @ 120 Hz and 144 Hz than I am in 4K @ 60 Hz.
 
I agree, and I'm really only interested in 120 Hz+; however, I'd like to see 4K fps averages too, since 144 Hz 4K monitors are due out on DP 1.4 later this year. I suppose I could just take a guess from Titan X Pascal benchmarks compared to other cards, though.

I'd say the Titan X Pascal at 2560x1440 is strong and a good fit for the most demanding games currently, in regard to using a high-Hz monitor. I wouldn't say it kicks the crap out of them at the highest in-game settings.
Witcher 3, Titan X Pascal, 2560x1440, HairWorks off
= 96.6 to 106 fps-Hz average, a (70 - 100ish - 130+) band

...which is pretty much the minimum *target average* for using a high-Hz monitor with variable-Hz/G-Sync capability in order to get any appreciable motion-clarity and motion-definition increases. It's really impressive for a single card to get that kind of frame rate at HQ AF + AA settings (minus HairWorks) in such a demanding game at 1440p. A GTX 1080 gets an 82 fps average, so dialed in (down) from max to very high it might hit the minimum target of a 100 fps-Hz average too.

At 4K it drops drastically to 57.6-64 fps, a (~30 - 60 - 90) band, at the highest settings (other than HairWorks). Of course, you could dial it in (down) considerably to get a higher frame-rate band.


Fallout 4, 2560x1440, Titan X Pascal

= 117.7 fps-Hz average (likely a 90 - 118 - 150 band)

..... a great target band for a 144 Hz monitor with variable-Hz/G-Sync capability: throughout that frame-rate band you get a blend of roughly 30% - 50% - 60% blur reduction and a 1.5:1 to 2:1 to 2.4:1 motion-definition increase compared to 60 fps-Hz (see the quick calc at the end of this post).


GTA V, 2560x1440, Titan X Pascal, dialed in (down) to Very High with MSAA off
= 120.7 fps-Hz average (~90 - 120 - 150+ band)

... at 4K (on Very High with MSAA off) it drops drastically to
= a 62.8 fps average (likely a 30 - 63 - 90 band), which gets essentially nothing out of a future DP 1.4 4K 144 Hz monitor's high refresh rate without dropping the graphics settings down a lot.

Far Cry Primal, Titan X Pascal: 2560x1440, 93.9 fps-Hz average ... 4K, 51.3 fps-Hz average

I can understand the enthusiasm for such a powerful card getting such good results. However, "kicking the crap out of it," in my view, would keep the frame-rate band in a game much higher, giving a lot more room for max settings (global shadow settings, view distances, animated objects viewable in the distance, textures, FX quality, the number of whatever is shown on screen at any given time, etc.) and perhaps even more headroom for mods or official/unofficial texture packs, and/or things like HairWorks and PhysX levels, making a game more demanding while letting you stay in the 100+ fps-Hz average realm, since you "kicked the crap out of the game" at ordinary graphics heights. I guess in fighting terms I'd call it "competitive" at 2560x1440 with the over-the-top settings disabled. I chose some of the most demanding games as examples, though, from reviews at some of the highest settings. ;)
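Where those blur-reduction and motion-definition figures come from, roughly: a back-of-the-envelope sketch that assumes a sample-and-hold display (perceived blur scales with frame persistence, 1000/fps ms) capped at 144 Hz:

```python
# Persistence (how long each frame sits on a sample-and-hold display) is
# 1000/fps ms; blur reduction and motion definition are taken relative
# to a 60 fps-Hz baseline. Display cap assumed to be 144 Hz.
BASELINE = 60.0
CAP = 144.0

def stats(fps: float) -> tuple[float, float]:
    shown = min(fps, CAP)                  # the monitor can't show more than 144
    blur_reduction = 1 - BASELINE / shown  # 1 - persistence ratio vs. 60 fps
    motion_definition = shown / BASELINE   # unique frames per baseline frame
    return blur_reduction * 100, motion_definition

for fps in (90, 118, 150):                 # the Fallout 4 band above
    br, md = stats(fps)
    print(f"{fps:>3} fps: ~{br:.0f}% blur reduction, {md:.1f}:1 motion definition")
# -> ~33% / 1.5:1, ~49% / 2.0:1, ~58% / 2.4:1  (the 30%-50%-60% blend above)
```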
 
I agree, and I'm really only interested in 120 Hz+; however, I'd like to see 4K fps averages too, since 144 Hz 4K monitors are due out on DP 1.4 later this year. I suppose I could just take a guess from Titan X Pascal benchmarks compared to other cards, though.

I'd say the Titan X Pascal at 2560x1440 is strong and a good fit for the most demanding games currently, in regard to using a high-Hz monitor. I wouldn't say it kicks the crap out of them at the highest in-game settings.


I can understand the enthusiasm for such a powerful card getting such good results. However, "kicking the crap out of it," in my view, would keep the frame-rate band in a game much higher, giving a lot more room for max settings (global shadow settings, view distances, animated objects viewable in the distance, textures, FX quality, the number of whatever is shown on screen at any given time, etc.) and perhaps even more headroom for mods or official/unofficial texture packs, and/or things like HairWorks and PhysX levels, making a game more demanding while letting you stay in the 100+ fps-Hz average realm, since you "kicked the crap out of the game" at ordinary graphics heights. I chose some of the most demanding games as examples, though. ;)

I'm not expecting the Ti to peg high-Hz 4K, and I would be shocked if it did. I kind of thought that would be a common opinion. I'm actually starting to get curious after seeing the enthusiasm. I had shelved the idea of single-card fast 4K at cranked settings as a next-gen-only thing and stopped considering a 4K monitor until then. After playing around with a Predator above 120 Hz, though, I'm hooked and not going back. Looking forward to the day that's higher res, though.
 
I'm not expecting the Ti to peg high-Hz 4K, and I would be shocked if it did. I kind of thought that would be a common opinion. I'm actually starting to get curious after seeing the enthusiasm. I had shelved the idea of single-card fast 4K at cranked settings as a next-gen-only thing and stopped considering a 4K monitor until then. After playing around with a Predator above 120 Hz, though, I'm hooked and not going back. Looking forward to the day that's higher res, though.

For myself, I can live with the "4K video card tradeoff." I figure that 4K adopters like myself are like the 2560x1600 (1600p) monitor adopters back in 2006-2007: it ended up being a while before the cards caught up to the resolution in terms of graphically demanding, top-tier gaming.
 
For myself, I can live with the "4K video card tradeoff." I figure that 4K adopters like myself are like the 2560x1600 (1600p) monitor adopters back in 2006-2007: it ended up being a while before the cards caught up to the resolution in terms of graphically demanding, top-tier gaming.

Yeah, and more power to ya. I wish 1600p monitors had held more traction; I just prefer that aspect ratio, and it's probably my biggest disappointment with 1440p. But the fast monitors are great for me, and I'm sorry I waited so long to switch. I don't think I'm ever going back to 60 Hz for gaming, though.
 
Yeah, and more power to ya. I wish 1600p monitors had held more traction; I just prefer that aspect ratio, and it's probably my biggest disappointment with 1440p. But the fast monitors are great for me, and I'm sorry I waited so long to switch. I don't think I'm ever going back to 60 Hz for gaming, though.

I hear ya. I gave my Dell 3007WFP-HC to my cousin 2 years ago, and it's still running just fine. The thing is built like a tank. Even when I compare it to my 42" 4K desktop TV, it's still a good size, especially with that aspect ratio.
Now, 4K @ 120/144 Hz (G-Sync/FreeSync) with a VA or IPS panel? That's the next logical upgrade for me, I guess.
 
I hear ya. I gave my Dell 3007WFP-HC to my cousin 2 years ago, and it's still running just fine. The thing is built like a tank. Even when I compare it to my 42" 4K desktop TV, it's still a good size, especially with that aspect ratio.
Now, 4K @ 120/144 Hz (G-Sync/FreeSync) with a VA or IPS panel? That's the next logical upgrade for me, I guess.

Amen. Make it 34-37 inches and I'm in.
 
This thread is full of win, though generally not in a good way; more like a derailed train. Would lurk again.

After playing at 144 Hz, going back to even 120 Hz gives me some discomfort. I'm curious how 200 Hz will look; it's quite tempting, lol. The smoothness is just out of this world.

The biological explanation for being "able" to tell the difference between 7 ms and 8 ms* is called Asperger's, and it has nothing to do with your optic nerve.

*Forgot my audience: 6.944... ms and 8.333... ms.
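(The conversion, for anyone playing along at home; frame time is just the reciprocal of the refresh rate:)

```python
# Frame time in milliseconds for a given refresh rate.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for hz in (120, 144, 200):
    print(f"{hz} Hz -> {frame_time_ms(hz):.3f} ms")
# 120 Hz -> 8.333 ms, 144 Hz -> 6.944 ms, 200 Hz -> 5.000 ms
```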
 
Pretty meaningless unless you feed it new, unique frames of action anyway.
We are just getting to the point (since the Titan X Pascal) where, in the most demanding games, a top-end single GPU can do a 100 fps-Hz average or so at 2560x1440, sometimes requiring some features to be turned off in the game settings at that. And that average sits in a sort of vibrating blend of +/- 30 fps from second to second, so half of the mix is 70 to 100 or so, depending on the game (see the example graph earlier in the thread). You don't get really appreciable blur-reduction and motion-definition increases until you are at around at least a 100 fps-Hz (fps and Hz) average, so much below that you get no appreciable benefit out of a high-Hz monitor, even if it were 300 Hz. Some old Source games, other simpler titles, and some isometric games can get well over a 200 fps average, though, of course.
 