Leaked GeForce GTX 680 2GB Benchmarks

Status
Not open for further replies.
Simple reason:
The performance of the GTX 680 seems unworthy of a high-end part that deserves the name "680". 30-40% faster than the GTX 580 (apparently) is maybe a tad strong for a 660 Ti card, but not enough for a 680, at least if you follow the past 6 years, where the high-end part provided a performance increase of at least 60%.
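For a sense of scale, here is a rough sketch of how per-generation gains compound. The 60% and 30-40% figures echo the post above; everything else is an illustrative round number, not measured data:

```python
# Compare how per-generation speedups compound over multiple generations.
# The 60% and 35% rates mirror the claims in the post; the 2-generation
# horizon is an arbitrary illustration.

def compound(gain_per_gen: float, generations: int) -> float:
    """Relative performance after `generations` jumps of `gain_per_gen` each."""
    return (1 + gain_per_gen) ** generations

classic = compound(0.60, 2)   # "classic" 60%/gen pace over two generations
rumored = compound(0.35, 2)   # midpoint of the rumored 30-40% pace

print(f"60%/gen over 2 gens: {classic:.2f}x")   # 2.56x
print(f"35%/gen over 2 gens: {rumored:.2f}x")   # ~1.82x
```

The gap widens every generation, which is why a single "weak" flagship jump feels so significant to people tracking the long-run trend.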

Um, going from the 8800 GTX/GT/GTS series to the 9800 series was about a 15-30% increase. I don't see how that is 'proof' that this is a midrange card. Anyone saying it is a midrange card is only fooling themselves. If it is priced like a high-end card, then it is a high-end card.
 
IMO, I don't think you're going to see much more headroom in overclocking, because it's already overclocked by some 275MHz in those benchmarks with turbo boost, which is why the total system power load is right there with the others. With Nvidia cards it's tied more to the shader clock than the GPU clock; once it hits the shader clock limit, that's all she wrote.

It appears that the normal clock speed is 1006 MHz, with a turbo of 1056 MHz; the 7xx speed that was floating around was a reporting error from GPU-z. If you look at the newer GPU-z shots, they show a default clock of 1006. So that shouldn't impact overclocking performance (although the turbo mode itself might).
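Taking the 1006 MHz base and 1056 MHz turbo from those GPU-z shots at face value, the boost works out to about 5%, which is why it shouldn't dominate the overclocking picture:

```python
# Boost headroom implied by the leaked GPU-z readings
# (1006 MHz base, 1056 MHz turbo, as quoted in the post above).

base_mhz = 1006
turbo_mhz = 1056

boost_pct = (turbo_mhz - base_mhz) / base_mhz * 100
print(f"Turbo adds {turbo_mhz - base_mhz} MHz, i.e. {boost_pct:.1f}% over base")
```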

Um, going from the 8800 GTX/GT/GTS series to the 9800 series was about a 15-30% increase. I don't see how that is 'proof' that this is a midrange card. Anyone saying it is a midrange card is only fooling themselves. If it is priced like a high-end card, then it is a high-end card.

People are calling it the mid-range card because it carries the mid-range name (GK104) and it has mid-range specs (256-bit memory bus, 2GB VRAM, 6+6 power), and it was initially slated (it appears) to be the mid-range part with the GK110 chip filling the high-end. Then Nvidia decided to make this the high-end part, either because the GK110 isn't close to being ready, or because they realized they could compete with the 7970 with this cheaper-to-produce chip. In any case, as you pointed out, it is now the de facto high-end part, no matter what it was originally intended to be (and my guess is that Nvidia execs are high-fiving each other that they can charge $500 for this card).
 
It is a high-end part. Nvidia fans want to believe otherwise.....

And nothing you say will change their minds.....
 
You have no clue whether performance will get better; wait for the real reviews. They will be using a first-release driver, but maybe subsequent driver revisions will make it pull ahead even further? The mind boggles.

Well, we have seen a real review, haven't we? Although quite biased, the THG leak is legit. I don't believe in magic drivers; if the 680 can gain 10% from drivers, so can Tahiti. And even if the lead over the 580 grows by 10%, it still isn't a jump like 8800->280 or 280->480.

Um, going from the 8800 GTX/GT/GTS series to the 9800 series was about a 15-30% increase. I don't see how that is 'proof' that this is a midrange card. Anyone saying it is a midrange card is only fooling themselves. If it is priced like a high-end card, then it is a high-end card.

Yeah, the 9800 wasn't a worthy high-end card either, so that's not a valid reference point.
Look at the TDP, the memory bus/size, the die size, and the actual performance: this IS a beefed-up midrange card. Price is arbitrary; it largely reflects supply and demand and the competitive landscape, but it in no way has to be tied directly to the capabilities of the product itself. +30-40% over the last top dog is just too little.

Btw, I'm a bit biased...towards Nvidia ;)
 
So based solely on those leaked benchmarks, which may or may not be real, I'll be skipping this generation and sticking with my GTX 580.
 
This is certainly an interesting development.

It seems both AMD and NVIDIA have had manufacturing problems at TSMC at this node (as well as at previous nodes, e.g. the late 480). I wonder why they still stick with TSMC; is it cheaper, or are there simply no alternative manufacturers?

So NVIDIA was planning on GK100/110/112 being the 680. The initial manufacturing runs were a load of crap, so NVIDIA went back to the drawing board to fix them and decided to push back the release. Meanwhile, the GK104 seems to be doing OK thanks to its smaller die size.

At the same time, AMD was also having problems but decided to push ahead, hoping yields would improve, and also pushed back their release, though without a new design.

Eventually AMD got enough cards manufactured that they decided to release the 7970 before NVIDIA stole the limelight. They took a gamble: leave the GPU underclocked for the moment and put a large price tag on it.

NVIDIA goes 'oh crap, we are getting destroyed and AMD is making a decent profit.' They have a brainstorming session and someone says, 'let's OC the crap out of our GK104 (midrange) 660 chip and see if it beats the 7970, since we have lots of GK104 lying around.' Well, it turned out it did (at 1080p), so they decided to rename the 660 to 680 and release it quickly.

The question is, what will happen now?

Will AMD respond with a price drop, or release an improved 7970 (a 7980 or something) with much higher clocks to push it out of range of the 680 at 1080p, and stop selling the 7970?

Will NVIDIA eventually release the GK110/112 chip for the desktop gaming market (instead of reserving it for Tesla)? If they do, I am sure it will completely wipe the floor with the 7970/7980, considering how close the GK104 is to the 7970. I only hope they throw in lots of VRAM with the GK110/112.

One thing is for sure: I need plenty of popcorn this year, as the GPU battle is going to get very exciting over the next few months.

(And let's not forget that those poor console gamers are still stuck with 2004 GPUs for at least 1-2 more years, going by rumors.)
 
So based solely on those leaked benchmarks, which may or may not be real, I'll be skipping this generation and sticking with my GTX 580.

Uhhh...ok

It is a high-end part. Nvidia fans want to believe otherwise.....

And nothing you say will change their minds.....

Uhhh...ok


This is certainly an interesting development.

It seems both AMD and NVIDIA have had manufacturing problems at TSMC at this node (as well as at previous nodes, e.g. the late 480). I wonder why they still stick with TSMC; is it cheaper, or are there simply no alternative manufacturers?

So NVIDIA was planning on GK100/110/112 being the 680. The initial manufacturing runs were a load of crap, so NVIDIA went back to the drawing board to fix them and decided to push back the release. Meanwhile, the GK104 seems to be doing OK thanks to its smaller die size.

At the same time, AMD was also having problems but decided to push ahead, hoping yields would improve, and also pushed back their release, though without a new design.

Eventually AMD got enough cards manufactured that they decided to release the 7970 before NVIDIA stole the limelight. They took a gamble: leave the GPU underclocked for the moment and put a large price tag on it.

NVIDIA goes 'oh crap, we are getting destroyed and AMD is making a decent profit.' They have a brainstorming session and someone says, 'let's OC the crap out of our GK104 (midrange) 660 chip and see if it beats the 7970, since we have lots of GK104 lying around.' Well, it turned out it did (at 1080p), so they decided to rename the 660 to 680 and release it quickly.

The question is, what will happen now?

This person understands what happened over the last few months and why we are at this point. I commend you sir.

If AMD had been able to stabilize the card at 1-1.1GHz with low power draw as planned, this GK104 "Faux GTX 680" wouldn't be seeing the market right now; the GK110 would have been planned for release instead, but much later.
AMD left the door open for Nvidia to release something. Against the 7970 as originally intended, nV would still be at the drawing board.
 
Why does everyone say 1080p? Are there that many interlaced monitors around that we need the 'p' in there?
 
Why does everyone say 1080p? Are there that many interlaced monitors around that we need the 'p' in there?

1080p denotes the full resolution without typing it out, and it could also refer to those who use their TV as their monitor; I did this for years before I went back to a conventional monitor.
 
I'm sorry, but you don't pay that much for a midrange card. It's not midrange. I'm not impressed; glad I didn't wait for these. Loving my 3GB of VRAM, and I'm sure my 7970 is going to outperform this card on 3 screens anyway. This makes me happy.
 
Well, we have seen a real review, haven't we? Although quite biased, the THG leak is legit. I don't believe in magic drivers; if the 680 can gain 10% from drivers, so can Tahiti. And even if the lead over the 580 grows by 10%, it still isn't a jump like 8800->280 or 280->480.



Yeah, the 9800 wasn't a worthy high-end card either, so that's not a valid reference point.
Look at the TDP, the memory bus/size, the die size, and the actual performance: this IS a beefed-up midrange card. Price is arbitrary; it largely reflects supply and demand and the competitive landscape, but it in no way has to be tied directly to the capabilities of the product itself. +30-40% over the last top dog is just too little.

Btw, I'm a bit biased...towards Nvidia ;)

At least you admit it :) . But seriously, not to pick on you/world exclusive, but if the 680 is only midrange regardless of price, because some people don't consider it a strong enough performer and/or because there were reports of something faster in the works that never materialized for whatever reason, then I also declare the 7970 to be a midrange card, because it didn't give as big a performance leap over the 580 as I would have liked, and supposedly AMD was going to make a model with 2300 shaders but scrapped it for reasons we don't know. So I guess Nvidia's incredibly expensive midrange card and AMD's incredibly expensive midrange card are running neck and neck. But man, when those real high-end cards come out at $900, they will blow them all away :)

(Please note: from everything I have seen, both the 7970 and 680 are amazing cards. I'm just trying to point out that calling one high-end and the other midrange when they are both priced about the same, and declaring the "midrange" card the "winner" just because it's "midrange", is rather silly. Seriously, guys, let's put all the fanboy crap aside: these are both great GPUs with similar performance at a similar price, and if either company had some amazing ace up their sleeve that was 3 bajillion times better and could be had in exchange for a double cheeseburger and small fry, they would be releasing it.)
 
Fair enough, sir. In that case I agree with your point and will now /crossfingers for a price war to drop the price on both cards down closer to the $400 range. Of course, the real winners are probably the people scooping up GTX 580s for around $300 and 6970s for $200.
 
Why does everyone say 1080p? Are there that many interlaced monitors around that we need the 'p' in there?

As a matter of fact, my Samsung 1920x1080 screen can operate in both 1080p and 1080i. It shows up as 60Hz for 1080p and 30Hz for 1080i in the resolution control panel.

(This screen was designed to take HDMI, and another model has a built-in DVB-T tuner.)

EDIT: I should mention that using the desktop or gaming in 1080i is a very weird experience.
 
So basically....

Dual-GPU Cards (590 & 6990) > GTX680 > 7970 > GTX580

I was kinda hoping Kepler would beat the 590....
 
It seems both AMD and NVIDIA have had problems with manufacturing at TSMC at this node (as well as at previous nodes e.g late 480). I wonder why they still stick with TSMC, are they cheaper, or is there simply no alternative manufacturers?

Global Foundries is really the only other option, and for whatever reason, neither NV nor AMD sees any benefit in switching to them (and don't even try to say "well, why can't they just use both at the same time!"; processes don't transfer from one foundry to the next).
 
So basically....

Dual-GPU Cards (590 & 6990) > GTX680 > 7970 > GTX580

I was kinda hoping Kepler would beat the 590....

More like Dual-GPU Cards (590 & 6990) > 7970 > GTX680 > GTX580, to be honest, at high res or Eyefinity, where you would actually want to use SLI.
 
Global Foundries is really the only other option, and for whatever reason, neither NV nor AMD sees any benefit in switching to them (and don't even try to say "well, why can't they just use both at the same time!"; processes don't transfer from one foundry to the next).

Hmm, that sucks. Well, I'd hate to think what Intel could pull off using their next-gen fabs if they were a serious GPU competitor. (I still have hope that Intel will purchase NVIDIA one day.)
 
(Please note: from everything I have seen, both the 7970 and 680 are amazing cards. I'm just trying to point out that calling one high-end and the other midrange when they are both priced about the same, and declaring the "midrange" card the "winner" just because it's "midrange", is rather silly. Seriously, guys, let's put all the fanboy crap aside: these are both great GPUs with similar performance at a similar price, and if either company had some amazing ace up their sleeve that was 3 bajillion times better and could be had in exchange for a double cheeseburger and small fry, they would be releasing it.)

Where in either of our comments do we say anything solely supporting one company? We're speaking critically about both companies. Nothing fanboy about that, but that seems to be the knee-jerk insult around here.
By totally missing the point of how we got these cards and just looking on the surface at price and performance, you end up being misled. The 680 starts to fall behind above 1600p. Midrange.
The real story always comes out months later, after everyone has spent their money. When they release the GK110, they'll admit the GTX 680 wasn't their high-end card, and people will buy again.
 
Only 2GB of video RAM? What the fuck, Nvidia?

Where is a 3GB or more version?
 
Only 2GB of video RAM? What the fuck, Nvidia?

Where is a 3GB or more version?

GTX 680: soft Kepler.

The GK110, which will come in a few more months, will be the "real" Kepler and will probably have 3GB-4GB.

The GK110 is only a rumor, I think, although I am really hopeful a "real" big Kepler will come soon.
 
Too long to wait.

I'm not making the change; I'm buying another ATI 7970.

I need more video RAM for Maya and 3ds Max.
 
Too long to wait.

I'm not making the change; I'm buying another ATI 7970.

I need more video RAM for Maya and 3ds Max.

If rendering is your primary requirement, why not just get a workstation card?
 
I get 3313 marks in 3DMark 11, and that's on my 6970+6950 CrossFire setup that I got used for an insanely good price.

Gaming at 1920x1080, it looks like I'm set for at least another year or so :) The next-gen hardware offers me nothing but a lighter wallet. Sure, you could make the power-requirement argument, but realistically the only way that would save me money is if I gamed 24/7 for a few years straight to offset the initial $500-550 investment.

As a neutral party in this red vs green battle, it sucks to say, but Nvidia came way late to the party and didn't bring all the booze it promised. Sure it's faster, but 10-15% is nothing when you consider what the 7000-series Radeon chips have been doing after a bit of overclocking. I'll keep an eye on what really comes down the pipe in the coming weeks... but like many others I'm not expecting much from the green team.
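The break-even arithmetic behind the power-requirement point above can be sketched as follows. The wattage delta, electricity rate, and price figure are all made-up placeholder values, not measurements:

```python
# Hours of gaming needed before lower power draw pays back a card's purchase price.
# All inputs are hypothetical placeholders chosen for illustration only.

def break_even_hours(card_price_usd: float, watts_saved: float, usd_per_kwh: float) -> float:
    """Gaming hours until electricity savings equal the card's price."""
    kwh_saved_per_hour = watts_saved / 1000.0
    return card_price_usd / (kwh_saved_per_hour * usd_per_kwh)

# E.g. a $525 card that draws 50 W less, at $0.12/kWh:
hours = break_even_hours(card_price_usd=525, watts_saved=50, usd_per_kwh=0.12)
years_24_7 = hours / (24 * 365)
print(f"Break-even after {hours:,.0f} hours (~{years_24_7:.1f} years of 24/7 gaming)")
```

With numbers in this ballpark, the payback horizon runs to years of continuous gaming, which is the poster's point.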
 
I think nothing can be said until we get an OC vs OC review, since Nvidia would obviously have set their default clock with the 7970 in mind (we cannot blame them for that; it is common sense and one advantage of releasing the card after AMD).
So to sum it up:
Stock vs Stock............USELESS
Clock vs Clock............USELESS
Max stable OC vs Max stable OC (at stock voltages)..........the only one that matters.
 
A 2GB high-end card in this day and age? Slower than the 7970 in Metro at 2560? Barely faster at 2560 in other games while costing more? 4 months late to the game? And people are EXCITED about this?

I call this a failure.

AMD will poop all over this right away by making the 7970 cheaper and a better deal for the money, then release the 8970 1-2 months later and take the crown again.
 
I'll be getting one of these ASAP if they launch at $500. Also, don't assume multi-monitor people are sticking with AMD. I left them last gen because of the stupid screen tearing, and I'm staying away because they haven't bothered to fix it. I'll give up a few frames here and there any day of the week so that one of my monitors doesn't look like crap.

That's exactly why I left AMD. I couldn't stand one of the screens tearing (the only solution was to use three DisplayPort monitors; no thanks). Nvidia didn't have this problem, though I did need to buy two GTX 580s. I haven't bothered with AMD/Eyefinity since.
 
I think nothing can be said until we get an OC vs OC review, since Nvidia would obviously have set their default clock with the 7970 in mind (we cannot blame them for that; it is common sense and one advantage of releasing the card after AMD).
So to sum it up:
Stock vs Stock............USELESS
Clock vs Clock............USELESS
Max stable OC vs Max stable OC (at stock voltages)..........the only one that matters.

This man speaks the truth
 
It does make you wonder if Nvidia pushed their card's stock clocks higher to beat the 7970. If so, that would mean less overclocking headroom. I am curious to see how the 680 overclocks.
 
A 2GB high-end card in this day and age? Slower than the 7970 in Metro at 2560? Barely faster at 2560 in other games while costing more? 4 months late to the game? And people are EXCITED about this?

I call this a failure.

AMD will poop all over this right away by making the 7970 cheaper and a better deal for the money, then release the 8970 1-2 months later and take the crown again.

Math lessons for everyone.....
9th Jan to 22nd March = 4 months
$499 is more than $549

But I agree with you. What were Nvidia fans getting excited about? So what if it's smaller, faster, cheaper, requires less power, and beats the 7970 in every game other than Metro. OMG, it gets 3 fewer frames in Metro. EPIC FAIL, I say.
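The sarcastic date math above is easy to check. Assuming both dates fall in 2012 (the 7970's January 9 launch and the rumored March 22 date for the 680), the gap is well under four months:

```python
# Days between the 7970 launch and the rumored GTX 680 launch.
# Dates taken from the post above; both assumed to be in 2012.
from datetime import date

hd7970_launch = date(2012, 1, 9)
gtx680_launch = date(2012, 3, 22)

gap_days = (gtx680_launch - hd7970_launch).days
print(f"{gap_days} days, roughly {gap_days / 30.4:.1f} months")  # 73 days, ~2.4 months
```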
 
I think nothing can be said until we get an OC vs OC review, since Nvidia would obviously have set their default clock with the 7970 in mind (we cannot blame them for that; it is common sense and one advantage of releasing the card after AMD).
So to sum it up:
Stock vs Stock............USELESS
Clock vs Clock............USELESS
Max stable OC vs Max stable OC (at stock voltages)..........the only one that matters.

Except every card is different, so there is no way to compare max stable overclocks. Is the max stable overclock the minimum that everyone gets? What if one card only gets 1150 but the others get 1300? How do you compare then?

The real comparison is stock vs stock, anything above that is a bonus.
 
Except every card is different, so there is no way to compare max stable overclocks. Is the max stable overclock the minimum that everyone gets? What if one card only gets 1150 but the others get 1300? How do you compare then?

The real comparison is stock vs stock, anything above that is a bonus.

While I do agree that a max stable overclock is subjective, I disagree that stock vs stock is the real comparison. The real comparison is the massively obtainable overclock. Nobody leaves their card at stock anymore unless they are already getting more performance than they need, and in that case they probably don't care anyway.
 
While I do agree that a max stable overclock is subjective, I disagree that stock vs stock is the real comparison. The real comparison is the massively obtainable overclock. Nobody leaves their card at stock anymore unless they are already getting more performance than they need, and in that case they probably don't care anyway.

Some 6950s unlocked to 6970s - does that mean the only real comparison is by using a card that is unlocked? There's just no fair way to draw conclusions based on something that is so hit or miss.
 
Except every card is different, so there is no way to compare max stable overclocks. Is the max stable overclock the minimum that everyone gets? What if one card only gets 1150 but the others get 1300? How do you compare then?

The real comparison is stock vs stock, anything above that is a bonus.

OK, I agree; I was actually thinking of that while posting. But how many people paying $500 would actually run these cards at stock? They always OC; some wait a few months longer than others, but they do eventually. Stock clocks are always too conservative.
How about we settle on
avg stable OC rather than max stable OC?
The avg stable OC gives a rough idea of the min stable OC a consumer can expect. E.g. anyone buying a 7970 can expect at least 1150; anything above THAT is a bonus.
 
Tom's asked overclockers.com to remove the images. As if he can round up all the cats that are now out of the bag.
 
Tom's asked overclockers.com to remove the images. As if he can round up all the cats that are now out of the bag.

LOL. But you have to feel sad for Tom; he's probably in a lot of trouble due to the 'accidental' breaking of the NDA.
I really don't understand how they were leaked 'accidentally'. They uploaded them to their own website, right there for the world to see; it's not like they were hacked or something.
 
Some 6950s unlocked to 6970s - does that mean the only real comparison is by using a card that is unlocked? There's just no fair way to draw conclusions based on something that is so hit or miss.

How so? I said massively obtainable overclock, which means it would not be hit or miss. It may not be perfectly accurate, but it is more accurate than stock vs stock, which doesn't really give the user an idea of what the card is truly capable of. For example, a very reasonable overclock for the 7970 would be 1100; most if not all users can hit at or near that with no issues. You could teach a monkey to overclock a 7970 that high; all it takes is dragging a few sliders. Why shouldn't a card be evaluated at performance levels that are so easily obtainable?
 