GTX 480 Unigine Video (and Benchmark)

You are entitled to your opinion. And we listened... for a while. You've just become annoying now. You get off on 3D, we get it. If you'd like to have a circle jerk about 3D Vision, please go start a thread about it and discuss with yourself to your heart's content.
So it is fine to discuss ATI cards and features on a Fermi thread, but not 3D Vision, which IS a Fermi feature?
And who are you to tell me what I can post or not? If you dislike my comments then ignore me; you are not forced to read what I type, you know. ;)
 
If it's better, then it's better.
SLI or Crossfire IS needed at high resolutions for a select few games, if you want to use high/max settings. I agree, though, that at this point it's pretty pointless estimating performance before anything other than PR material is released.

^This
 
Are ATI fanboys forgetting that the 5870 came AFTER the GTX 295 and was beaten by it in many games? Yeah, dual vs single, whatever, but the fact still remains.

Are NV fanboys forgetting that the GTX 480 came AFTER the HD 5970 and was beaten by it in many games? Yeah, dual vs single, whatever, but the fact still remains.

:rolleyes:
 
So it is fine to discuss ATI cards and features on a Fermi thread, but not 3D Vision, which IS a Fermi feature?
And who are you to tell me what I can post or not? If you dislike my comments then ignore me; you are not forced to read what I type, you know. ;)

Well, actually, 3D isn't specific to Fermi. Other Nvidia cards have it, and ATI will be supporting it through third-party vendors.

But no, people are just getting sick and tired of you constantly going "3D is all the rage! 3D is super important! 3D 3D 3D! OMG 3D!!!!!". The 3D technology isn't the problem; YOU repeating the same thing over and over and over again is the problem.

And for what it's worth, 3D has tried to break into the market for something like 20 *years* now. Nothing has changed; 3D still isn't going anywhere. Companies were showing it off at CES because Avatar was new and that got the public interested in 3D again, an interest that is already quickly disappearing. The simple fact of the matter is that people just don't like the stupid glasses. Be it shutter, polarized, or the red/blue thing, the public just doesn't like it. And that really isn't going to change.
 
Well, actually, 3D isn't specific to Fermi. Other Nvidia cards have it, and ATI will be supporting it through third-party vendors.

But no, people are just getting sick and tired of you constantly going "3D is all the rage! 3D is super important! 3D 3D 3D! OMG 3D!!!!!". The 3D technology isn't the problem; YOU repeating the same thing over and over and over again is the problem.

And for what it's worth, 3D has tried to break into the market for something like 20 *years* now. Nothing has changed; 3D still isn't going anywhere. Companies were showing it off at CES because Avatar was new and that got the public interested in 3D again, an interest that is already quickly disappearing. The simple fact of the matter is that people just don't like the stupid glasses. Be it shutter, polarized, or the red/blue thing, the public just doesn't like it. And that really isn't going to change.

FWIW, stereoscopy has been around a lot longer than 20 years. I rather enjoy it, though, and will more than gladly wear flickery shutter glasses on top of my regular glasses to get the effect, though I know I'm not in the majority there. I hope it catches on, but I'm not holding my breath, nor am I willing to invest in new hardware until 3D has become widespread.
 
So it is fine to discuss ATI cards and features on a Fermi thread, but not 3D Vision, which IS a Fermi feature?
And who are you to tell me what I can post or not? If you dislike my comments then ignore me; you are not forced to read what I type, you know. ;)
3D Vision is not a Fermi-exclusive feature. Should I go off and write 10,000 posts about Fermi having support for DX9? :rolleyes:

Your reading comprehension is lacking. I never once told you what you could not post; I recommended a course of action if you wanted to have a circle jerk about an existing feature.

You are correct, I don't have to read your crap. Welcome to my ignore list.
 
FWIW, stereoscopy has been around a lot longer than 20 years. I rather enjoy it, though, and will more than gladly wear flickery shutter glasses on top of my regular glasses to get the effect, though I know I'm not in the majority there. I hope it catches on, but I'm not holding my breath, nor am I willing to invest in new hardware until 3D has become widespread.

My main problem is that the only monitors which support 3D Vision are 22", blegh, but I might feel better if it's gonna work with three monitors. The idea of stereoscopic surround video sounds pretty awesome. Not to be mean to Nvidia, but I'm kinda hoping I can just buy another GTX 275 on the cheap and SLI that rather than upgrade.
 
My main problem is that the only monitors which support 3D Vision are 22", blegh, but I might feel better if it's gonna work with three monitors. The idea of stereoscopic surround video sounds pretty awesome. Not to be mean to Nvidia, but I'm kinda hoping I can just buy another GTX 275 on the cheap and SLI that rather than upgrade.

Since I got my new Dell 27", my other 22" Dell screen just looks old and busted next to the new hotness.

On a side note, I saw Avatar in 3D after seeing it in 2D, and the 3D is pretty epic, except for one thing: for the last 30-45 minutes I had a pretty solid headache going, and it lasted all night until I went to bed. I could not imagine doing a massive gaming session in 3D without my head exploding.
 
Who gives a deuce about 3D Vision? These cherry-picked, rigged benchmarks in 1-2 minute tests should be embarrassing to Nvidia. I'd say the majority of us would rather play games than watch benchmarks all day.
 
I can certainly see why people are complaining about the canned-benchmark nature of this. I would definitely like to see something else, such as AvP, Dirt 2, or even something older like Crysis. That said, it is still a decent idea of what's going on, Nvidia spin or no. At least it's not as bad as the out-of-context benchmark from earlier (which was also posted in this thread somewhere) that was solely tessellation. This one seems slightly more modest.

As for 3D Vision, my view is that I personally find it rather awesome. Even the red/cyan glasses can give an impressive effect. If I were to go either Eyefinity or 3D Vision, I would choose the latter. That said, it has two problems that will cause it to always have a low adoption rate.

For one, it's too expensive. Secondly, some people can get severe eyestrain or headaches. Heck, some people might not even be able to see it at all; for instance, I have a friend who's blind in one eye and can't get a stereo effect at all.
 
http://tech.icrontic.com/news/nvidias-first-official-radeon-5870-vs-gtx-480-benchmark/

Excerpt:

"It must also be noted that the Fermi architecture behind the GTX 480 is particularly suited to tessellation work, which the Heaven benchmark uses in abundance. This is because Fermi is capable of roping any number of its 16 stream multiprocessors (SMs) into tessellation work, which can lead to dramatic performance advantages when tessellation is the focus. This design differs from the Radeon HD 5800 series which has a dedicated tessellator unit, or a fixed-size tessellation engine that offers a specific and consistent tessellation performance.

The 'issue' with NVIDIA’s design, however, is that those SMs are likely to be busy with other tasks when it comes time to render a game. This means that tessellation-focused benchmarks like Heaven can return sky-high figures for Fermi-based cards, as the GPU is not spending the majority of its time doing the mundane rendering work it would do in a game."

and

"So, if you’ve ever wondered why NVIDIA has been preternaturally obsessed with the Heaven benchmark, the short answer is simple: it uniquely leverages the Fermi architecture in a way games would not to return benchmark results that shame ATI scores."

Hmm. Unless Fermi kills Cypress in minimum framerates in real games, and not just in synthetic benchmarks, I'm gonna keep my 5850 for a long time, although I do tip my hat to NVIDIA for its more elegant architecture relative to ATI's relatively inelegant fixed tessellator.

A lot of Fermi's problems have to do with yields and with not accounting for TSMC's flawed 40nm process, leading to all sorts of power/heat/price problems. I suspect NVIDIA may be competitive again with a Fermi refresh at 28nm, especially if ATI messes up with Northern Islands. And competition is good for consumers!
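
To make the article's point concrete, here's a toy throughput model I threw together. All the rates and workload numbers are completely made up; this only illustrates the shape of the argument, not real hardware behavior.

# Toy model of the tessellation/shading trade-off described above.
# All rates and workloads are invented for illustration only.

def fixed_tessellator_time(tess_work, shade_work, tess_rate, shade_rate):
    # ATI-style: a dedicated tessellator runs alongside the shader core,
    # so frame time is set by whichever unit is the bottleneck.
    return max(tess_work / tess_rate, shade_work / shade_rate)

def pooled_sm_time(tess_work, shade_work, sm_rate, num_sms=16):
    # Fermi-style, as the article describes it: the 16 SMs are one shared
    # pool, so tessellation and shading compete for the same throughput.
    return (tess_work + shade_work) / (num_sms * sm_rate)

# Tessellation-heavy, Heaven-like load: the pooled design runs away with it.
print(fixed_tessellator_time(100, 5, tess_rate=10, shade_rate=50))   # 10.0
print(pooled_sm_time(100, 5, sm_rate=2))                             # ~3.3

# Shading-heavy, game-like load: the pool is busy rendering anyway.
print(fixed_tessellator_time(10, 150, tess_rate=10, shade_rate=50))  # 3.0
print(pooled_sm_time(10, 150, sm_rate=2))                            # 5.0

Obviously real GPUs don't reduce to such clean numbers, but it shows why a tessellation-only benchmark can flatter one design while a real game narrows or reverses the gap.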
 
I have a question: why was anisotropy set at 1 instead of 4, like every other benchmark I've seen done for the 5870?
 
I have a question: why was anisotropy set at 1 instead of 4, like every other benchmark I've seen done for the 5870?

Probably to further exaggerate Fermi's performance. Turn up AA and AF in DX11 at 1920x1200 and the GTX 470 loses handily to the 5870 and is on par with the 5850. Not sure about the GTX 480, since I haven't seen leaked benches of that anywhere credible.
 
I have a question: why was anisotropy set at 1 instead of 4, like every other benchmark I've seen done for the 5870?

Like said above, it's (the GTX 480) there to show off its ability to crunch numbers. I hate to say it, but Fermi just is not a gaming card; it's a GPGPU card. This seems to confirm what others have been saying for a while. It's the only reason I can think of to do this.
 
Correct me if I'm wrong, but that demo wasn't run with AA turned up, probably because of this:

http://www.hardforum.com/showpost.php?p=1035410551&postcount=83

Seems as if a GTX 470 uses up some of its computing power for tessellation, which impacts its AA, so that at high levels of AA it starts losing to 5870s, at least on averages. As for minimum fps, which I think are more important, it's unclear how much of that edge Fermi retains when AA is turned on. The GTX 480 may be fast enough to edge out the 5870, but at what price?

You're wrong :p Trace the source of the rumour and you'll see why. I see you've added your own bit to it now too, though. Running tessellation on the shaders impacts AA... sweet. Soon we'll be hearing that Fermi has no texture units and runs that on the shader core too.

Fermi has 16 tessellator units, one per SM. And they are not part of the shader core. Now we have moronic websites propagating the same rumour. Don't people read any more?
 
People are always pissed about scalpers. But they are mad at scalpers, not at the company selling the cards. 4870X2s were scalped for a while, but people kept praising ATI. Nvidia wouldn't care as long as it can say it has a faster, cheaper card, even if it doesn't have any on the shelves.
That sounds like "To reward our loyal supporters who cannot afford high prices, we will price our cards low (and lose more money in the process) so that our loyal supporters will still be unable to afford those cards." Remember, market prices are set by supply and demand, not MSRP.

Taking $200 off (as an example) 5000 cards adds up to $1,000,000 in lost profits (or in additional losses). That is a hell of a price to pay for a silly talking point that most people will see through in 5 seconds.
 
although I do tip my hat to NVIDIA for its more elegant architecture relative to ATI's relatively inelegant fixed tessellator.

I don't think it's necessarily more elegant, just different. ATI is using dedicated hardware, and it sounds like Nvidia is using a mix of dedicated hardware and pseudo-software emulation. Both have their pros and cons; I don't think either one is necessarily better. One of the two implementations may very well end up being better, but from a higher-level design perspective I don't think either one has a clear advantage.

Fermi has 16 tessellator units, one per SM. And they are not part of the shader core. Now we have moronic websites propagating the same rumour. Don't people read any more?

Source? I haven't read much on Fermi's architecture, but from the marketing diagrams the tessellator units were always grouped with other things and not pictured as standalone.
 
You're wrong :p Trace the source of the rumour and you'll see why. I see you've added your own bit to it now too, though. Running tessellation on the shaders impacts AA... sweet. Soon we'll be hearing that Fermi has no texture units and runs that on the shader core too.

Fermi has 16 tessellator units, one per SM. And they are not part of the shader core. Now we have moronic websites propagating the same rumour. Don't people read any more?

I am only guessing as to why. Maybe it's a driver issue.

Do you have a better explanation as to why the GTX 470 goes from beating the 5870 by 7.4% to losing by 15%, when going from 4x to 8x AA?

Edited to add: I don't see why what you said invalidates what Charlie said (I think that's what you mean by the source of the "rumour", which is plausible since Icrontic used the same term about shaders getting "roped in" to helping with tessellation). It seems that maybe both of you are right. Just because Fermi has dedicated tessellators does not mean that shaders can't be roped into helping them out, and that such roping is what causes such a huge boost to Unigine tessellation results. Have you considered this possibility?
 
All this talk and comparing of a card that hasn't been released yet is just crazy...

Nvidia is trying their hardest to beat a card that has been out 6 months before the release of Fermi? Does anybody see this? A 5870 should not be compared with Fermi at all; it's 6 months older, and Nvidia had 6 months to make SURE this card beats what's out...

6 months is a long time in the computer industry... Now when Fermi is released, ATI comes back at it with the 5890 or even the 6 series, which they would have had the time to make sure beats Fermi... it just seems like a never-ending round of bullshit and 10 FPS faster every time around the horn...

I seriously hope Fermi is great, to lower prices...
 
Who gives a deuce about 3D Vision? These cherry-picked, rigged benchmarks in 1-2 minute tests should be embarrassing to Nvidia. I'd say the majority of us would rather play games than watch benchmarks all day.

True, let's bench DX8 games instead so we can choose which video card to get.

+1
 
True, let's bench DX8 games instead so we can choose which video card to get.

+1

So we should bench Unigine to find out which video card to get? Maybe we can throw 3DMark06 and FurMark in there as well?
 
So we should bench Unigine to find out which video card to get? Maybe we can throw 3DMark06 and FurMark in there as well?

DX11 performance is all that matters, really, and it will be the main focus when Fermi is released.
 
But it's not a game. Where are the Dirt 2 and AvP benchmarks? Are you going to stop posting if the GTX 480 is within 5% of the 5870 in games, like everyone claims? Are you going to stop posting when the GTX 470 is beaten by the HD 5870 despite costing $100 more?
 
Or we could bench it in actual games that you can play, like, with a computer. Thanks for the stupid reply though.

And when the 480 comes out and destroys the 5870 in actual games, you're going to run to prices. This is why I don't even bother talking about the actual games; I know where you will go after that.
 
I'd much rather play a DX8 game than watch a DX11 benchmark run.

Somebody get this boy a trackball; he just warped into old school.

I was forced to watch someone play Modern Warfare 2 the other day. Then I went home and played Counter-Strike. DX8, you elate me...
 
Somebody get this boy a trackball; he just warped into old school.

I was forced to watch someone play Modern Warfare 2 the other day. Then I went home and played Counter-Strike. DX8, you elate me...

His comment made no sense;
there are plenty of DX9/10 and now DX11 games to play 0.0
I was just feeding the troll or viral marketer.
 
And when the 480 comes out and destroys the 5870 in actual games, you're going to run to prices. This is why I don't even bother talking about the actual games; I know where you will go after that.

Eh? But I want the 480 to win; I don't really even care about prices, and I don't think I've ever mentioned them. I just see you accepting marketing slides and discounting negative reports without offering any actual numbers or information that isn't already available. Canned benchmarks offer very little towards learning how a card will really perform in real-world situations. You seem to be assuming that future games are just going to tessellate everything and forget everything else.

I would hate to see Nvidia launch a lackluster product, and I've never implied otherwise. Who in the tech world wants to see new technology come out that only competes with old technology? That doesn't sound very [H]ard.
 
Unimpressive video.

If they were to show the benchmarks being run at exactly the same settings for Fermi and the 5870, with the results being compiled in real time, that would be preferable. But nVidia's marketing department won't allow it until the 'perfect time', whenever that is.
 
But it's not a game. Where are the Dirt 2 and AvP benchmarks? Are you going to stop posting if the GTX 480 is within 5% of the 5870 in games, like everyone claims? Are you going to stop posting when the GTX 470 is beaten by the HD 5870 despite costing $100 more?

LOL... wait for the game benchmarks like the rational, level-headed people here :cool:
 
If those Chinese 470 benches are remotely correct, then Heaven is benched and shown off for a good reason. Nvidia is a software company, after all.
 
If those Chinese 470 benches are remotely correct, then Heaven is benched and shown off for a good reason. Nvidia is a software company, after all.

I'm going to have to agree with this. Nvidia has a _LOT_ of pull in the game development industry. Any title that is "the way it's meant to be played" will surely have moderate to heavy tessellation if they are going to have their way. Time will tell.
 
Well, actually, 3D isn't specific to Fermi. Other Nvidia cards have it, and ATI will be supporting it through third-party vendors.

But no, people are just getting sick and tired of you constantly going "3D is all the rage! 3D is super important! 3D 3D 3D! OMG 3D!!!!!". The 3D technology isn't the problem; YOU repeating the same thing over and over and over again is the problem.

And for what it's worth, 3D has tried to break into the market for something like 20 *years* now. Nothing has changed; 3D still isn't going anywhere. Companies were showing it off at CES because Avatar was new and that got the public interested in 3D again, an interest that is already quickly disappearing. The simple fact of the matter is that people just don't like the stupid glasses. Be it shutter, polarized, or the red/blue thing, the public just doesn't like it. And that really isn't going to change.

I agree with the statements made about his incessant Nvidia 3D Vision posts, as it is hardly even a secondary factor in today's PC gaming segment. 3D Vision and PhysX, as of now, are very gimmicky, as the support for them is extremely slim (only two major titles utilize PhysX, and a few games utilize 3D). Hopefully that will change one day, but neither of those compares to the wide adoption of Eyefinity, which has support for more titles than 3D Vision and PhysX combined (this is not subjective, no matter how much you believe otherwise).

However, I think you will be surprised by the 3D trend, which is only gaining popularity. Blu-rays will be available in 3D in the later portion of this year, when 3D Blu-ray players hit the market. I think we are on the verge of a 3D trend, as the public's reception has been positive.
 
3D was certainly big at CES, but at the moment it's pretty much a gimmick. If 3D becomes important, you'll have transitioned rigs enough times to forget what Fermi was.

It is a great way to resell a bunch of HDTVs, if they can convince people.
 
I just found out late that STALKER: Call of Pripyat has been out for a month; I missed the release and ordered it just now (probably a good thing, since the update patches and mods are out now anyway). Now I'm in the market for another video card upgrade, to go to something faster than my discontinued EVGA nVidia GTX 260 (192 SP) 896 MB.

I've been following the video card wars through the release of the ATI 5700/5800 series and the upcoming March 26, 2010 release of the nVidia GTX 400 series, only to find out through the grapevine that the GTX 470 will be priced prohibitively high, $130++ against the HIS ATI 5870 1GB at $379 USD. On top of this, I read all these rumors that the nVidia GTX 400 series will not be price/performance competitive against the ATI 5850/5870. :mad:

I'm going to wait until April to see what happens with the prices and the new product launch, and whether there's any hope on the horizon for nVidia and the Fermi architecture. I'll still be able to enjoy the only game series that keeps my interest in games overall on my current video card, but I have this slight feeling that my next video card upgrade might be a switch from nVidia to ATI. I'm not an nVidia fanboy, but I've had a good run with their cards all the way from the GeForce 2 series without having to consider ATI, until now. :confused:

Any time ATI was leading the pack in performance, nVidia just came out with a new series that stole the thunder at exactly the time I needed to upgrade for a new STALKER release; that's how I went through the 8800 GTS 640MB & 512MB, and now the GTX 260.
 
However, I think you will be surprised by the 3D trend, which is only gaining popularity. Blu-rays will be available in 3D in the later portion of this year, when 3D Blu-ray players hit the market. I think we are on the verge of a 3D trend, as the public's reception has been positive.

The whole "3D" thing is nothing more than simulating how the human eyes see things(the objects are at slightly different distance to each eye, gives some sense of depth). The difference between "real 3D" and the traditional 3d gaming/videos " should be nothing more than the difference between seeing with two eyes and with one eye closed.
 
My main problem is that the only monitors which support 3D Vision are 22", blegh, but I might feel better if it's gonna work with three monitors. The idea of stereoscopic surround video sounds pretty awesome. Not to be mean to Nvidia, but I'm kinda hoping I can just buy another GTX 275 on the cheap and SLI that rather than upgrade.

Actually, no... there are 24" 120Hz panels from Alienware and Acer now. They're 1080p (as every 24" panel is nowadays).
 