I think most of the stories about x4 killing CrossFire performance date from when the PCIe 1.x standard was current. I'm pretty sure PCIe 2.0 @ x4 = PCIe 1.x @ x8, which is pretty acceptable; at least that's how I understood it.
When I ran CrossFire 4890s there was no benchmark difference from the 2nd card...
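The bandwidth equivalence above can be sanity-checked with the usual per-lane figures (roughly 250 MB/s for PCIe 1.x and 500 MB/s for 2.0) - a minimal sketch, not an exact spec calculation:

```python
# Rough check: PCIe 2.0 doubles per-lane bandwidth over 1.x,
# so 2.0 at x4 matches 1.x at x8.
PER_LANE_MBPS = {"1.x": 250, "2.0": 500}

def link_bandwidth(gen, lanes):
    """Approximate link bandwidth in MB/s."""
    return PER_LANE_MBPS[gen] * lanes

print(link_bandwidth("2.0", 4))  # 2000 MB/s
print(link_bandwidth("1.x", 8))  # 2000 MB/s, same effective link
```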
That Minecraft developer sounds like a bit of an asshat: he goes on to list every possible flaw in the way the tech is used and all its limitations, even calls it a "scam", and then at the end says to get excited about voxels.
It sounds like he's more upset with the way this...
Maybe ATI has become more profitable now, but they went through crazy bad times for the first few years after the purchase, as a direct result of the money lost/unavailable because of it.
1) AMD's fastest current desktop processor (Phenom II) is based directly on roughly 10-year-old single-core Athlon tech.
2) AMD's total R&D funding per year has averaged only about 1/6th of Intel's R&D spend.
3) After purchasing ATI for an insane $5.4 billion in 2006...
Based on your sig, overclock the crap out of that C2D and watch your minimum FPS skyrocket - going from 3 GHz to 3.6/3.8 would make it feel like you upgraded your GPU.
The first step towards Optimus Prime - go go moving parts :)
I would love my video card to change shape for each different game... I'm sure that when trying to run Metro maxed out on triple screens, my video cards would form the shape of "hell no".
Just a heads up: 4800 x 900 on 1080p screens looks really bad. Try running 1680 x 1050 on a single 1080p monitor - and it looks even worse than that stretched across three.
It doesn't matter whether you have a 650 or a 750, as the amps on the 12V rail are what really count for high-end video cards. I have a Corsair 650 as well, and at full load with CrossFire 6850s and a Phenom II X6 @ 3.9 GHz it wasn't near 80% usage.
I bet it will run 2 x 6970s fine, as when I bought it (modular...
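The "amps on the 12V rail" point comes down to watts = volts x amps: the 12V rail's rated amperage, not the sticker wattage, sets how much power the GPUs can actually draw. A quick sketch (the 52 A rating here is illustrative, not a quoted Corsair spec):

```python
# What matters for high-end GPUs is power available on +12V,
# since that's the rail the cards and CPU pull from.
def rail_watts(volts, amps):
    """Power a rail can deliver, in watts."""
    return volts * amps

# e.g. a hypothetical single-rail PSU rated 52 A on +12V:
print(rail_watts(12, 52))  # 624 W on the 12V rail alone
```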
When I had my laptop I looked into some of the options available, and they were all PCIe x1 or x2 bandwidth - and it seems they still are. What's the point of hooking any current or even last-gen card to a laptop when it will run at an eighth of the performance it should?
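The "eighth of the performance" figure above follows from lane count alone: an external x2 link has 2 of the 16 lanes a desktop card's x16 slot provides (actual frame rates won't scale perfectly with bandwidth, but the lane math is simple):

```python
# Fraction of full x16 link bandwidth available on a narrower link.
def lane_fraction(lanes, full=16):
    return lanes / full

print(lane_fraction(2))  # 0.125, i.e. one eighth of x16
print(lane_fraction(1))  # 0.0625, one sixteenth for an x1 link
```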
Maybe...
It's all relative to the capability of the HSF. I would think a TRUE would leave the air cooler than a stock HSF, since it dissipates the heat quicker; therefore there is less heat at any one point of the aftermarket HSF than of a stock one.
Sure, there is more surface area to spread the "warm air"...
The easier option would have been to uninstall the NVIDIA driver, run Driver Sweeper, shut down and remove the SLI bridge, restart and install the old drivers, put the SLI bridge back on, and then bam.
In Africa a little kid waits and wonders how he will find any food for dinner, but right here we just have Cowboy420, who doesn't understand what the [H] in [H]ard|OCP means... a long-time user and still no appreciation/understanding of what most users actually come here for.
I'll give you a...