Intel Compares Arc A750 with RTX 3060 With Latest Driver Update

erek

[H]F Junkie
Joined
Dec 19, 2005
Messages
11,099
This is actually commendable and legit imo

“According to Intel's own slides, the latest driver update brought decent performance improvements in several games, but more importantly, it is enough to give the Intel Arc A750 an edge, pushing it ahead of the RTX 3060 12 GB graphics card, at least in Dead Space Remake game. Although Intel has listed a bit of a higher price for the RTX 3060 12 GB, Arc A750 still offers higher performance per dollar, at least in some games. Of course, NVIDIA always has the RT and DLSS aces up its sleeve.

The Intel Arc A750 was pretty high on the performance per dollar in our review back in October last year, even when it was priced at $290, and with the recent driver updates, it is an even better choice.”

[Attached image: Intel slide comparing Arc A750 and RTX 3060 performance]

Source: https://www.techpowerup.com/307325/...h-rtx-3060-with-latest-driver-update#comments
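The performance-per-dollar figure the excerpt leans on is just frames per second divided by price. A minimal sketch of that math — the FPS and price numbers below are placeholders for illustration, not the figures from Intel's slide:

```python
# Perf-per-dollar as used in GPU reviews: average FPS divided by street price.
# All numbers here are hypothetical placeholders, not Intel's slide data.
def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Frames per second delivered per dollar spent."""
    return avg_fps / price_usd

a750 = perf_per_dollar(avg_fps=60, price_usd=249)     # hypothetical A750 numbers
rtx_3060 = perf_per_dollar(avg_fps=58, price_usd=340)  # hypothetical 3060 numbers

# A lower price can win perf/$ even at similar raw FPS.
print(a750 > rtx_3060)
```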
 
Intel is coming to play and I am all for it
 
I am really liking what they are doing with drivers after the original shit show, which was expected given it was their first go at a dedicated GPU; it's what came after that I really give em kudos for. I would not hesitate to grab an A770 to play around with in one of my rigs. Really looking forward to the Battlemage GPUs when they launch.
 
Not bad, but then again the A770 is basically the same price as the 3060 isn't it?
It's still a good $50 cheaper (in Canadian clown money), but yeah, pretty close. Given the memory and some of the upcoming XeSS stuff, I'd take a chance on it given the opportunity. The extra memory does give some headroom to deal with shittier port jobs, which is appealing.
 
Plus (which is probably irrelevant to most people) it runs double precision at a 1:4 rate, so you can get close to 5 TFLOPS of FP64 in a $350 card.
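A quick sanity check on that 1:4 figure. The ~19.66 TFLOPS peak FP32 number below is the commonly quoted spec-sheet value for the A770 at boost clock, not something measured here:

```python
# Rough check of the "close to 5 TFLOPS FP64" claim above.
fp32_tflops = 19.66        # Arc A770 approximate peak FP32 (spec-sheet figure)
fp64_ratio = 1 / 4         # claimed double-precision rate
fp64_tflops = fp32_tflops * fp64_ratio

price_usd = 350
gflops_per_dollar = fp64_tflops * 1000 / price_usd

print(f"~{fp64_tflops:.2f} TFLOPS FP64, ~{gflops_per_dollar:.1f} GFLOPS/$")
```

That works out to roughly 4.9 TFLOPS FP64, which matches the "close to 5" claim, assuming the 1:4 rate holds.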
 
According to Intel's own slides
^^THIS^^ is always the catch.... any company making any product can make whatever claims they wanna in their own, self-serving marketing fluffiness....

When we see some 3rd party tests/reviews that confirm these claims (or shoot them full of holes), then perhaps we can believe them. Until then, it's all just a bunch of hooey gooey hawhaw :D

Also, bear in mind that these are relatively "NEW" gpu's, coming out after soooo many years of "will they, won't they" development, and they are only now catching up/comparable to cards that are over 2 years old....
 
In all fairness, both AMD and NVIDIA are charging $600+ for cards that only match the performance of cards from 2-4 years ago.
Both AMD and NVIDIA have abandoned the sub-$500 market and let it be filled with last year's stack, because progress from both of them has stalled.
They can only change the architecture so much before they break performance in older titles, and die shrinks aren't getting them anywhere close to the gains they once did; those shrinks are coming at a higher and higher price tag.
NVIDIA is looking to software to fix the hardware stagnation, and AMD is trying to ignore it because it really isn't their problem.
Intel is playing very hard in the only market it can: nobody is going to spend $600 or more on an Intel card with an uncertain future, young drivers, and zero track record. Intel's brand name brings it absolutely nothing in the GPU market, so it has to play like a young plucky upstart here, because it essentially is one.
 