Not Ray Tracing at all; it is AI-assisted image generation and interpolation. Optical Flow is not Ray Tracing. That's all people need to know to pick the claims apart.
> It does not advance Ray Tracing in any way, rather detracts quite destructively from it.

This is stated as being trivial, and maybe I am a bit slow not to understand, but how so?
> Not Ray Tracing at all; it is AI-assisted image generation and interpolation.

Also not yet reviewed and dissected for quality. So is Nvidia at a stage where the AI-created frame replaces a whole rendered RT frame, where each pixel had computation from shaders and RT cores? A whole new way of making frames just as good as a rendered frame? If not, will we be able to tell the difference, not get a migraine, etc.?
> This is stated as being trivial and maybe I am a bit slow not to understand, but how so?

When you quote me like this, it would help to replace the demonstrative pronouns with what they're referring to. The frame generation in DLSS 3.0 is an interpolation (blend) of two 2D images using direction information for each pixel. This is all done outside of the game engine. True ray tracing happens inside the game engine, because it needs access to all of the game's scene assets (lights, geometries, materials, etc.).
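For readers who want the mechanics: the "blend of two 2D images using per-pixel direction information" can be sketched in a few lines. This is a toy NumPy illustration, not Nvidia's actual Optical Flow Accelerator pipeline; the function name and the crude nearest-pixel splatting are my own simplifications.

```python
import numpy as np

def interpolate_frame(frame_a, frame_b, flow, t=0.5):
    """Toy midpoint interpolation of two 2D grayscale frames.

    frame_a, frame_b: (H, W) float arrays (the two rendered frames).
    flow: (H, W, 2) per-pixel motion vectors from frame_a toward frame_b.
    t: position of the generated frame between the two (0..1).
    Purely illustrative; real frame generation handles occlusion, holes, etc.
    """
    h, w = frame_a.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Warp frame_a forward along the flow by t, and frame_b backward by (1 - t).
    ax = np.clip((xs + t * flow[..., 0]).round().astype(int), 0, w - 1)
    ay = np.clip((ys + t * flow[..., 1]).round().astype(int), 0, h - 1)
    bx = np.clip((xs - (1 - t) * flow[..., 0]).round().astype(int), 0, w - 1)
    by = np.clip((ys - (1 - t) * flow[..., 1]).round().astype(int), 0, h - 1)
    warped_a = np.zeros_like(frame_a)
    warped_b = np.zeros_like(frame_b)
    warped_a[ay, ax] = frame_a[ys, xs]   # splat pixels to their warped positions
    warped_b[by, bx] = frame_b[ys, xs]
    # Blend the two warped images into the generated in-between frame.
    return (1 - t) * warped_a + t * warped_b
```

Note that nothing in there touches scene geometry, lights, or materials: it only ever sees two finished 2D images and a vector field, which is the point being made above.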
> The frame generation in DLSS 3.0 is an interpolation (blend) of two 2D images using direction information for each pixel.

Not from my understanding; it is an extrapolation from the last couple of images using motion vectors (of things including light).
> True ray-tracing happens inside the game engine, because it needs access to all of the game's scene assets.

Yes, I think I get what you mean, but nothing ray-tracing-wise on the actual regular frame was hurt by this, no? How is DLSS 3.0 degrading RTX?
> Not from my understanding; it is an extrapolation from the last couple of images using motion vectors (of things including light).

Nvidia's own words were "intermediate frame," meaning interpolation, even though their slides suggest extrapolation. However, input lag has to be negated with Reflex, so it must be interpolation. The vagueness cannot be coincidence.
> Also not yet reviewed and dissected for quality. So is Nvidia at a stage where the created frame using AI replaces a whole rendered RT frame where each pixel had computation from shaders and RT cores?

You know that this technology has been around and available since 2018? It's part of the original DLSS, was improved in DLSS 2.0, and improved further in DLSS 3.0, which now uses specific hardware for it. It's not new nor unproven; what it is now is far easier to actually implement from a development perspective and as such is getting rolled into Unreal 4, 5, and Unity.
The simulation and RT aspect for future games is intriguing, but unless Nvidia dishes out the content, most AAA titles being designed around consoles makes this capability mostly unusable except for some developers.
One metaphor I got from Nvidia's presentation is entanglement: Nvidia SDKs, tools, proprietary tech, hardware, and so on. Do you, or does a company, really want to be partnered with or dictated to by Nvidia? Reliant on Nvidia to fix their SDKs, make progress on tools, expand coverage, and keep up to date with progress made in medical and scientific endeavors, weather forecasting, drug R&D, and so on? Restricted to Nvidia hardware, with possible bans or lack of support if the company has severe issues, and no source code if you want to go beyond Nvidia, totally customize, or use other solutions?
Waiting on real quality reviews, user feedback, and the AMD launch. Just more interested in a quality monitor. I usually get at least one GPU per generation, normally two or more, and have not skipped one in over two decades; so far there is nothing here from Nvidia I need.
> Nvidia's own words were "intermediate frame," meaning interpolation, even though their slides suggest extrapolation. However, input lag has to be negated with Reflex, so it must be interpolation. The vagueness cannot be coincidence.

The fact that there will be no added input lag must mean extrapolation by definition, and their slides seem somewhat clear, like you said. The frame will be between two frames either way and would be an intermediate frame either way. Not adding input lag requires intelligence (there would be a risk of a real frame being slowed down by a made-up frame), which I imagine Reflex helps to handle.
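The interpolation-versus-extrapolation question boils down to what data the generator needs in hand. A one-pixel toy example (names and numbers are mine, purely illustrative):

```python
def interpolate(prev_pos, next_pos, t=0.5):
    """Interpolation: needs the NEXT real frame too, so that frame must be
    held back while the generated one is shown -> added latency."""
    return prev_pos + t * (next_pos - prev_pos)

def extrapolate(prev_pos, velocity, dt=0.5):
    """Extrapolation: needs only past frames and a motion vector, so no
    waiting -> no added latency, but it can guess wrong on sudden turns."""
    return prev_pos + velocity * dt

# A pixel at x=10 moving +2 px/frame: both methods place the generated
# frame's pixel at x=11, but only interpolation had to wait for x=12.
print(interpolate(10, 12))  # 11.0
print(extrapolate(10, 2))   # 11.0
```

That latency-versus-guessing trade-off is exactly why the "intermediate frame" wording plus the Reflex requirement reads as ambiguous.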
> The implication that DLSS 3.0 increases RT performance is absurd. Plus, you're paying for something that doesn't advance ray tracing, that should have gone into real RT hardware.

Like DLSS 2.0, it is not that it increases RT performance in particular; it is that good 4K quality at an actually lower resolution makes playing with RT possible on a 4K screen. Same here: RT being really heavy, especially with the path tracing they started to support, it could push you under your monitor's VRR minimum frame rate from time to time, and DLSS 3.0 will avoid that, making it possible to use RT comfortably. At least from my understanding.
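To make the VRR-floor point concrete, here is the arithmetic with hypothetical numbers (the 48 Hz floor, the 35 fps dip, and the clean doubling from frame generation are my assumptions, not measured figures):

```python
VRR_FLOOR_HZ = 48  # hypothetical monitor's minimum variable-refresh rate

def presented_fps(rendered_fps, frame_generation=False):
    # Frame generation inserts one generated frame per rendered frame,
    # roughly doubling the presented rate (overhead ignored).
    return rendered_fps * 2 if frame_generation else rendered_fps

heavy_pt_scene = 35  # fps a path-traced scene might dip to
print(presented_fps(heavy_pt_scene) >= VRR_FLOOR_HZ)        # False: below the floor
print(presented_fps(heavy_pt_scene, True) >= VRR_FLOOR_HZ)  # True: 70 fps presented
```

So even though no extra rays get traced, the presented frame rate stays inside the VRR window, which is the comfort argument being made above.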
> LOL I am not indignant at all, you should be talking to people with all the outrage.

Yet here you are, posting lengthy ignorant diatribes which appear to be primarily based on your personal views of consumer responsibility.
> And I would love for you to quote the regulatory and legal systems that Nvidia are breaking.

Somewhat ironic that you claim product information is so easily available for consumers, yet you are incapable of doing two minutes' worth of googling yourself.
> How come they weren't fined or whatever for the 3080 12GB and the 3080 10GB, or, going back further, the 1060 6GB and 1060 3GB? The difference between those cards wasn't just VRAM: different numbers of CUDA cores and memory bandwidth. Or how come neither AMD nor Nvidia has ever been fined for rebranding old cards and releasing them as new cards?

How were they marketed? Were all the specs prominently displayed on the boxes? What other representations were included in the marketing material that might have made it possible to distinguish the products?
> The specs of both GPUs are on the site. You don't have to leave the shop or anything these days, just check on your phone.

Categorically irrelevant: this is not an emotive statement based on personal views; there is plenty of case law where this exact point has been decided against marketers/sellers.
> And it's not about being snooty, as you call it. If you don't have a clue about what you are buying, you should ask for advice or do some research. If someone asked you about a new card coming out, would you tell them to buy it based on the marketing blurb from the manufacturer, or would you tell them to wait for reviews?

See above. It is quite apparent that this is based on your personal views; in the real world, consumers are entitled to assume truth in advertising. They are not obliged to do any research if they choose not to.
> The naming scheme isn't the problem, the price is. I bet nobody would give a damn about naming schemes if the 4080 12GB was $399 and the 16GB was $499.

No one might give a damn if the price is right, but the naming scheme would still technically be a problem. The main difference is that fewer people would be inclined to lodge complaints if they feel that, notwithstanding that they thought they were getting a 4080 16GB with less memory, they can live with it because it was so much cheaper.
> Serious question: do you think there would have been more or less outrage if they called the card the 4070? Almost a doubling of price from the 3070. The reaction would probably have been worse.

That's the entire crux of the issue: the fact that they have changed the name to create the impression that the price hike is justified. The price they charge is a matter for them, but using marketing and naming to manipulate consumer sentiment and understanding of the product exposes them to legal issues.
> You know that this technology has been around and available since 2018? It's not new nor unproven; what it is now is far easier to actually implement from a development perspective.

I think one of us, and for all I know perhaps me, is not getting noko's point here. DLSS and other upscaling methods have never come without drawbacks. That's the only proven thing. Otherwise, it has been more about "good enough to use." That is purely subjective, and for many Nvidia and AMD users, FSR is good enough; in cases where there is no upscaling support, FSR can still be enabled and give good-enough results (like the Steam Deck, where you can have global FSR). DLSS 3 is for the moment unproven and untested in the wild. It might be great, giving more FPS than previous DLSS while still keeping it "good enough." Remains to be seen.
> DLSS 3 is for the moment unproven and untested in the wild. It might be great, giving more FPS than previous DLSS while still keeping it "good enough." Remains to be seen.

I want AMD to come out strong, because this bullshit on pricing and naming needs something to stand up against it, and it's certainly not Intel this time around. But I fear AMD is just gonna ride the gravy train to the bank by being just as expensive.
Consoles are the lowest common denominator, and games are often developed with what's "good enough" on consoles in mind. Sites like Hardware Unboxed and Digital Foundry have their videos on what's "good enough," "do you need Ultra," etc.
Raytracing is a great thing for realism, but it hasn't been used in a meaningful way, IMO, in games to date. This might change in the future, but for now I personally consider it a tacked-on feature and a checkbox.
DLSS is a great thing. It gives higher framerates for those who find it "good enough." But it's Nvidia-only, and though Nvidia has a huge market share and can get developers to implement it in their games, they are up against both Intel and AMD this time around. I mean, DLSS 3 is only supported by Nvidia's 4XXX series, while Intel XeSS is supported even by Nvidia's Pascal/1XXX series (same goes for FSR). FSR can even be used in games that have no in-game support. For all we know, DLSS is dead in a few years, sharing the grave of Nvidia's GPU-accelerated PhysX with the headstone "didn't play nice with others, died alone."
I am more curious about the rasterization performance and noise levels of the RTX 4XXX series. For DLSS and ray tracing, it's more about how well they work in VR, and Nvidia has a good track record supporting VR, but that would only be the odd game now and then.
I'm also waiting for reviews and AMD's offering.
> I want AMD to come out strong because this bullshit on pricing and naming needs something to stand up against it, and it's certainly not Intel this time around. But I fear AMD is just gonna ride the gravy train to the bank by being just as expensive.

Let's be honest: even if they do, so many people are quick to sling out every possible Nvidia marketing line as to why they can't even consider an AMD card.
> I want AMD to come out strong because this bullshit on pricing and naming needs something to stand up against it, and it's certainly not Intel this time around. But I fear AMD is just gonna ride the gravy train to the bank by being just as expensive.

Let's cross our fingers and hope AMD doesn't get too greedy this time! With their chiplet design, they might be able to undercut Nvidia on price and still make a decent buck instead of pocketing more.
> Let's be honest. Even if they do, so many people are quick to sling out every Nvidia marketing line possible as to why they can't even consider an AMD card.

They are still kicking my dog on a number of my older drafting workstations, but yeah.
You'd think, going based off online comment sections and forums, that almost every home PC builder is a big-time streamer who also wants to do some machine learning on the side. Also, they only play DLSS/ray-traced games.
Oh, and AMD drivers personally kicked their dog sometime in the past 10 years. Unforgivable.
> Too many hopes and dreams. Lisa Su is cut from the same cloth as Jensen Huang.

Leather pantsuits?
> Too many hopes and dreams. Lisa Su is cut from the same cloth as Jensen Huang.

Every corporation operates under the same principles: maximize growth and profit for shareholders so you can attract more capital and grow. Those who do not deliver profits will not attract capital. There is no good or bad; these are the realities of a capitalist system.
And for those who think this is evil, in a speech on Nov. 11, 1947, Sir Winston Churchill reminded the UK’s House of Commons that “democracy is the worst form of government, except for all those others that have been tried.” In a similar fashion, capitalism is the worst economic system, except for all the others.
So it seems people are most upset about the price: $1,600 for a flagship card. I am in Vegas this week for a conference, where two people can easily spend $1,600 in a single night out at a good restaurant and a show. Now that's a colossal waste of money. $1,600 for a top GPU that will last four years equates to less than $40 per month, plus resale value. Would I love to pay $800 for a top-end GPU? Yes, of course. But have you all noticed inflation? I would love to pay half the price for gas like I did two years ago, or get my steak at half the price like I did two years ago. Inflation sucks; so do expensive new nodes and supply chain issues.

But I thought this is a place where people are excited about tech. I would have shit my pants back in the 1990s, when I had my first Voodoo card, if someone had told me we would have this level of real-time ray tracing, or explained the tech behind DLSS, or shown me the graphics in Cyberpunk rendered in real time. We came a long way, and the road ahead is exciting as hell. I am just super happy to have lived through computer history, from the C64 to now to whatever comes over the next few decades.
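The per-month claim checks out; here is the arithmetic (the $400 resale figure is my own made-up example, not market data):

```python
price = 1600          # flagship launch price in USD
months = 4 * 12       # keep it four years
resale = 400          # hypothetical resale value afterwards

print(round(price / months, 2))             # 33.33 -> under $40/month before resale
print(round((price - resale) / months, 2))  # 25.0 once resale is factored in
```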
https://www.nvidia.com/en-us/geforce/news/rtx-remix-announcement/

> NVIDIA RTX Remix requires a GeForce RTX GPU to create RTX Mods, while mods built using Remix should be compatible with any hardware that can run Vulkan ray-traced games.
> I'm going to make it my life's goal to remix the original Unreal in RTX.

Hold out for the 5000 cards.
> When you quote me like this, it would help to replace the demonstrative pronouns with what they're referring to. The frame generation in DLSS 3.0 is an interpolation (blend) of two 2D images using direction information for each pixel.

Deep Learning Super Sampling isn't ray tracing. DLSS is separate from and independent of the DX12 ray-tracing libraries and doesn't do any ray tracing. It exists solely as a tool for various image manipulation and management systems.
> This is a butchering of the intended look and atmosphere.

You can't say that for certain. Morrowind ran on DX8, and it lacked a lot of the light-rendering techniques we take for granted these days, like light propagation. The video does have issues similar to the Quake II RTX mod, though. The candlelight in the video is definitely too intense; whoever was responsible for the direction of the mod has probably never been in a room lit solely by candlelight.
Now tech media is starting to call Nvidia out....
https://www.digitaltrends.com/computing/why-the-rtx-4080-12gb-feels-like-rebranded-rtx-4070/
https://www.pcgamer.com/nvidia-rtx-40-series-let-down/
https://www.windowscentral.com/hard...ad-and-making-memes-of-nvidia-for-good-reason
I see some redditors are also grumbling that the 192-bit bus makes the 4080 12GB more akin to a 4060 based on previous-gen naming and spec conventions. Is that true? (Edit: even the PCGamer article suggests this, and that it could be more accurately described as a 4060 Ti.)
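For what it's worth, the bus-width part of that comparison is simple arithmetic. Assuming 21 Gbps effective GDDR6X on both configurations (an assumption for illustration; check the actual memory specs of each card), peak bandwidth works out as:

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # peak GB/s = (bus width in bytes) * (effective per-pin data rate in Gb/s)
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(192, 21))  # 504.0 GB/s on a 192-bit bus
print(peak_bandwidth_gbs(256, 21))  # 672.0 GB/s on a 256-bit bus
```

So a 192-bit card gives up about a quarter of the bandwidth of a 256-bit card at the same memory speed, which is why the bus width keeps coming up in the naming argument.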
> Now tech media is starting to call Nvidia out....

Good. Unless the 4080 12GB is a quantum leap forward in terms of performance, it's not going to measure up based on price. I'm glad everyone else is now calling them out.
> Nvidia's f'd up business model is that the high-end card represents the best value. Instead of letting the overachievers pay 50% more for the last 10% of performance, they are making the middle-class gamers pay $400 more for no good reason.

"They are making." As if there's a gun to anyone's head.
This may seem like a crazy concept, but if I'm not interested in a product, I just don't buy it.
Your post.
> That's the entire crux of the issue, the fact that they have changed the name to create the impression that the price hike is justified. The price they charge is a matter for them, but using marketing and naming to manipulate consumer sentiment and understanding of the product exposes them to legal issues.

> See above. It is quite apparent that this is based on your personal views, in the real world consumers are entitled to assume truth in advertising. They are not obliged to do any research if they choose not to.
> Let's be honest. Even if they do, so many people are quick to sling out every Nvidia marketing line possible as to why they can't even consider an AMD card.

I make a point of posting on other sites about how AMD's drivers are superior to Nvidia's. Which is true.
> I make a point of posting on other sites about how AMD's drivers are superior to Nvidia's. Which is true.

Because proving objectively that one manufacturer's drivers are better than the other's is pretty difficult. How are you proving that they are better?
The hate in response is entertaining. Also shows how well Nvidia has the average non [H] gamer snowed.
"How do you know AMD's drivers are horrible?" - to the average gaming-forum NV shill complaining about AMD's software.
Response from NV shill... "Because I read it on this forum, posted by another gamer"
"Really, wonder what AMD cards they had?"
"Rage something.... they're all shit"
Pretty much how every one of those conversations goes.