AMD Talks Radeon FreeSync on TVs with PCPer

FrgMstr · Just Plain Mean · Staff member · Joined May 18, 1997 · Messages: 55,727
Ryan and Antal sit down and shoot the bull about FreeSync usage on TVs. Variable refresh rates on an open platform are a thing of the future... hopefully. The biggest problem with VRR is that until you see it and use it in person, it is very hard to appreciate just how big a deal it is. Same thing with HDR: you have to see it to understand it.

Check out the video.

Interested in new gaming displays? Interested in new gaming displays that can also do HDR? Then you are going to want to watch our live stream with AMD about its plans for the future of FreeSync. AMD will be discussing changes to FreeSync at our event, with maybe an additional surprise or two along the way.
 
I wasn't impressed with VRR, but I probably didn't give it an adequate trial. I've been using 3 Monoprice 30" 2560x1600 IPS screens for a few years. I tried replacing them with 3 Viewsonic 27" 2560x1440 VA Gsync monitors, but 2 of the 3 showed up with broken panels. I tested the unbroken one, and it looked fantastic, beating the pants off IPS on black levels. I was playing Deus Ex: Mankind Divided, which was pretty much always running 60FPS on my Monoprice screen. Other than improved blacks, it wasn't noticeably better on the Viewsonic at higher refresh with Gsync. If I'd tried a game that wasn't already running 60FPS for me, I might have seen a difference.

HDR can look great on my TV, but I've noticed the shows that use it like to overuse it and blind me.

All I really want is 30" OLED 16x10 (4K or 2K) monitors with 120 Hz refresh, with at least 2 DP inputs that can be switched via remote control. Is that so difficult?
 
I love the fact that an open variable refresh standard may actually be taking off for real.

My biggest problem with it right now is that it is AMD only. I like AMD, and I would buy an AMD GPU, but I also like 4K, and AMD isn't very competitive at 4K resolutions. (Heck, my Pascal Titan X on water barely handles the resolution.)

If Nvidia added support for this (why would they, it would hurt G-Sync sales/licensing), or if AMD were to come out with a next-gen GPU with greater than 1080 Ti performance, we could have the best of both worlds. Hope springs eternal?
 
I'd love to. However, I don't want to pay to downgrade from a Maxwell Titan X. Nor do I want to pay over MSRP for a Vega 56 or 64.
 
I'd love to. However, I don't want to pay to downgrade from a Maxwell Titan X. Nor do I want to pay over MSRP for a Vega 56 or 64.
I don't think VRR is a strong factor or motivation for upgrading, even if there were an option to upgrade to. It's an incremental thing that solves a minor problem, which I don't even notice at 144Hz.

Once dynamic resolution becomes a part of the protocol stack, then I will lean in... at this rate, I suppose I'll wait another 15 years.
 
I agree, I won't ever have a non-vrr display for gaming again. Then again, I stay in the mid-range these days.

I haven't seen HDR in person yet, and kinda don't want to. The whole you-don't-know-what-you're-missing-till-you-see-it thing is true IMO, but I'm happy with tear-free non-HDR and my money right now. ;)
 
HDR can look great on my TV, but I've noticed the shows that use it like to overuse it and blind me.

It is a problem for games too. The Division loves to shine giant floodlights in your eyes, and The Witcher 3's thunderstorms are super annoying now. I am just hoping that once it becomes the norm, people will stop having to "prove" they support it. Because in games like Forza Horizon 3, it is amazing to watch the sun rise/set.
 
If they could deliver a 5+ year monitor it would be one thing. But it's all marketing fluff.

No one's been able to deliver a 24" 1080p with high hertz, good VRR, good HDR, and good color gamut. (By good, I don't mean 8-bit or DCI-P3, I mean HDR 1000 and Rec. 2100 color gamut.)

Every time I read about the latest G-Sync or FreeSync monitor or TV, I always find in the fine print that it's mostly lies.

They're doing high hertz using 8-bit colors, ... pick your deception.
 
Once dynamic resolution becomes a part of the protocol stack, then I will lean in... at this rate, I suppose I'll wait another 15 years.
Linux has you covered, fam! Thank Wayland and all the f/oss desktop environments for that. Of course, there are still some issues, but it's working pretty well now (almost a decade, probably more, after they began defining the protocol), with most video drivers supporting it.
 
Still waiting for a VRR 30-75 Hz LFC (low framerate compensation) HDR 1440p-or-better monitor with DCI-P3 or better at a reasonable price (<$500).

The thing that pisses me off now is that most monitor manufacturers putting Freesync out don't publish the VRR output range. 50-70Hz is worthless. If I go to the mfg website and pull the specs and manuals, all they do is list a "Freesync" mode without any numbers. That really pisses me off. It makes it look like they are hiding something, and that keeps me from buying. Maybe they are thinking someone shelling out $500 or more for a Freesync monitor wouldn't know what to look for.

Or they overdrive the crap out of TN panels for crappy color quality.

I saw a great deal today on an Acer 27" 1440p 4ms but I can't find the refresh range anywhere.
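For reference, LFC only kicks in when the panel's max refresh is at least roughly twice its min, so the driver can repeat frames until the effective rate lands inside the VRR window. Here's a rough Python sketch of that logic; the 2x threshold and the multiplier search are my simplification of the general idea, not AMD's actual driver code:

```python
def lfc_multiplier(fps, vrr_min, vrr_max):
    """Frame-repeat multiplier that lands fps inside [vrr_min, vrr_max].

    Returns None when the panel's range can't compensate (LFC needs
    roughly vrr_max >= 2 * vrr_min so repeated frames always fit).
    """
    if vrr_max < 2 * vrr_min:
        return None
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1
    return multiplier if fps * multiplier <= vrr_max else None

# A 48-144 Hz panel doubles a 30 FPS game to an effective 60 Hz:
print(lfc_multiplier(30, 48, 144))  # -> 2
# A 50-70 Hz panel, like the ranges complained about above, can't:
print(lfc_multiplier(30, 50, 70))   # -> None
```

Which is exactly why a 50-70Hz range is worthless: the window is too narrow for frame repetition to ever rescue a low framerate.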
 
If they could deliver a 5+ year monitor it would be one thing. But it's all marketing fluff.

No one's been able to deliver a 24" 1080p with high hertz, good VRR, good HDR, and good color gamut. (By good, I don't mean 8-bit or DCI-P3, I mean HDR 1000 and Rec. 2100 color gamut.)

Every time I read about the latest G-Sync or FreeSync monitor or TV, I always find in the fine print that it's mostly lies.

They're doing high hertz using 8-bit colors, ... pick your deception.

In the display world it's more a case of pick your poison.
 
Can't watch an hour of YouTube at work; anyone want to share what the couple of surprises were?

Thanks
 
I wasn't impressed with VRR, but I probably didn't give it an adequate trial. I've been using 3 Monoprice 30" 2560x1600 IPS screens for a few years. I tried replacing them with 3 Viewsonic 27" 2560x1440 VA Gsync monitors, but 2 of the 3 showed up with broken panels. I tested the unbroken one, and it looked fantastic, beating the pants off IPS on black levels. I was playing Deus Ex: Mankind Divided, which was pretty much always running 60FPS on my Monoprice screen. Other than improved blacks, it wasn't noticeably better on the Viewsonic at higher refresh with Gsync. If I'd tried a game that wasn't already running 60FPS for me, I might have seen a difference.

HDR can look great on my TV, but I've noticed the shows that use it like to overuse it and blind me.

All I really want is 30" OLED 16x10 (4K or 2K) monitors with 120 Hz refresh, with at least 2 DP inputs that can be switched via remote control. Is that so difficult?

I'm actually glad my 4K screen predates HDR by a year, because to me the way that tech is implemented tends to make everything look horrifically over-saturated, to the point that my eyes start to water. Only annoying thing is my 4K Blu-ray player throws up a big notice each time I begin to play a 4K disc that the whole thing would be so much better if only I had an HDR-capable TV.
 
Won't it be wonderful when the PC industry finally pulls its head out of its collective ass concerning video cards and displays?
 
I have used two different monitors with G-Sync and was not the least bit impressed overall, especially considering how much extra it cost. There are delusional people that think G-Sync works perfectly with every game, which is a load of crap. Some games do have issues with it, and in fact regular vsync was better in some cases than G-Sync. Hell, there were several games that have come out in the last year where G-Sync flat-out did not work properly, but so many oblivious people don't even notice it. The people that say it works perfectly in all games are the same type of people that will tell you SLI works perfectly and that every game supports ultra-wide monitors with no tweaking.
 
I have used two different monitors with G-Sync and was not the least bit impressed overall, especially considering how much extra it cost. There are delusional people that think G-Sync works perfectly with every game, which is a load of crap. Some games do have issues with it, and in fact regular vsync was better in some cases than G-Sync. Hell, there were several games that have come out in the last year where G-Sync flat-out did not work properly, but so many oblivious people don't even notice it. The people that say it works perfectly in all their games are the same type of people that will tell you SLI works perfectly and that every game supports ultra-wide monitors with no tweaking.
How about specifics?

I've used Freesync for a year, and while it always worked with my Fury X, it was iffy with Vega drivers at launch. G-Sync has been without issue for me for the six months I've used it (once I learned to enable fast sync, that is), with the exception of one indie game called Kingdom Rush that blinks when above 1100 FPS.
 
Against my wishes. AMD's GPU division has become a joke.
Why? They make a card as good if not better than what you are running now, or is 50W too much of a {S}oft dealbreaker?
I see these comments all the time about AMD GPUs from people not even running a Ti. If you don't have a Ti or better, or some tiny itx box, you can't say they are not competitive, outside of power.
 
Why? They make a card as good if not better than what you are running now, or is 50W too much of a {S}oft dealbreaker?
I see these comments all the time about AMD GPUs from people not even running a Ti. If you don't have a Ti or better, or some tiny itx box, you can't say they are not competitive, outside of power.
He can say they aren’t competitive without owning them because objectively it’s true. He doesn’t have to own anything in particular to look at the objective evidence for gamers...

1080 and 1080ti >= Vega 64
1070 ti and maybe 1070 >= Vega 56
1060 6gb >= RX580

Generally in street price, majority of performance benchmarks, lesser power usage, and quality/maturity of drivers.

The rest of the product line is generally immaterial for most enthusiasts on these boards.

Freesync is AMD's ace in the hole, and it's been that way since Fury X days. It's still that way. And why Nvidia hasn't integrated Freesync (for free) and just owned the board top to bottom a generation ago, I have no idea. They could implement Freesync and allow its use without dropping G-Sync support for the Nvidia loyalists. The only other bullet for AMD was CrossFire, which IMO was generally superior in scaling to SLI. But that's no longer a thing, since driver MGPU is phasing out as DX12 takes over.

BTW
I hope Freesync and free VRR tech wins and G-Sync fades away. It's best for all.
 
Linux has you covered, fam! Thank Wayland and all the f/oss desktop environments for that. Of course, there are still some issues, but it's working pretty well now (almost a decade, probably more, after they began defining the protocol), with most video drivers supporting it.
Sure. Have the OS video drivers retain state to the monitor and perform mode scaling in DRI, but that's literally the monitor's job. I'm just sort of ranting about how sad it is that Wayland had to step in and provide, in software, a sensible technology which was conceived 35+ years ago as the multisync monitor, while every brand-new monitor looks bad as it struggles to toggle between antiquated backward-compatibility modes (which a monitor _needs_ to support) by literally power-cycling the receiver. The professional DTV world has had this down perfectly for well over a decade now. You can buy one of those used for $50 and it has seamless mode changes. Of course, those don't have Freesync, HDR, or refresh rates over 60, but solve the "hard" (ahem, marketable) problems first. ;)
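On the Linux side, you can at least check whether the kernel thinks a connector supports VRR. A small Python sketch reading the DRM `vrr_capable` sysfs attribute (exposed by recent kernels with a supporting driver; the path layout is the typical `/sys/class/drm/card*-<connector>` one):

```python
from pathlib import Path

def vrr_connectors(drm_root="/sys/class/drm"):
    """Map connector name -> True/False for the kernel's vrr_capable flag."""
    caps = {}
    for conn in Path(drm_root).glob("card*-*"):
        attr = conn / "vrr_capable"
        if attr.is_file():
            caps[conn.name] = attr.read_text().strip() == "1"
    return caps

if __name__ == "__main__":
    found = vrr_connectors()
    if not found:
        print("no connectors expose vrr_capable (older kernel/driver?)")
    for name, ok in sorted(found.items()):
        print(f"{name}: {'VRR capable' if ok else 'no VRR'}")
```

Whether the compositor actually drives the display at a variable rate is a separate question, but this tells you if the plumbing is there.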
 
How about specifics?

I've used Freesync for a year, and while it always worked with my Fury X, it was iffy with Vega drivers at launch. G-Sync has been without issue for me for the six months I've used it (once I learned to enable fast sync, that is), with the exception of one indie game called Kingdom Rush that blinks when above 1100 FPS.
So in other words you're oblivious, just like I said. Hell, there are some games where you can't even turn on G-Sync because it's greyed out, because they already know it doesn't work with the game. They made that clear when G-Sync came out: some games will not be compatible with it, and that is a fact. There's no point in going back and forth, though, because again there will always be people like you that somehow have it in their mind that every game works right. Again, it's no different than the people that talk about how great SLI is. And fast sync is the absolute worst thing you should enable unless you have extremely high frame rates, but then again you're oblivious to regular G-Sync issues...
 
So in other words you're oblivious, just like I said. Hell, there are some games where you can't even turn on G-Sync because it's greyed out, because they already know it doesn't work with the game. They made that clear when G-Sync came out: some games will not be compatible with it, and that is a fact. There's no point in going back and forth, though, because again there will always be people like you that somehow have it in their mind that every game works right. Again, it's no different than the people that talk about how great SLI is. And fast sync is the absolute worst thing you should enable unless you have extremely high frame rates, but then again you're oblivious to regular G-Sync issues...
So, inform me, jackwagon.
 
Placebo goes a long way to making people believe something is working when it's really doing nothing. And people are more likely to experience placebo when they feel like they've invested in it.
 
Placebo goes a long way to making people believe something is working when it's really doing nothing. And people are more likely to experience placebo when they feel like they've invested in it.
Exactly. You can look on some of the Steam forums where it's obvious that G-Sync is not working in a specific game, but then you'll see somebody say, oh, it works just fine for me. And then you can even see in the official support for that game that indeed G-Sync is not working LOL
 
He can say they aren’t competitive without owning them because objectively it’s true. He doesn’t have to own anything in particular to look at the objective evidence for gamers...

1080 and 1080ti >= Vega 64
1070 ti and maybe 1070 >= Vega 56
1060 6gb >= RX580

Generally in street price, majority of performance benchmarks, lesser power usage, and quality/maturity of drivers.

The rest of the product line is generally immaterial for most enthusiasts on these boards.

Freesync is AMD's ace in the hole, and it's been that way since Fury X days. It's still that way. And why Nvidia hasn't integrated Freesync (for free) and just owned the board top to bottom a generation ago, I have no idea. They could implement Freesync and allow its use without dropping G-Sync support for the Nvidia loyalists. The only other bullet for AMD was CrossFire, which IMO was generally superior in scaling to SLI. But that's no longer a thing, since driver MGPU is phasing out as DX12 takes over.

BTW
I hope Freesync and free VRR tech wins and G-Sync fades away. It's best for all.


1060 < RX580 on DX12
1060 > RX580 on DX11

Tom's even recommends an RX580 over a 1060. I recommend based on what is more future-proof and what you are going to play the majority of your games on. Two years ago I got a 1060 FE for my first nephew, and an RX580 for my second nephew.

And if you break down the TCO

1070 Ti + G-Sync < Vega 64 + Freesync in DX11... the latter is a little cheaper
1080 gsync < Vega 64 + Freesync DX12

I would be all for NVIDIA adopting Freesync. But cutting into their own profit margins? Not a chance. NVIDIA would rather go down in flames than admit open architecture is better for the community.

If price is no object, get the $2000 gsync monitor and the $900 1080ti. But even six digit incomes don't have infinitely deep pockets.
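To make the TCO comparison above concrete, here's a throwaway Python snippet; the 1440p monitor prices are hypothetical stand-ins (only the $900 1080 Ti and $2000 G-Sync monitor figures come from the post itself):

```python
# GPU + monitor bundle costs; monitor prices below are assumed,
# not quoted -- swap in real street prices before drawing conclusions.
builds = {
    "1070 Ti + G-Sync 1440p":   450 + 600,   # assumed prices
    "Vega 64 + FreeSync 1440p": 570 + 350,   # assumed prices
    "1080 Ti + G-Sync 4K":      900 + 2000,  # figures from the post
}

# Print cheapest bundle first.
for name, total in sorted(builds.items(), key=lambda kv: kv[1]):
    print(f"${total:>5}  {name}")
```

The point stands regardless of exact numbers: the cheaper FreeSync monitor offsets a lot of GPU price difference once you total the whole setup.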
 
He can say they aren’t competitive without owning them because objectively it’s true. He doesn’t have to own anything in particular to look at the objective evidence for gamers...

1080 and 1080ti >= Vega 64
1070 ti and maybe 1070 >= Vega 56
1060 6gb >= RX580

Generally in street price, majority of performance benchmarks, lesser power usage, and quality/maturity of drivers.

The rest of the product line is generally immaterial for most enthusiasts on these boards.

Freesync is AMD's ace in the hole, and it's been that way since Fury X days. It's still that way. And why Nvidia hasn't integrated Freesync (for free) and just owned the board top to bottom a generation ago, I have no idea. They could implement Freesync and allow its use without dropping G-Sync support for the Nvidia loyalists. The only other bullet for AMD was CrossFire, which IMO was generally superior in scaling to SLI. But that's no longer a thing, since driver MGPU is phasing out as DX12 takes over.

BTW
I hope Freesync and free VRR tech wins and G-Sync fades away. It's best for all.

You just listed why they are competitive.
Take modern drivers, run the cards, and they are competitive (within a few frames of each other, typically). There are more than a few on here who have jumped to AMD and preferred it over Nvidia in recent times. Prices are almost back to MSRP; Vega 64s are in stock at $570 on the egg, and V56 is $479, identical to 1070 Ti best pricing.
So again, we are down to 50 watts max on the higher-end stuff.

The 1080 Ti and up (three cards, up to $3k) are the only cards they are not competitive with performance-wise.

Agreed, Freesync is also a huge part of why I am going AMD for my next upgrade round. Nvidia will learn the hard way and then finally change this (like allowing 10-bit outside of DX windows for paxwell) as the mainstream becomes more aware of the emerging Freesync TV options. AMD just needs more horsepower to drive 4K properly, as does Nvidia to a lesser extent.
 
Exactly. You can look on some of the Steam forums where it's obvious that G-Sync is not working in a specific game, but then you'll see somebody say, oh, it works just fine for me. And then you can even see in the official support for that game that indeed G-Sync is not working LOL
So you have no significant examples off the tip of your tongue? In another thread I recall you said you've tried two G-Sync monitors and dismissed the technology --- yet you can't list a game or two that doesn't work with G-Sync? Even I did that, when I mentioned Kingdom Rush above. So what gives, man?

I've used both Freesync and G-Sync, and when they are working correctly within their respective FPS ranges, they are indistinguishably smooth IMO. When they aren't on, I can tell nearly immediately, and it annoys me. I can't tell a significant difference between 75Hz (HP Omen 32" Freesync) and 144Hz (Acer 35" Freesync), but I can certainly tell when Freesync or G-Sync is on or off. When it's on, I NEVER feel any need to look at the FPS counter until it dips into the low-40 range for Nvidia, or 48 FPS for AMD. It feels the same buttery smooth regardless of FPS. And since I know what I like and have tried going back and forth several times with 4 different premium monitors in the last couple of years, I'll just disagree with you here in that you think the tech is placebo.

Freesync and G-Sync are absolutely worthwhile tech. There's a lot of bunk data out there where people can't figure anything out and dismiss it because they don't know what they are doing. For example, I had two different long-time gamer friends that immediately bought Freesync monitors after demoing Freesync on my setup. One sold his Nvidia 1070 card a year ago to buy a Fury X at the time --- because his monitor already had Freesync. He dismissed the idea of VRR completely until he demoed Freesync on my Omen setup; now he won't do without. I'm the same. It's been the single biggest positive game-changer for overall gaming experience in a decade.
 
I pulled the trigger on an Asus 144Hz IPS 27in 1440p with Freesync and an RX480 when that came out. I actually paid $225 for an open-box Nitro pre-crypto-rush. Anyway, the Freesync range on my panel is 30-90Hz... This sounds fine and good, but I had already been spoiled by a smooth, fast 100+ FPS gaming experience while just setting 144Hz in Windows. Sure, I was dialing in game settings to get there on a midrange card, but 20 years of PC gaming, so I'm kind of used to doing that. I enjoy the high refresh rate even in the desktop environment. It locks the monitor to 90Hz when Freesync is enabled, and frankly it feels like crap by comparison.

I tried a Vega 56 on this as well, and there was tearing and all sorts of nonsense when Freesync was on. Likely the early driver woes that have been mentioned.
Back on a 1060, and I sold my AMD cards to miners. Set 144Hz, no VRR, and it's fine for me.


TL;DR
I do feel that VRR has merit and that Freesync being an open platform is definitely the way forward. My experience with my specific monitor was underwhelming, however.
 
Jeez. These guys are just blathering on about useless crap. Even at 1.5x speed it is intolerable. When do the good bits come along?

This is why a well-written article will always trump a video.

always.
 
My opinion: Freesync is the real deal, and so is G-Sync. Radeon cards aren't really competitive above the 1060 level, and even then it's hit and miss. I sell many systems with reasonably priced 570s and 1080p Freesync monitors for much less than an equivalently great GeForce setup.

but at the high end, Nvidia rules the roost, and I WISH it were more competitive there.
 
Only annoying thing is my 4K Blu-Ray player throws up a big notice each time I begin to play a 4K disc that the whole thing would be so much better if only I had an HDR capable TV.

Let me guess - Sony BR player?
 
Turns out Samsung has a 1440p 27" 144 Hz FreeSync 2 VA HDR monitor with LFC for <$500. Finally! Now get me a 7nm Vega that runs up to 1800MHz. It's the only one with these specs <$500.
 
Turns out Samsung has a 1440p 27" 144 Hz FreeSync 2 VA HDR monitor with LFC for <$500. Finally! Now get me a 7nm Vega that runs up to 1800MHz. It's the only one with these specs <$500.

Just bought a 27" VA 1080p (*gasp* yup, 100 constant FPS > anything else) Acer 144Hz Freesync panel (still rocking team green though) for $215.00 all said and done. VA gaming panels have come a long way from the ones I have used over the past five years (QNIX, etc.). The pixel response is still noticeably slower than my TN 144 panel, but the contrast and blacks make it feel like I was looking at black and white before.
 