
FidelityFX and NVidia Sharpen on a CRT (WOW!)

AMD recently added support for FidelityFX on their Polaris GPUs. The RX 580 is still my recommendation for 1080p, and with some quality settings adjustments it will do 1440p pretty well. Since the RX 580 has been reviewed and benchmarked a million times already by many reputable sources in many scenarios and games, I thought I’d take a look at it from a different perspective: by testing it with two monitors that you wouldn’t typically use with this sort of GPU, a 32 inch 4K monitor and a 19 inch CRT!

Test System

  • GPU: Sapphire Nitro+ 8GB RX 580
  • GPU2: Zotac 1080Ti
  • CPU: Ryzen 3900X
  • RAM: 16GB 3600 CL16 DDR4
  • MB: MSI X570 Godlike
  • Game Drive: Gigabyte M.2 2TB GEN4

Starting with 4K, the monitor I’m using is the LG 32UD60-B, an affordable 4K VA monitor that is not exactly meant for gaming but games decently (it supports FreeSync and has low input lag, but only does 60Hz).

To no one’s surprise, running games at 4K natively saw the RX 580 struggle to get near 60 FPS. In Borderlands 3, with High Settings but shadows down to medium and Volumetric Fog turned off, the game runs at 49 FPS average, with a 1% low of 23.7 FPS. As you can see from the footage below, this is a passable experience, and probably better than what you would get on a console, both in terms of framerate and visual quality:

I would argue that 60 FPS is the bare minimum for a shooter on the PC though, and dropping the resolution down to 1440p should get us there. Along with the resolution change I also turned the graphical settings down to medium, which brought the average up to 60.9 FPS. If you’ve ever run a game at 1440p on a 4K monitor you’ll know that the image doesn’t scale all that well and looks blurry. To fix this, AMD now offers FidelityFX CAS (sharpening) support on their Polaris GPUs, including the RX 580 (we’ll look at NVidia’s alternative option in a second).

As the images above show, 1440p Upscaled + FidelityFX does offer an improvement in sharpness compared to regular 1440p, but it’s still far from the crispiness that native 4K offers. At 32 inches and above the difference is indeed very perceptible. Now that doesn’t mean that the game isn’t a good experience with 1440p + FidelityFX – it is – but don’t expect miracles. The idea here is having the option to do 4K on a budget. You can still run native 4K in less demanding games (CIV VI is amazing on a 32 inch 4K monitor and runs absolutely fine with the RX 580!), and then drop down to 1440p without sacrificing much in terms of visual quality or framerate.
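If you’re wondering what “contrast-adaptive” actually means here, the rough idea (and this is just my own simplified sketch in Python, not AMD’s actual CAS shader) is that the amount of sharpening applied to each pixel is scaled down wherever the local contrast is already high, which is why CAS avoids the ugly halos you get from a plain unsharp mask:

```python
# Simplified illustration of contrast-adaptive sharpening (NOT AMD's actual
# CAS shader): each pixel gets a Laplacian-style sharpen, but the amount is
# scaled down where the local contrast is already high.
import numpy as np

def adaptive_sharpen(img, strength=0.5):
    """img: float32 array in [0, 1], shape (H, W). strength: 0..1 knob (my own parameter)."""
    # Pad the edges so every pixel has four neighbours
    p = np.pad(img, 1, mode="edge")
    n = p[:-2, 1:-1]   # north neighbour
    s = p[2:, 1:-1]    # south
    w = p[1:-1, :-2]   # west
    e = p[1:-1, 2:]    # east
    c = img            # centre pixel

    # Local contrast estimate: range of the cross-shaped neighbourhood
    lo = np.minimum.reduce([n, s, w, e, c])
    hi = np.maximum.reduce([n, s, w, e, c])
    contrast = hi - lo

    # Adapt the weight per pixel: strong in flat areas, weak on hard edges
    weight = strength * (1.0 - contrast)

    # Standard sharpening term, modulated by the adaptive weight
    laplacian = 4.0 * c - (n + s + w + e)
    return np.clip(c + weight * laplacian, 0.0, 1.0)
```

The real thing runs as a lightweight compute shader on the finished frame, so the per-frame cost is negligible next to the cost of rendering the frame itself.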

Out of this world boring

The Outer Worlds

Next up is The Outer Worlds, a game I had high hopes for given Obsidian’s track record, with excellent titles like Neverwinter Nights, Fallout 1 & 2, Fallout: New Vegas, Tyranny, and of course Pillars of Eternity. The Outer Worlds is, unfortunately, very mediocre and mostly boring. The writing is perhaps its saving grace (delivered with excellent voice acting), but the core elements of the game are too simplistic to be engaging. The gunplay is basic, the dialog trees feel mostly inconsequential, the character progression is disconnected from any impact on the world (other than opening up some dialog options), and the game worlds – albeit pretty – feel very empty and soulless. Anyway, the RX 580 performs decently well in this game if you are willing to make a few compromises.

At native 4K the RX 580 managed a horrendous 21.7 FPS average with Ultra Settings (shadows on medium).

The Radeon Settings menu: you’ll find the Scaling and Sharpening options under “Display”

Dropping the resolution down to 1440p and turning on GPU Scaling and Image Sharpening in the Radeon Settings menu saw decent gains in performance, to 40 FPS average, again with minimal loss in visual quality compared to native 4K. Here’s a comparison between 1440p, 1440p Upscaled + Sharpening, and native 4K:

1440p Upscaled + Sharpening looks pretty good.

So if you need a large 4K display (for image or video editing, for example) but also want to do some gaming now and then, the RX 580 delivers very decent 1440p upscaled performance considering it only costs $175 (there are even cheaper models).

RX 580 + CRT = WIN

One of the problems with CRT monitors is a certain lack of sharpness – incidentally, some people see this as a benefit, confusing the smoother image with “better quality” than LCDs. This is obviously not the case. The image is smoother because it loses a bit of detail compared to a good quality LCD panel; the pixels blur into each other, so to speak. There are many benefits to CRTs though, like the high refresh rates, the fact that CRTs don’t use a fixed pixel grid (which means they aren’t tied to a specific resolution, but again, this leads to a loss of detail), and most importantly the fact that motion resolution is maintained when you are moving the cursor around (aka moving the camera around in a game world – aka LOOKING AROUND in a first-person shooter). Motion resolution is probably the best reason to use a CRT today. In competitive shooters this means that you will see the same image resolution when you are moving the cursor as when the image is static, something you don’t get with LCDs (although super high refresh rate panels can address this drawback somewhat).
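To put a rough number on why motion resolution matters so much: on a sample-and-hold LCD, anything your eye is tracking smears for the entire time the frame stays on screen, so the blur width is roughly the panning speed multiplied by the frame time. A quick back-of-the-envelope estimate (my own illustrative numbers, nothing more):

```python
# Rough estimate of eye-tracking motion blur on a sample-and-hold display.
# Blur width ~ panning speed (pixels/second) * time the frame stays lit.
speed = 1920            # a fast pan: one screen width per second at 1080p
for hz in (60, 144, 240):
    frame_time = 1 / hz
    blur_px = speed * frame_time
    print(f"{hz:>3} Hz sample-and-hold: ~{blur_px:.0f} px of smear")

# A CRT only lights each phosphor for a millisecond or two per refresh,
# so the same pan smears over just a couple of pixels.
persistence = 0.001     # ~1 ms phosphor decay (ballpark figure)
print(f"CRT (~1 ms persistence): ~{speed * persistence:.0f} px of smear")
```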

For this test I’m using a good quality CRT, the Sony Trinitron E430. I ran it at 1280 x 1024 at 90Hz.
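A practical note if you want to replicate this: a CRT’s real limit is its maximum horizontal scan frequency rather than any particular resolution, so before dialing in a custom mode it’s worth sanity-checking the numbers against the figure listed in the monitor’s manual. Here’s the rough arithmetic, assuming a typical ~5% of extra lines for vertical blanking:

```python
# Sanity-check a custom CRT mode: the monitor's limit is usually quoted as a
# maximum horizontal scan frequency (kHz), not a maximum resolution.
def horizontal_khz(v_active, refresh_hz, blanking=0.05):
    """Approximate horizontal scan rate, assuming ~5% vertical blanking."""
    v_total = v_active * (1 + blanking)
    return v_total * refresh_hz / 1000

# The mode used in this article: 1280 x 1024 at 90Hz
print(f"1280x1024 @ 90 Hz needs roughly {horizontal_khz(1024, 90):.0f} kHz")
# -> ~97 kHz; compare that against the maximum listed in your monitor's manual
```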

VGA madness

When I first tried to use the RX 580 with this CRT I kept getting system crashes. Since Polaris GPUs don’t have an analog output I had to use an active converter to connect it to the VGA input on the monitor. After much fucking about I realized that I needed a powered active converter, and all went smoothly after that.

Having a powered active converter is key. Plug the other end into a USB port on the motherboard.

FidelityFX adds another variable to the intense and controversial debate between CRT purists and sensible people. The fact that FidelityFX adds sharpening in a non-destructive way means that one of the CRT’s Achilles heels gets mitigated to some extent.

The effect this has on image sharpness is extremely hard to show on camera, of course, but I tried to capture it nevertheless:

Radeon FidelityFX Sharpening keeping CRTs alive.

Like I said, it’s not easy to tell the difference in the video, but in person it is noticeable enough to improve the experience. In darker scenes the lack of detail is still a problem, especially if you plan on playing competitive shooters (one of the main reasons to own a CRT). I really wish FidelityFX had a slider to control sharpening intensity. I mentioned this when the RX 5700s came out, but AMD still hasn’t added one to Radeon Settings. But… NVidia have!

What about NVidia?

Even though NVidia’s sharpening implementation isn’t as good (it has a significant performance overhead compared to AMD’s), it does have a slider to increase the sharpening effect, so let’s see if that helps with the CRT’s lack of detail.
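I have no idea how NVidia maps the 0-100 slider internally, but conceptually an intensity control like this is just a blend between the untouched frame and a fully sharpened copy, something along these lines (purely illustrative):

```python
# Conceptual illustration of a 0-100 sharpening slider (not NVidia's actual
# implementation): blend between the original frame and a sharpened copy.
import numpy as np

def apply_slider(original, sharpened, slider):
    """original/sharpened: float arrays in [0, 1]; slider: 0..100."""
    t = np.clip(slider / 100.0, 0.0, 1.0)
    return (1.0 - t) * original + t * sharpened

# slider=0 returns the untouched image, slider=100 the full sharpen effect,
# and everything in between is a linear mix of the two.
```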

Left: Sharpen OFF – Right: Sharpen ON

WOW! Once you turn the sharpening slider all the way to max, the difference on the CRT is not only staggering, it actually looks amazing! The side-by-side above doesn’t do it justice.

In person the difference is incredible; it feels like this filter was made for CRTs. The lack of detail in the original content is almost completely mitigated by the sharpening filter. Seriously, you’ll have to see it to believe it.

I tried capturing the slider change on video, but it’s impossible to get even close to what it looks like in real life. I tried anyway: in the video below I am moving the slider from 0 to 100 and back repeatedly, so try to spot the difference (warning: the video contains flickering!):

I move the sharpening slider from 0 to 100 and back to 0. Repeat.

I had suspected AMD’s FidelityFX would help the CRT argument but I didn’t expect the impact to be this big when you have control over how much sharpening is added.

Now obviously the effect is way too extreme if seen on a regular LCD panel, as you can see from the raw screenshots below. The blades of grass look particularly destroyed by the extreme sharpening (left image below).

Too much sharpening for a regular LCD screen

Unfortunately, not every game is supported by NVidia’s Freestyle feature, which is where you’ll find this sharpening effect. The Outer Worlds, for example, doesn’t have it available. I also tried a particularly difficult game to play on a CRT monitor, Battlefield 1. The enemy players blend into the environment and the added sharpness helped a lot here, but in darker areas the high contrast on the CRT left me at a disadvantage. Increasing the brightness in-game helped a bit, but I wouldn’t use the CRT as my display for this particular game (or any competitive game where players can hide in very dark areas).

Conclusion

The RX 580 is still a really good buy if you can get it for around $150-$180. It will play games at 1440p upscaled to 4K with very acceptable framerates. But the real surprise was how well sharpening filters combine with a good quality CRT monitor. I’ve decided to build a gaming rig specifically for the CRT (Valfaris looks gorgeous on it). The fact that it runs at lower resolutions means I won’t need high-end parts, so it should be a fun and affordable project. Be sure to subscribe to the notifications on the sidebar so you don’t miss future articles.

Disclaimer: This article may contain Amazon affiliate links; if you buy a product using these links I will get a small percentage of the sale. (It’s pennies, but it helps.)


