So this is a new method that simulates a CRT and genuinely reduces motion blur on any type of higher-framerate display, starting at 120 Hz. But it doesn't dim the image like black frame insertion, which is the only current method that comes close to the clarity of a CRT. But it also simulates other aspects of CRT displays, right?
Can you use this method just to reduce blur without reducing brightness, on any game? They mention reducing blur for many things other than retro games in "Possible Use Cases of Refresh Cycle Shaders" but does reducing blur in a flight simulator also make it visually look like a CRT with phosphors?
They do mention that it does reduce brightness. The selling point compared to strobing seems to be less eyestrain. I'd expect it to lose more brightness than strobing, considering the lower relative pixel on-time.
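For what it's worth, here's the rough duty-cycle arithmetic behind that expectation (my own assumption that perceived brightness scales with the fraction of time the pixels are lit, not figures from the article):

    # Toy model: average brightness is roughly peak brightness times the
    # fraction of each refresh the pixels spend lit (the duty cycle).
    def relative_brightness(on_time_fraction: float, peak: float = 1.0) -> float:
        return peak * on_time_fraction

    print(relative_brightness(0.50))  # ~50% duty strobe/BFI -> about half of peak
    print(relative_brightness(0.25))  # shorter CRT-style persistence -> about a quarter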
I do not understand at all what this is talking about or why. Is it some elaborate joke?
Don't visual effects people go to lots of effort to add motion blur? Why would you want to remove it?
Why are they trying to simulate old CRT displays?
Can someone explain what this is about?
This is about improving motion clarity, so each displayed frame of moving content looks crisp rather than having blur (something that monitors can struggle with even at high refresh rates / high Hz).
Most good monitor reviews of high Hz displays (eg: 120Hz+) take fast photographs of moving objects (typically from Blur Busters' 'Test UFO' web page) to demonstrate how good or poorly a monitor handles fast moving content.
One technique of significantly improving motion clarity is inserting frames of pure black in the display output (aka BFI, black frame insertion). A downside is some are sensitive to this where it causes eyestrain.
This CRT beam simulating shader is said to be similarly effective to BFI at improving motion clarity but with the benefit of reducing eyestrain. However from what I understand the current version is limited to simulating a lower Hz display and requires a higher Hz monitor.
All this is distinct from the kind of in-media motion blur that can be enabled in games or seen in recorded video. It's instead about the monitor not being able to render fast moving content clearly enough which leads to non-crisp output frames.
Thank you, that's a really great explanation.
What is the method used on newer TVs that attempts to double the framerate / interpolate frames / make everything shot on film look like an overlit soap opera? I find it impossible to watch; it destroys the lighting and the performances. My recollection of CRT TVs was that they had a lot of blur, both motion and spatial, and that was kind of what made them feel warmer and more analog / less overly crispy.
That's typically called 'motion smoothing' and yeah that's trying to interpolate frames to manipulate lower framerate video (like 24FPS) into higher framerates in an attempt to make scenes like panning shots 'smoother' at the expense of a soap opera feel and interpolation artifacting.
Whereas what Blur Busters (and similar enthusiasts) are focused on is how accurately frames are (perceptibly) displayed on the screen, so ideally each input frame is perfectly presented without any interference from prior frames (due to limits of panels in keeping up with changing the pixels from one frame to another, very rapidly, causing blurring).
The ultimate goal, in a perfect scenario, is for input from say a video game running at 60 frames per second to have each frame perfectly rendered like individual screenshot stills, one after the other. In reality for most monitors displaying such content there's not enough distinct separation between frames, leading fast changing pixel content (like objects moving) to blend into each other, causing blurring at the monitor level.
The BFI technique, by inserting alternating black frames in the output, mitigates the inter-frame blending issues since instead of the prior frame being various colors (ie: of the prior input frame) it's starting from pure black which dramatically reduces frame blending artifacts and perceptibly makes the motion clarity more distinct.
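To make the BFI idea concrete, here's a minimal sketch of the output pacing, assuming 60 FPS content on a 120 Hz display (the function and variable names here are hypothetical placeholders, not any real display or driver API):

    import numpy as np

    REFRESH_HZ = 120
    CONTENT_FPS = 60
    REFRESHES_PER_FRAME = REFRESH_HZ // CONTENT_FPS  # 2 refreshes per content frame

    def bfi_output_sequence(content_frames):
        """Yield one image per display refresh: the content frame, then pure black."""
        for frame in content_frames:
            black = np.zeros_like(frame)
            yield frame                              # refresh 1: the actual content
            for _ in range(REFRESHES_PER_FRAME - 1):
                yield black                          # remaining refreshes: pure black

At 120 Hz this gives a 50% duty cycle; higher refresh rates allow more black refreshes per content frame and hence shorter persistence.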
It’s not that the frames blend in the screen. Screens are perfectly capable of switching the pixels fast enough. It’s rather that each frame is displayed for too long.
In a CRT the “pixels” start to fade immediately leaving the full screen mostly dark as the beam sweeps over the screen. It never shows a full frame.
One could say that modern screens are more like slide shows, while BFI tries to make them more like stroboscopes.
The blurring effect is more pronounced at low refresh rates; it's just that BFI requires at least 120 Hz to make sense at all.
Yes, the relevant blur here is in your retina, as it tracks a moving screen object, called "sample and hold" blur. 60 fps is not enough when the pixel persists for the full frame duration -- the pixels smear across your retina.
CRTs don't darken that fast; one way to observe this is that CRTs don't appear black in photos/video with shutter times << 1/60 s.
They do darken that fast (not so fast that you can't catch it with a high-speed camera, but much faster than a frame). Most of the apparent persistence in the CRT comes from the retina/camera exposure, not the phosphor. A CRT has a sharp peak of light that quickly falls off, but the peak is bright enough that, even though it is brief, when averaged out over an exposure in the camera it still appears bright enough to form an image.
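Here's a rough toy model of that (the decay constant is my own made-up number, not measured phosphor data), just to show how a brief bright flash can still dominate a 1/60 s camera exposure:

    import numpy as np

    peak = 1000.0      # arbitrary peak luminance units at beam impact
    tau = 0.0005       # assumed decay time constant: 0.5 ms (hypothetical)
    exposure = 1 / 60  # camera exposure time in seconds

    t = np.linspace(0, exposure, 100_000)
    luminance = peak * np.exp(-t / tau)  # simple exponential fall-off
    average_over_exposure = np.trapz(luminance, t) / exposure

    print(f"luminance 2 ms after the flash: {peak * np.exp(-0.002 / tau):.1f}")
    print(f"average over the whole exposure: {average_over_exposure:.1f}")

The instantaneous level is nearly gone after a couple of milliseconds, yet the exposure average stays well above zero because the brief peak dominates the integral, which is why the photo still shows an image.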
They frequently flicker or are only showing part of an image in video footage or photos, for exactly this reason. They're a right headache to film clearly.
(see this youtube video showing one in slow-motion to get an idea: https://www.youtube.com/watch?v=3BJU2drrtCM)
Thanks for the link. It seems I'd concluded this a bit wrong from seeing those half-lit frames (like at 1:35 of this YT video).
You've never taken photos of a CRT, have you? Even at like 400 ISO equivalent, only about a third of the screen is illuminated.
>at the expense of a soap opera feel
The "soap opera feel" is precisely the goal of motion interpolation on 24 fps source. It reminds people of soap operas because they were often broadcast 60i instead of 24p. The weird part is that many people somehow prefer the terrible 24 fps to higher film frame rates.
I think you're right that it's partly a subconscious association with what we're used to seeing at higher frame rates (TV and video games).
But it's also that a DP / cinematographer on a movie is crafting shots in ways that knowingly make use of a 24 fps framerate. There are consciously chosen effects that are in the shot, particularly directional motion blur that acts as a visual cue (like in action sequences or with hand-held cameras), which gets destroyed when the frame rate is increased without adding additional blur in the right places in post. Rather than a smoothly increasing/decreasing blur that creates a sort of ease-in-out as the camera or subject changes speed, you end up with jagged, rapid shifts in direction and speed which make well-crafted motion sequences feel either jarring or as if they're not really moving. I suspect that if a director were shooting originally at 60 fps they would have probably made the necessary adjustments in post production to achieve the effects they wanted, which they initially got by tuning their shots to 24 fps. But when it's done automatically by some software in a TV set, all of that subtlety is lost.
It's sort of like if you took an oil painting and say the colors look more lifelike in digital reproduction: That may be true, but it wasn't the artist's intent. The artist understood they were working with a particular palette and worked within its limitations to achieve their desired effects.
My contention is that it's not the higher frame rate which bothers people, per se, but that all the motion blur (slight as well as heavy) in a well-shot 24 fps movie is intentional, and therefore the problem is that removing it detracts from the intended effect of the shot. If you chose to replicate the original blur across 60 fps, rather than interpolate the sharpest possible interstitial frames, people might not have the same negative reaction.
First thing I turn off in every single game is motion blur. It’s only useful in racing sims to have more sense of speed but that’s also a personal taste.
Motion blur made a bit more sense on the 30fps Xbox 360 and PS3 games.
Why exactly do you think motion blur is added?
Our eyes are constantly and mostly unconsciously tracking moving objects in our field of view in order to keep them still relative to our eyes. It's called Smooth pursuit: https://en.wikipedia.org/wiki/Smooth_pursuit
This is because our retina has a very low "refresh rate", which means things can easily blur together. Smooth pursuit prevents that. However, modern sample-and-hold displays like LCD and OLED work against Smooth pursuit. If you watch anything moving on a screen (including "still" objects moving on screen due to camera movement), your eye will automatically track those objects if they are momentarily the focus of attention, which should make them be still relative to your eyes and thus appear sharp.
However, since the tracked object is being still relative to your eyes and the individual frames on screen are being still relative to your screen, the frames move (are not being still) relative to your eyes. Which means they appear blurry during smooth pursuit, when in reality they should be perfectly sharp.
For example, your eyes track a sign that moves on the screen due to camera movement. Say it moves 10 pixels per frame horizontally. This means you will see a 10 pixel wide horizontal blur on this sign, which could make it unreadable. In reality (with a real sign rather than a screen) the sign would appear perfectly sharp.
On CRT screens this doesn't happen (to the same extent) because the frame is not displayed for the entire frame time (e.g. 1/60th of a second) but much shorter. The CRT just very quickly flashes the frames and is dark in between. Strobing/flickering, basically. So if the tracked object moves 10 pixels per frame, the frame might only be (say) visible for 1/5th of that frame time, which means it moves only 2 pixels while the frame is actually on screen. So you get only 2 pixels of blur, which is much less.
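A back-of-the-envelope version of that arithmetic, assuming the eye tracks at a constant speed and that the perceived smear is roughly tracking speed times the time the frame stays lit:

    def tracked_blur_px(speed_px_per_frame: float, duty_cycle: float) -> float:
        """Approximate smear width in pixels for an eye-tracked object.

        speed_px_per_frame: how far the object moves between frames.
        duty_cycle: fraction of the frame time the image is actually lit
                    (1.0 for sample-and-hold, much less for a CRT/strobe).
        """
        return speed_px_per_frame * duty_cycle

    print(tracked_blur_px(10, 1.0))  # sample-and-hold LCD/OLED: ~10 px smear
    print(tracked_blur_px(10, 0.2))  # CRT-like 1/5 duty cycle:  ~2 px smear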
Of course at 60 FPS you might instead get some degree of perceptible flicker (computer CRTs therefore often ran higher than 60) and in general the overall achievable screen brightness will be darker, since the screen is black most of each frame time. CRTs had a low maximum brightness. But they had very little of the "persistence blur" which plagues sample-and-hold screens like OLED and LCD.
The motion blur intentionally introduced by video games is there to make moving objects appear smoother that are not tracked by our eyes. In that case motion blur is natural (since smooth pursuit doesn't try to remove it). So some forms of motion blur are undesirable and others are desirable.
The optimal solution would be to run games (and videos content in general) at an extremely high frame rate (like 1000 FPS) which would introduce natural perceptible motion blur where it naturally occurs and remove it where it doesn't naturally occur (during smooth pursuit). But obviously that would be computationally an extremely inefficient way to render games.
By the way, if you have a screen with 120+ Hz you can test the above via this black frame insertion demo, which emulates how CRTs work:
https://testufo.com/blackframes
On my 120 Hz OLED screen, the 40 FPS (1 frame + 2 black frames) UFO looks as clear as the native 120 Hz UFO. A real 60 or even 80 Hz CRT screen would be even better in terms of motion clarity. Perhaps better than a 240 or even 480 Hz OLED.
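Applying the same persistence arithmetic to that comparison (assuming the lit time per frame is what sets the perceived blur):

    def persistence_ms(refresh_hz: float, lit_refreshes: int = 1) -> float:
        """Time the image stays lit per content frame, in milliseconds."""
        return lit_refreshes / refresh_hz * 1000

    print(persistence_ms(120))  # 40 FPS BFI, 1 of 3 refreshes lit: ~8.3 ms
    print(persistence_ms(120))  # native 120 Hz sample-and-hold:    ~8.3 ms
    print(persistence_ms(240))  # 240 Hz sample-and-hold:           ~4.2 ms
    print(persistence_ms(480))  # 480 Hz sample-and-hold:           ~2.1 ms

Both the 40 FPS BFI case and native 120 Hz keep the image lit for about 8.3 ms per frame, which matches them looking equally clear, and a CRT phosphor with, say, 1 ms of effective persistence would indeed come in under even the 480 Hz sample-and-hold number.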
Yeah, they are two different effects. There's motion blur on individual objects that you want (as human eyes see/have it), then there is full-screen motion blur that is due to the display technology (LCD, OLED, etc.) that you don't want (as human eyes don't see/have it). CRTs don't have this motion blur as the screen is blank most of the time - see the Slo Mo Guys on YouTube for CRT displays.
Because I hate it.
We need display manufacturers to provide a refresh cycle that is agnostic of the incoming signal's Hz sent down the cable AND to either provide shader support (ideally) at the display's Hz OR to implement this shader.
There really is no need for an expensive RetroTink if we had this. Some manufacturer must be able to do it and the rest would follow.
With about half of the screen black, could it also boost FPS by not spending GPU time on pixels in those areas, if integrated deep into the engine?
I don't think this is how it works.
The technique is for when you have X fps content and Y fps capable monitor, where Y > X. In games, you'll still render at your old FPS cap, but this shader is for relatively cheaply generating extra frames that will make the content look smoother / sharper.
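Here's a much-simplified sketch of that idea, not the article's actual shader: for each input frame, emit several output refreshes in which a lit horizontal band rolls down the image while everything else decays toward black, loosely imitating a CRT beam pass.

    import numpy as np

    def rolling_scan_subframes(frame: np.ndarray, subframes: int, decay: float = 0.3):
        """Split one input frame into `subframes` output refreshes.

        frame: HxWx3 float image in [0, 1].
        subframes: output refreshes per input frame (display Hz / content FPS).
        decay: brightness multiplier applied to rows outside the current band.
        """
        h = frame.shape[0]
        band = h // subframes
        for i in range(subframes):
            out = frame * decay  # dimmed "afterglow" everywhere
            out[i * band:(i + 1) * band] = frame[i * band:(i + 1) * band]  # full-brightness band
            yield out

    # e.g. 60 FPS content on a 240 Hz panel -> 4 sub-frames per input frame:
    # for sub in rolling_scan_subframes(some_frame, subframes=4): present(sub)

(`some_frame` and `present` are placeholders.) The article's shader is far more sophisticated, but this is the general rolling-scan shape of the technique: the display runs at its full refresh rate while each content frame is only lit briefly in any given region.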
You could definitely do this, but a lot of modern rendering techniques rely on having full copies of previous frames lying around, like TXAA, screen-space reflections, etc.
The images are briefly persisted and averaged in the back of the viewer's eye.
Is the problem this shader really gets around just display inversion?
From Gemini: "Display inversion is the process of alternating the voltage between positive and negative for each pixel on an LCD screen to prevent damage. This process is called polarity inversion."
If display manufacturers knocked that on the head for certain scenarios then surely we could just have a simple block of horizontal screen scrolling down the display at high refresh rates?
Phosphor fall-off, as far as can be seen in the Slo Mo Guys footage, is quite a small effect, not on the scale of this shader.
Past discussion: https://news.ycombinator.com/item?id=42506211
Does this mean that the original duck hunt gun might work again?
There’s a really interesting discussion of precisely this in the comments under the article! Recommended. Might have to dig to see it.
I don’t see any comments under the article. Maybe have to be logged in?
It’s on the article before this one. The tldr is that no, this doesn’t reduce latency so there’s no chance of making the original light guns work without modifying either them or the game.
I'd buy a new TV or monitor for this feature alone.
Cycle refresh shaders were something my last team really nailed. The key challenge was that during the day there is a lot of sun. Adjustments can be made to the location; it really pays dividends.
If this is proven to work for 99% of people and high-refresh displays become cheap, GPUs could optimize by rendering just a sliver of the screen at a time.