Temporal anti-aliasing: a blessing or a curse?
(www.eurogamer.net)
Anti-aliasing is a byproduct of moving away from CRT display technology. The natural image softening of CRT tech isn't replicated on LCD and LED displays.
TAA is one of the better options, but at the end of the day it's difficult to create a true AA solution that doesn't have artifacts without resorting to supersampling.
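For anyone curious why TAA trades artifacts for cheapness, here's a rough sketch of the core idea: instead of taking extra samples per pixel like supersampling, you blend each new frame into an accumulated history buffer. The function name, the 0.1 blend weight, and the 3x3 clamp window are all illustrative choices, not any engine's actual implementation; real TAA also reprojects the history with per-pixel motion vectors, which is omitted here.

```python
import numpy as np

def taa_accumulate(current, history, alpha=0.1):
    """Blend the current frame into an accumulated history buffer.

    current, history: float arrays of shape (H, W, 3), values in [0, 1].
    alpha: how much of the new frame to take each step; smaller values
    give smoother edges but more ghosting.
    """
    # Clamp the history toward the current pixel's 3x3 neighborhood min/max.
    # This is the usual trick for rejecting stale history, and it's also
    # where the familiar TAA artifacts (smearing, flicker) come from.
    pad = np.pad(current, ((1, 1), (1, 1), (0, 0)), mode="edge")
    neighborhoods = np.stack([
        pad[dy:dy + current.shape[0], dx:dx + current.shape[1]]
        for dy in range(3) for dx in range(3)
    ])
    lo, hi = neighborhoods.min(axis=0), neighborhoods.max(axis=0)
    clamped_history = np.clip(history, lo, hi)

    # Exponential moving average: over many jittered frames this converges
    # toward a supersampled result, which is why TAA is often described as
    # "supersampling spread across time".
    return alpha * current + (1 - alpha) * clamped_history
```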
We used AA on our CRTs back in the day. Of course, we were all running at something like 1024x768, so it was a lot more needed. The higher your resolution, the less you need it.
Yes, that's true. AA was helpful at what I'd call "medium resolutions", the range between 480 and 768 vertical pixels. But CRTs still had a softer image simply as a byproduct of how the technology worked, and they worked even better at lower resolutions like 240p (AFAIK, any signal with less than 480 lines of vertical resolution was automatically progressive scan).

Game developers of the time exploited this, famously using dithering for transparency effects on platforms that didn't fully support them, such as the SEGA Saturn (it supported transparency on 2D sprites, but not on textured polygons the way the PSX did). The softer image smoothed the dithered patterns out, giving the appearance of a bigger available color palette and of special effects the hardware couldn't really do. Flickering sprites every other field was another common technique, relying on phosphor persistence and persistence of vision. This is why games like Streets of Rage look awful on modern displays but display correctly on CRTs.
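To make the dithering trick concrete, here's a minimal sketch of checkerboard "screen-door" transparency, the technique this comment describes. The function name and array shapes are illustrative, not how any console actually did it in hardware:

```python
import numpy as np

def dither_blend(background, overlay):
    """Fake 50% transparency by drawing the overlay as a checkerboard.

    background, overlay: arrays of shape (H, W, 3).
    On a sharp modern display this reads as a visible mesh pattern;
    on a CRT, the soft beam spot blurs adjacent pixels together so the
    overlay appears translucent.
    """
    out = background.copy()
    ys, xs = np.indices(background.shape[:2])
    mask = (ys + xs) % 2 == 0   # checkerboard: every other pixel
    out[mask] = overlay[mask]   # overlay covers half the pixels
    return out
```

The every-other-field flicker trick works on the same principle, just in time instead of space: show the sprite on one field, hide it on the next, and the eye averages the two.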
But regardless, AA will probably be phased out eventually; it's just a tool to mitigate the growing pains of new display technology.
Interesting take. Do you think natural image softening will come back in newer display technologies?
I'm not that guy, but I don't think so. The trend will likely be that we get to the point where we render and display at such a high resolution that you can't see pixels anymore. We're already getting there with smaller 4k displays, where turning on AA doesn't make an appreciable difference with native 4k rendering.
I agree with this. Outside of some media that may release with special effects designed to mimic the softer image of a CRT, I think display technology will just progress to the point where nothing uses AA at all because the resolution is too high to tell the difference. It's already like that with 4k TVs: you sit far enough away that you usually can't tell the difference between 4k and 1080p.
DLAA comes to mind