The problem is that it’s being used as an excuse not to optimize, when it should be used to prolong the lifespan of computers, mostly older gaming rigs. If developers focused on optimizing instead of rushing things out the door, a GTX 1080 Ti could probably handle AAA games at 1440p, high settings, at 60 FPS minimum, and 140+ FPS with DLSS on Quality. Keep in mind that I don’t blame most developers, but rather the big corporations that have partnerships with companies like Nvidia, which obviously want people constantly buying new GPUs.
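For context on why Quality-mode upscaling buys so much headroom: the GPU shades far fewer pixels than it outputs. A quick sketch of the arithmetic, using Nvidia’s published per-axis scale factors for the DLSS 2.x presets (exact internal resolutions can vary slightly by game):

```python
# Internal render resolution for DLSS-style upscaling presets.
# Per-axis scale factors match Nvidia's published DLSS 2.x modes;
# the FPS implication is only a rough rule of thumb, since shading
# cost is not perfectly linear in pixel count.

MODES = {
    "Quality": 0.667,
    "Balanced": 0.580,
    "Performance": 0.500,
    "Ultra Performance": 0.333,
}

def internal_resolution(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    """Resolution the GPU actually shades before the upscaler runs."""
    return round(out_w * scale), round(out_h * scale)

if __name__ == "__main__":
    out_w, out_h = 2560, 1440  # 1440p output
    for mode, scale in MODES.items():
        w, h = internal_resolution(out_w, out_h, scale)
        fraction = (w * h) / (out_w * out_h)
        print(f"{mode:>17}: {w}x{h} (~{fraction:.0%} of output pixels)")
```

At 1440p Quality, the GPU shades roughly 44% of the output pixels, which is where a jump from 60 to 140+ FPS would have to come from in a GPU-bound game.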
Yeah, I really like DLSS/FSR/etc. for letting newer games run on older systems. But I don’t feel like it should ever be necessary for modern hardware to run a game well.
Ray tracing in general is a big culprit here; it carries such a high performance hit. That was fine back when ray tracing was optional, but we’re increasingly seeing games with mandatory ray tracing. Indiana Jones and the upcoming Doom: The Dark Ages requiring it for lighting is a mistake imo; consumer hardware in general just isn’t ready for it to be the default.
Ray tracing is useless (unless it’s for animated movies or films that use CGI). Regular rasterized lighting performs far better and still looks maybe 80% as good as ray tracing. I use a really bad laptop, yet I can still get 30 to 60 FPS in decently optimized games.
Agreed, the industry has lots of tricks for authentic-looking lighting and reflections that run at a fraction of the performance cost. One day we’ll be at a point where hardware ray tracing makes sense, but I don’t think we’re there yet.
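To make “tricks at a fraction of the performance cost” concrete, here is a minimal sketch of one classic technique, baked lighting: the expensive lighting computation runs once offline, and the per-frame cost drops to a table lookup. The function names and the one-dimensional “scene” are hypothetical scaffolding for illustration, not any engine’s actual API:

```python
import math

def expensive_light_calc(x: float) -> float:
    """Stand-in for an offline computation (e.g., tracing many rays per sample)."""
    return max(0.0, math.sin(x) * 0.8 + 0.2)

# Bake step, done once at build time: sample the expensive function
# into a lightmap (here a 64-entry table over a 1-D "scene").
LIGHTMAP = [expensive_light_calc(i / 64 * math.tau) for i in range(64)]

def shade(x: float) -> float:
    """Runtime shading: one array lookup per sample instead of a ray trace."""
    return LIGHTMAP[int(x / math.tau * 64) % 64]

print(shade(1.0))  # cheap at runtime; all the hard work happened in the bake
```

The trade-off is the classic one: baked lighting is effectively free per frame but can’t react to moving lights or geometry, which is exactly the gap ray tracing is meant to close.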
GTX cards don’t have the hardware to do DLSS though, so unfortunately this is impossible.
Nonetheless, I think it would be possible to modify these cards to include an upscaling chip. But that would take some effort, and no company is ever going to do it.
I was gonna say, my 1660 Super is still able to do that in most modern games without DLSS (or FSR). In fact, most of the time turning on the AI upscaling makes things run worse and I don’t even understand that. But when two games release in the same month and one runs great maxed out while the other putters along at 30-40 FPS on low settings with the upscaling off, despite both being on the same engine, that tells me one of them is using DLSS/FSR as a crutch.
> most of the time turning on the AI upscaling makes things run worse and I don’t even understand that
My understanding is that DLSS/FSR usually convert GPU load into a lesser CPU load. But if you’re already bottlenecked by your CPU, using the upscalers will hurt your performance instead.
I’ve got a Ryzen 5 3600, so it shouldn’t be bottlenecked. But also, the games where it makes things worse run absolutely perfect without it, and the ones that work better with it on run like ass without it, so I’d been assuming it was messing things up because I don’t really even need it. 🤷🏻‍♂️
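A toy frame-time model of the bottleneck explanation above, taking the earlier comment’s claim at face value (that upscaling shifts some per-frame cost onto the CPU as well as the GPU). All the millisecond figures are invented for illustration, and real pipelines overlap CPU and GPU work rather than taking a simple max:

```python
# Toy model: the slower of CPU and GPU sets the frame time.
# Upscaling cuts GPU shading cost roughly with pixel count
# (0.667 ** 2 for a Quality-style preset) but adds a small
# fixed overhead per frame on each side (an assumption here,
# mirroring the comment above, not a measured figure).

def fps(cpu_ms: float, gpu_ms: float, upscale: bool) -> float:
    """Frames per second under a simple bottleneck model."""
    if upscale:
        gpu_ms = gpu_ms * 0.667 ** 2 + 0.5  # shade ~44% of pixels, plus upscaler cost
        cpu_ms = cpu_ms + 0.5               # assumed CPU-side overhead
    return 1000.0 / max(cpu_ms, gpu_ms)

# GPU-bound rig (weak GPU, fast CPU): upscaling is a big win.
print(f"GPU-bound: {fps(cpu_ms=8, gpu_ms=20, upscale=False):.0f} -> "
      f"{fps(cpu_ms=8, gpu_ms=20, upscale=True):.0f} FPS")

# CPU-bound rig (fast GPU, maxed CPU): the GPU savings are wasted
# and the overhead makes things slightly worse.
print(f"CPU-bound: {fps(cpu_ms=20, gpu_ms=8, upscale=False):.0f} -> "
      f"{fps(cpu_ms=20, gpu_ms=8, upscale=True):.0f} FPS")
```

Under these made-up numbers the GPU-bound case roughly doubles its frame rate while the CPU-bound case dips slightly, which matches the pattern described above: the games that benefit are the ones that were GPU-limited to begin with.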