Sounds like it’s time to switch out the 1080ti for a 9070xt. Been almost 10 years, probably due for an upgrade.
I will miss having that CUDA compatibility on hand for matlab tinkering. I wonder if any translation layers are working yet?
The last time I updated my driver, BG3 didn’t start anymore. So I really could not care less about driver updates for my 8-year-old card.
But still, fuck nvidia.
According to the Steam HW survey around 6% of users are still using Pascal (10xx) GPUs. That’s about 8.4 million GPUs losing proprietary driver support. What a waste.
GPU: share (%)
1060: 1.86
1050 Ti: 1.43
1070: 0.78
1050: 0.67
1080: 0.5
1080 Ti: 0.38
1070 Ti: 0.24

Doubly evil given that GPU prices are still ridiculous.
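For the curious, those shares roughly check out against the headline figure. A quick sanity check in Python, assuming an install base of roughly 140 million monthly Steam users (my assumption, not a number from the survey itself):

```python
# Steam HW survey shares for Pascal cards (percent), from the list above
pascal_shares = {
    "1060": 1.86, "1050 Ti": 1.43, "1070": 0.78, "1050": 0.67,
    "1080": 0.5, "1080 Ti": 0.38, "1070 Ti": 0.24,
}

total_pct = sum(pascal_shares.values())   # ~5.86%
steam_users = 140_000_000                 # assumed install base
affected = steam_users * total_pct / 100  # GPUs losing support

print(f"{total_pct:.2f}% of users -> ~{affected / 1e6:.1f} million GPUs")
```

That lands at roughly 8.2 million, in the same ballpark as the 8.4 million quoted above (the exact figure depends on the install-base estimate).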
Interesting, I’m about to move one more machine to Linux (the one that’s been off for a while) and I’ve got exactly 10xx GPU inside lol.
Nvidia was awful before the LLM craze, now they’re awful AND evil.
Fuck, what do I do when they inevitably discontinue support for 20xx? Just cry and accept that I no longer have a computer, as every component costs as much as a house? D:
Start watching the second hand market. Most of my PC components are bought second hand, and at much cheaper than buying any of those components new.
None of these components are of course bleeding edge, but still sufficient to play any game I want.
I bought an AMD Radeon RX 5700 XT this summer for 1000 DKK (~€133 or ~$157).
I wasted days of my life getting nVidia to work on Linux. Too much stress. Screw that. Better ways to spend time. If I can’t game, that’s OK too.
I switched from a 3080 to a 7900 XT. It’s one of the better decisions I’ve made, even though on paper the performance isn’t that far apart.
I’m told AMD works better with Linux, but I haven’t tried it myself.
AMD is plug and play on Linux. With my 7800XT there isn’t a driver to install. Only issue is that AMD doesn’t make anything that competes with the 5080/5090.
AMD is and has been much more friendly towards linux than nvidia. I run mine in proxmox, passing through to linux and windows gaming VMs. AMD has invested in open source drivers.
https://thetechylife.com/does-amd-support-linux/
https://arstechnica.com/information-technology/2012/06/linus-torvalds-says-f-k-you-to-nvidia/

Open source drivers are a major plus, I’ve had a much easier time than my partner on NVIDIA. I mean I make both machines work but the NVIDIA has been a real pig at times.
I’ve had so many problems with Nvidia GPUs on Linux over the years that I now refuse to buy anything Nvidia. AMD cards work flawlessly and get very long-term support.
Same. Refuse to use NVIDIA going forward for anything.
I’m with you, I know we’ve had a lot of recent Linux converts, but I don’t get why so many who’ve used Linux for years still buy Nvidia.
Like yeah, there’s going to be some cool stuff, but it’s going to be clunky and temporary.
Even now, CUDA is the gold standard for data science / ML / AI research and development. AMD is slowly bringing their ROCm platform around, and Vulkan is gaining steam in that area. I’d love to ditch my nvidia cards and go exclusively AMD, but nvidia supporting CUDA on consumer cards was a seriously smart move that AMD needs to catch up with.
Sorry for prying for details, but why exactly do you need NVIDIA?
When people switch to Linux they don’t do a lot of research beforehand. I, for one, didn’t know that Nvidia doesn’t work well with it until I had been using it for years.
To be fair, Nvidia supports their newer GPUs well enough, so you may not have any problems for a while. But once they decide to end support for a product line, it’s basically a death sentence for that hardware. That’s what happened to me recently with the 470 driver. Older GPU worked fine until a kernel update broke the driver. There’s nobody fixing it anymore, and they won’t open-source even obsolete drivers.
I JUST ran into this issue myself. I’m running Proxmox on an old laptop and wanted to use its 750M… which is one of those legacy cards now, so I guess I’d need to downgrade the kernel to use it?
I’m not knowledgeable enough to know the risks or work I’d be looking at to get it working so for now, it’s on hiatus.
You might be able to use the Nouveau driver with the 750M. Performance won’t be great, but might be sufficient if it’s just for server admin.
It’s a good way for people to learn about fully hostile companies to the linux ecosystem.
Similar for me. For all the talk about what software Linux couldn’t handle, I didn’t learn that Linux is incompatible with Nvidia until AFTER I updated my GPU. I don’t want to buy another GPU after less than a year, but Windows makes me want to do a sudoku in protest… but also my work and design software won’t run properly on Linux, and all anybody can talk about is browsers and games.
I’m damned whether I switch or not.
Linux hates Nvidia
got that backwards
My point is they don’t work together. I can believe Nvidia ‘started’ it, but it doesn’t matter or help me solve my problem. I’ve decided I want to try Linux but I can’t afford another card, so I’m doing what I can.
Linus openly hated Nvidia, but I suspect Nvidia started it
If you only suspect then you never heard the entire quote and only know the memes.
You somehow still learned it wrong, and I don’t understand how that happened. Nvidia not working well with Linux is so widely known and talked about that I knew about it, and the actual reason (which is the reverse of what you think), for several years before switching. I feel like you must have never tried to look anything up, never spent any time somewhere like lemmy or any Linux-focused forum, and basically decided to keep yourself in a bubble of ignorance with no connection to learn anything.
This is an uncharitable interpretation of what I said.
Nvidia doesn’t tell me it doesn’t work. Linux users do. When I first used Linux for coding all those years ago, my GPU wasn’t relevant, nobody mentioned it during my code bootcamp or computer science certification several years ago, and ubuntu and Kubuntu both booted fine.
When I upgraded my GPU, I got Nvidia. It was available and I knew what to expect. Simple as.
Then as W10 (and W11) got increasingly intolerable, I came to Linux communities to learn about using Linux as a Windows replacement, looking into distros like Mint and Garuda, and behold: I came across users saying Linux has compatibility issues with Nvidia. Perhaps because it is ‘so well known’, most don’t think to mention it; I learned about it from a random comment on a meme about gaming.
I also looked into tutorials on getting Affinity design software to work on various distros, and the best I could find was shit like: I finally got it to run, so long as I don’t [do these three basic functions].
I don’t care who started it, I can already believe it’s the for-profit company sucking up to genAI. But right now that doesn’t help me. I care that it’s true and that’s the card I have, and I’m still searching for distros that will let me switch and meets work needs and not just browsing or games.
I’m here now, aware that they don’t work, still looking for the best solution I can afford, because I did look up Linux.
People buy Nvidia for different reasons, but not everyone faces any issues with it in Linux, and so they see no reason to change what they’re already familiar with.
I just replaced my old 1060 with a Radeon RX 6600 myself.
Sadly GPU passthrough only worked on Nvidia cards when I was setting up my server, so I had to get one of them :(
Yeah, I stopped using Nvidia like 20 years ago. I think my last Nvidia card may have been a GeForce MX, then I switched to a Matrox card for a time before landing on ATI/AMD.
Back then AMD was only just starting their open source driver efforts so the “good” driver was still proprietary, but I stuck with them to support their efforts with my wallet. I’m glad I did because it’s been well over a decade since I had any GPU issues, and I no longer stress about whether the hardware I buy is going to work or not (so long as the Kernel is up to date).
I had an old NVidia gtx 970 on my previous machine when I switched to Linux and it was the source of 95% of my problems.
It died earlier this year so I finally upgraded to a new machine and put an Intel Arc B580 in it as a stop gap in hopes that video cards prices would regain some sanity eventually in a year or two. No problems whatsoever with it since then.
Now that AI is about to ruin the GPU market again I decided to bite the bullet and get myself an AMD RX 9070 XT before the prices go through the roof. I ain’t touching NVidia’s cards with a 10 foot pole. I might be able to sell my B580 for the same price I originally bought it for in a few months.
That’s fine and dandy until you need to do ML; then there is no other option.
I successfully ran local Llama with llama.cpp and an old AMD GPU. I’m not sure why you think there’s no other option.
AMD had approximately one consumer GPU with ROCm support, so unless your framework supports OpenCL, or you want to fuck around with unsupported ROCm drivers, you’re out of luck. They’ve completely failed to meet the market.
Llama.cpp now supports Vulkan, so it doesn’t matter what card you’re using.
Devs need actual support, tensor and other frameworks
I mean… my 6700 XT doesn’t have official ROCm support, but the ROCm driver works perfectly fine on it. The difference is AMD hasn’t put the effort into testing ROCm on their consumer cards, thus can’t claim support for it.
ML?
Machine learning
Is that a huge deal for the average user?
Here is old man me trying to figure out what PASCAL code there is in the linux codebase, and how NVIDIA gets to drop it.
Man that was my first thought and I didn’t even use it 😂
Anything that starts with a 10 according to the very first sentence in the article.
Same- Pascal was the first coding language I learned in high school. I was confused here.
He tried to warn y’all…

I can’t believe they would do this to poor Borland. I guess I’ll just need to use an AMD GPU for my Turbo Pascal fun.
Those are the GPUs they were selling — and a whole lot of people were buying — until about five years ago. Not something you’d expect to suddenly be unsupported. I guess Nvidia must be going broke or something, they can’t even afford to maintain their driver software any more.
I don’t get what needs support, exactly. Maybe I’m not yet fully awake, which tends to make me stupid. But the graphics card doesn’t change. The driver translates OS commands to GPU commands, so if the target is not moving, changes can only be forced by changes to the OS, which puts the responsibility on the Kernel devs. What am I missing?
The driver needs to interface with the OS kernel which does change, so the driver needs updates. The old Nvidia driver is not open source or free software, so nobody other than Nvidia themselves can practically or legally do it. Nvidia could of course change that if they don’t want to do even the bare minimum of maintenance.
The driver needs to interface with the OS kernel which does change, so the driver needs updates.
That’s a false implication. The OS just needs to keep the interface to the kernel stable, just like it has to with every other piece of hardware or software. You don’t just double the current you send over USB and expect cable manufacturers to adapt. As the consumer of the API (which the driver is from the kernel’s point of view) you deal with what you get and don’t make demands to the API provider.
Device drivers are not like other software in at least one important way: They have access to and depend on kernel internals which are not visible to applications, and they need to be rebuilt when those change. Something as huge and complicated as a GPU driver depends on quite a lot of them. The kernel does not provide a stable binary interface for drivers so they will frequently need to be recompiled to work with new versions of linux, and then less frequently the source code also needs modification as things are changed, added to, and improved.
This is not unique to Linux, it’s pretty normal. But it is a deliberate choice that its developers made, and people generally seem to think it was a good one.
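A toy illustration of the rebuild problem, using Python’s ctypes as a stand-in for compiled code (the struct names here are entirely hypothetical, nothing to do with real kernel code): a binary driver built against one version of an internal struct bakes in fixed field offsets, and adding a field silently moves everything after it, which is exactly the kind of breakage a recompile fixes.

```python
import ctypes

# Hypothetical "kernel-internal" struct, version N
class TaskV1(ctypes.Structure):
    _fields_ = [("pid", ctypes.c_int), ("state", ctypes.c_long)]

# Version N+1 inserts a field before `state`
class TaskV2(ctypes.Structure):
    _fields_ = [("pid", ctypes.c_int), ("flags", ctypes.c_long),
                ("state", ctypes.c_long)]

# A binary built against version N reads `state` at this fixed offset...
print("state offset in V1:", TaskV1.state.offset)
# ...but after the change the field actually lives further in:
print("state offset in V2:", TaskV2.state.offset)
```

Since Linux deliberately offers no stable in-kernel ABI, an out-of-tree binary blob like Nvidia’s has to be re-matched against every kernel release, which is why support ends the moment the vendor stops doing that work.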
They have access to and depend on kernel internals
That sounds like a stupid idea to me. But what do I know? I live in the ivory tower of application development where APIs are well-defined and stable.
Thanks for explaining.
You’re re-opening the microkernel vs monolithic kernel debate with that. For fun, you can read how Andrew S. Tanenbaum and Linus Torvalds debated the question in 1992 here: https://groups.google.com/g/comp.os.minix/c/wlhw16QWltI
I don’t generally disagree, but
You don’t just double the current you send over USB and expect cable manufacturers to adapt
That’s pretty much how we got to the point where USB is the universal charging standard: by progressively pushing the allowed current from the initially standardized 100 mA all the way to 5 A of today. A few of those pushes were just manufacturers winging it and pushing/pulling significantly more current than what was standardized, assuming the other side will adapt.
The default standard power limit is still the same as it ever was on each USB version. There’s negotiation that needs to happen to tell the device how much power is allowed, and if you go over, I think over current protection is part of the USB spec for safety reasons. There’s a bunch of different protocols, but USB always starts at 5V, and 0.1A for USB 2.0, and devices need to negotiate for more. (0.15A I think for USB 3.0 which has more conductors)
As an example, USB 2.0 can signal a charging port (5V / 1.5A max) by putting a 200 ohm resistor across the data pins.
The default standard power limit is still the same as it ever was on each USB version
Nah, the default power limit started with 100 mA or 500 mA for “high power devices”. There are very few devices out there today that limit the current to that amount.
It all began with non-spec host ports which just pushed however much current the circuitry could muster, rather than just the required 500 mA. Some had a proprietary way to signal how much they were willing to push (this is why iPhones used to be very fussy about the charger you plugged them into), but most cheap ones didn’t. Then all the device manufacturers started pulling as much current as the host would provide, rather than limiting themselves to 500 mA. USB-BC was mostly an attempt to standardize some of the existing usage, and USB-PD came much later.
A USB host providing more current than the device supports isn’t an issue though. A USB device simply won’t draw more than it needs. There’s no danger of dumping 5A into your 20 year old mouse because it defaults to a low power 100mA device. Even if the port can supply 10A / 5V or something silly, the current is limited by the voltage and load (the mouse).
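That last point can be sketched numerically: with a fixed 5 V bus, the current is set by the load (Ohm’s law, I = V / R), so the port’s capability is only an upper bound, not something forced into the device. A toy calculation, with a made-up load resistance for the old mouse:

```python
VOLTAGE = 5.0        # USB bus voltage, volts
PORT_LIMIT_A = 10.0  # hypothetical beefy port: amps it *could* supply

# An old low-power mouse drawing ~100 mA at 5 V looks like a ~50 ohm load
mouse_resistance = 50.0

drawn = VOLTAGE / mouse_resistance  # Ohm's law: I = V / R
actual = min(drawn, PORT_LIMIT_A)   # the port limit is only a ceiling

print(f"mouse draws {actual * 1000:.0f} mA regardless of the port's headroom")
```

The mouse draws its 100 mA whether the port can supply 0.5 A or 10 A; danger only arises the other way around, when a device tries to pull more than the port can safely source.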
The hardware doesn’t change, but the OS does.
Using 10-year-old hardware with 10-year-old drivers on a 10-year-old OS requires no further work.
Pascal is coming up on 10 years old. You can’t expect companies to support things forever.
If they’re going to release things under a proprietary license and send lawyers after individuals just trying to get their hardware to work, then yes, yes I can.
Don’t want to support it anymore? Fine. Open source it and let the community take over.
They started 9 years ago, but they remained popular into 2020, and according to Wikipedia the last new Pascal model was released in 2022. The 1080 and the 1060 are both still pretty high up on the Steam list of the most common GPUs.
What model came out in 2022? The newest I could find was the GT 1010 from 2021 (which is more of a video adapter than an actual graphics card) but that’s the exception. The bulk of them came out in 2016 and 2017 https://www.techpowerup.com/gpu-specs/?f=architecture_Pascal
Hate to break it to ya, but 2020 was 5 years ago. More than half of these GPUs’ lifespan ago. Nvidia is a for-profit company, not your friend. You can’t expect them to support every single product they’ve ever released forever. And they’re still doing better than AMD in that regard.
You can’t expect them to support every single product they’ve ever released forever. And they’re still doing better than AMD in that regard.
If nvidia had open-sourced the pre-GSP cards’ drivers, at least there would be a chance of maintaining support. But nvidia pulled the plug.
Intel’s and AMD’s drivers in the Mesa project will continue to receive support.
For example, just this week: Phoronix: Linux 6.19’s Significant ~30% Performance Boost For Old AMD Radeon GPUs. These are GCN1 GPUs from 13 years ago.
thanks to work by Valve
AMD did nothing to make their drivers better, Valve did.
AMD did nothing to make their drivers better, Valve did.
That’s literally the point of open source though, both AMD and Intel rely on open source drivers, so anybody can fix a flaw they encounter without having to rely on the company to “consider allocating resources towards a fix for legacy hardware”
Making them open to contributions was the first step, but ok I won’t engage in this petty tribalism.
The topic was about nvidia’s closed source drivers.
Valve couldn’t do the same for Pascal GPUs. Nobody but nvidia has the reclocking firmware, so even the reverse-engineered nouveau/NVK drivers are stuck at boot clock speeds.
That’s why I don’t like closed-source proprietary software. They decide when to stop support.
Just make 580 legacy. Problem solved. :) (It’s already LTS though)
By the way, I have GTX 1080 and this seems to be end of the way. What is the similar AMD I can get, preferably second hand?
Maybe the RX 7600 XT.
It’s in the second to latest generation (7000 not 9000), should be slightly faster than a GTX 1080, and doubles the VRAM capacity to 16 GB so you wouldn’t be in danger of running into limitations with that too soon.
Sounds nice, but let me rephrase: what is the similar AMD card that I can get by selling the GTX 1080 without adding too much? I’m not looking for an upgrade anytime soon.
I’ll switch to Debian if I must but I don’t want to do that either unless I have to.
Hard to say what the used market is like, but the cheapest cards that would be broadly similar in performance would probably be the Arc A580, RX 5700 or RX 6600. This page has some rankings that are reasonable for comparison: https://www.tomshardware.com/reviews/gpu-hierarchy,4388-2.html
But…surely there’s a way to just stick with the latest supported driver, right? Or is Arch truly an “upgrade or die” distro?
Thanks for the comparison. The RX 6600 seems the most suitable option, since the second hand prices for the 5700 and 6600 are quite close. Selling the GTX 1080 can probably compensate for most of it.
Well, currently I’ll have to switch to the AUR driver and stick with it for a while until I find a good candidate. I would expect them to at least make it a legacy driver so we don’t have to jump through hoops, though. It’s not usually like this, but this is a big change (thanks Nvidia), and the Arch team made a big change too, so it would seem a usual update would break my system.
No idea, since I have no way of knowing what the second hand market around you looks like. I just looked for a similar (in performance) card, not of the newest gen, so there would hopefully be some used models around.
I see, thanks for checking though. It seems I can sell the GTX 1080 for around $100 here, but the RX 7600 XT is $320 second hand. The second hand market doesn’t seem bright here. Even the RX 6600 is around $150. That might work for me, but I guess I’ll need to check around a lot.
“Brodie” mentioned. To be fair, on the Arch side, they are clear that the system could break with an update and that you should always read the Arch news in case manual intervention is needed. You can’t fault Arch Linux for users not following the instructions. This is pretty much what Arch stands for.
And IMO if anything this is Nvidia’s doing, arch is just being arch, like it sucks but I also don’t see a problem with arch in this instance.
Brodie
Thinking Forth was a great book! I’m surprised it came up here though.
Gnawh Penisburg. I’m still sporting a laptop with a 1070 from about a decade ago…
My son was going to switch to Linux this week. He has a GTX 1060.
He just needs to stay on the 580 driver. Bazzite is handling that transparently and won’t update you to the 590 driver if you have an unsupported GPU.
Then next time round, buy an AMD or Intel GPU. They tend to treat their customers better.
Nice, maybe bazzite it is then.
Nouveau might be good enough by now for most games that will run on a 1060, maybe worth a try.
AFAIK they still don’t support reclocking on anything older than Turing, meaning the GPU is stuck at the lowest clock frequency and therefore runs very slowly.
Am I the only one that can’t manage to make their Nvidia GTX 1060 run correctly on Linux? It has way worse performance than on Windows, even with the proprietary drivers.
I’ve tried both Kubuntu and Linux Mint.
I guess he can’t say he uses arch btw
It’s not that bad as I understand it. If you are using arch with a Pascal GPU you can switch to the legacy proprietary branch. https://archlinux.org/news/nvidia-590-driver-drops-pascal-support-main-packages-switch-to-open-kernel-modules/
He wants to use Mint. This is what is called planned obsolescence. I say what Linus Torvalds says.
I have a 1050 Ti running the 580 driver under Linux Mint; it works fine.
Might be able to use Mint Debian Edition.



















