Oh, haha I was confused for a second about the Pascal language 🤭
Getting dumped to CLI is just a standard Arch experience in updating anything isn’t it? You asked for it, you got it.
That newer open source driver is still far behind but is progressing. Those graphics cards will have a great new life with modern kernels someday
except for the no reclocking thing, which cripples them
According to the Steam HW survey around 6% of users are still using Pascal (10xx) GPUs. That’s about 8.4 million GPUs losing proprietary driver support. What a waste.
| GPU | % |
|---|---|
| 1060 | 1.86 |
| 1050 Ti | 1.43 |
| 1070 | 0.78 |
| 1050 | 0.67 |
| 1080 | 0.5 |
| 1080 Ti | 0.38 |
| 1070 Ti | 0.24 |

Fixed: 1050 was noted as 1050 Ti
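For anyone wondering how you get from those percentages to 8.4 million, here's a rough back-of-the-envelope in Python. The ~140 million Steam users figure is my own assumption (Valve doesn't publish an exact number), so treat the result as ballpark only:

```python
# Rough check of the "8.4 million Pascal GPUs" figure.
pascal_share = {            # per-model share from the Steam HW survey, in %
    "GTX 1060":    1.86,
    "GTX 1050 Ti": 1.43,
    "GTX 1070":    0.78,
    "GTX 1050":    0.67,
    "GTX 1080":    0.50,
    "GTX 1080 Ti": 0.38,
    "GTX 1070 Ti": 0.24,
}

total_share = sum(pascal_share.values())    # ~5.86 %
assumed_steam_users = 140_000_000           # assumption, not a survey number

estimate = assumed_steam_users * total_share / 100
print(f"Pascal share: {total_share:.2f} %")
print(f"Estimated Pascal GPUs on Steam: {estimate:,.0f}")   # ~8.2 million
```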
Doubly evil given that GPU prices are still ridiculous.
8.4 million GPUs losing proprietary driver support.
Are they all on Linux though?
Are they supported longer on the windows driver?
Windows doesn’t force update your driver and remove support though, and even if it did it won’t drop you to some CLI, it will still work.
Rolling distros also only update when you tell them. It is the user who is pulling the trigger on the footgun in both cases.
I’d say the main difference is that arch users are more trigger-happy about being up to date.
Also, I think pacman should at least warn you if the problem is enough to warrant a post on the arch website.
I think pacman should at least warn you if the problem is enough to warrant a post on the arch website.
It will, if you install the informant package from the AUR or the chaotic-aur unofficial repo.
Otherwise you can follow the advice in the wiki System maintenance page, which says to read the home page, or news RSS feed, or arch-announce mailing list before upgrading.
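If you'd rather script that check than install anything, a rough Python sketch like this (standard library only) prints the latest headlines from the Arch news feed before you upgrade; the feed URL is the one linked from the Arch home page, so double-check it if they ever move it:

```python
#!/usr/bin/env python3
"""Print recent Arch news headlines before running an upgrade (rough sketch)."""
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://archlinux.org/feeds/news/"  # news RSS feed linked from archlinux.org

with urllib.request.urlopen(FEED_URL, timeout=10) as resp:
    feed = ET.parse(resp)

# Plain RSS 2.0: channel -> item -> title / pubDate
for item in feed.getroot().iter("item"):
    title = item.findtext("title", default="(no title)")
    date = item.findtext("pubDate", default="")
    print(f"{date}  {title}")
```

That's roughly what informant automates: it hooks into pacman and refuses to upgrade until you've read any unread news items.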
Sorta, but you run one command to update everything at once, and even though the system knows what GPU you have, it still seems to update the driver to one that's not compatible instead of holding that update back.
Also, if it doesn't warn the user when updating, the user has no idea they're pulling any trigger, especially when Linux falls back to a CLI after this instead of just falling back to a basic driver.
What you described is what happened with Arch. The transition shouldn't have happened this way, IMO.
Other distros usually don’t send their users to TTY after an update if they can help it.
In the long term, the situation is the same on Linux and Windows: you choose the latest driver and live with that given feature set and its bugs.
Apparently? Title only mentions dropping the support on Linux. 🤷♂️
You don't have to update your drivers though, isn't this normal with older hardware?
You don't have to update your drivers though.
Not sure if you’re on Windows or Linux but, on Linux, we have to actively take explicit actions not to upgrade something when we are upgrading the rest of our system. It takes more or less significant effort to prevent upgrading a specific package, especially when it comes in a sneaky way like this that is hard to judge by the version number alone.
On Windows you’d be in a situation like “oh, I forgot to update the drivers for three years, well that was lucky.”
It makes me wonder why the package still auto-updates if it detects you're using the driver that would be removed; surely it could do some checks first?
Would be vastly preferable to it just breaking the system.
It would be a very out-of-scope feature for a Linux package manager to do a GPU hardware check and kernel module use check to compare whether you’re using the installed driver, and then somehow detect in the downloaded, about-to-be-installed binary that this will indeed remove support for your hardware.
It just seems very difficult to begin with, but especially not the responsibility of a general package manager as found on Linux.
On Windows, surely the Nvidia software should perform this detection and prevent the upgrade. That would be its responsibility. But it’s just not how it is done on Linux.
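Agreed it's out of scope, though to be fair the hardware-detection half wouldn't be the hard part; a hypothetical pre-upgrade hook could read the GPU's PCI IDs from sysfs easily enough. The genuinely hard part is the other half: knowing which IDs a not-yet-installed driver build stops supporting. A sketch of the easy half in Python, with the dropped-ID list as a made-up placeholder:

```python
#!/usr/bin/env python3
"""Hypothetical pre-upgrade check: is a soon-to-be-dropped Nvidia GPU installed?

Sketch only; DROPPED_DEVICE_IDS is a placeholder. That list would have to come
from the packager or Nvidia's support matrix, which is exactly the part a
generic package manager can't know.
"""
from pathlib import Path

NVIDIA_VENDOR = "0x10de"
DROPPED_DEVICE_IDS = {"0x1b80", "0x1c03"}   # placeholder values, not authoritative

def display_devices():
    for dev in Path("/sys/bus/pci/devices").iterdir():
        pci_class = (dev / "class").read_text().strip()
        if not pci_class.startswith("0x03"):        # 0x03xxxx = display controller
            continue
        vendor = (dev / "vendor").read_text().strip()
        device = (dev / "device").read_text().strip()
        yield vendor, device

for vendor, device in display_devices():
    if vendor == NVIDIA_VENDOR and device in DROPPED_DEVICE_IDS:
        raise SystemExit(f"GPU {device} loses support in this driver update, aborting.")
print("No affected GPU found.")
```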
It’s not the package itself that “auto updates”. The package manager just updates all the packages that have updates available, that’s it.
But still, the system doesn't really “break”; all you have to do is downgrade the package, then add a rule preventing it from being updated until Nvidia/Arch package maintainers add a new package that has only that legacy driver's latest version, which won't be upgraded again.
I believe the same SW version is packaged. Nvidia said they’d drop support in the 580 release, but they shifted it to 590 now.
The Arch issues are another layer of headache: the maintainers changed the package names, and people broke their systems on update when a non-compatible version was pulled in, replacing the one that still had Pascal support in it.
Not really a problem of Arch, but of the driver release model, then, IMO. You'd have this issue on Windows too if you just upgraded blindly, right? It's Nvidia's fault for not versioning/naming their drivers in a way that indicates support for a set of architectures, instead of just an incrementing number, willy-nilly.
It's 2025, can we not display a warning message in pacman? Or let it switch from nvidia-590 to nvidia-legacy?
I’m not an arch user, I admit, I don’t like footguns.
TIL Arch is a footgun. 🤡 cope. 😉
But yeah, I agree, if the package maintainers were astute there, a warning would've probably been good somehow. Not sure pacman supports pre-install warnings. Maybe? It does support warning about installing a renamed/moved package, but the naming would've had to be really weird for everyone involved for the warning to be clear in that case.
Windows doesn't drop to a CLI and break if the graphics driver is missing. But also, GPU driver updates are not forced on you just by updating the system.
Interesting, I'm about to move one more machine to Linux (the one that's been off for a while) and it's got exactly a 10xx GPU inside lol.
Sounds like it’s time to switch out the 1080ti for a 9070xt. Been almost 10 years, probably due for an upgrade.
I will miss having that CUDA compatibility on hand for matlab tinkering. I wonder if any translation layers are working yet?
https://github.com/vosen/ZLUDA I’ve heard is doing pretty well
Looks cool, thanks for the link. I’ll give it a go.
The last time I updated my driver, BG3 didn't start anymore. So I really could not care less about driver updates for my 8-year-old card.
But still, fuck nvidia.
Nvidia was awful before the LLM craze, now they’re awful AND evil.
I’ve had so many problems with Nvidia GPUs on Linux over the years that I now refuse to buy anything Nvidia. AMD cards work flawlessly and get very long-term support.
I’m with you, I know we’ve had a lot of recent Linux converts, but I don’t get why so many who’ve used Linux for years still buy Nvidia.
Like yeah, there’s going to be some cool stuff, but it’s going to be clunky and temporary.
Even now, CUDA is the gold standard for data science / ML / AI related research and development. AMD is slowly bringing around their ROCm platform, and Vulkan is gaining steam in that area. I'd love to ditch my Nvidia cards and go exclusively AMD, but Nvidia supporting CUDA on consumer cards was a seriously smart move that AMD needs to catch up with.
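To be fair, for day-to-day framework-level work the gap is smaller than it looks: the ROCm builds of PyTorch expose AMD GPUs through the same torch.cuda API, so a sketch like this runs unchanged on either vendor (assuming you installed the matching PyTorch build). The pain is more in the CUDA-only libraries further down the stack.

```python
# Minimal device-selection sketch. With a CUDA build of PyTorch this picks the
# Nvidia GPU; with a ROCm build the very same torch.cuda calls pick the AMD GPU.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
name = torch.cuda.get_device_name(0) if device.type == "cuda" else "CPU"
print("Using:", name)

x = torch.randn(4096, 4096, device=device)
y = x @ x            # matrix multiply on whichever device was selected
print(y.shape)
```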
Sorry for prying for details, but why exactly do you need NVIDIA?
CUDA is an Nvidia technology and they’ve gone out of their way to make it difficult for a competitor to come up with a compatible implementation. With cross-vendor alternatives like OpenCL and compute shaders, they’ve not put resources into achieving performance parity, so if you write something in both CUDA and OpenCL, and run them both on an Nvidia card, the CUDA-based implementation will go way faster. Most projects prioritise the need to go fast above the need to work on hardware from more than one vendor. Fifteen years ago, an OpenCL-based compute application would run faster on an AMD card than a CUDA-based one would run on an Nvidia card, even if the Nvidia card was a chunk faster in gaming, so it’s not that CUDA’s inherently loads faster. That didn’t give AMD a huge advantage in market share as not very much was going on that cared significantly about GPU compute.
Also, Nvidia have put a lot of resources over the last fifteen years into adding CUDA support to other people’s projects, so when things did start springing up that needed GPU compute, a lot of them already worked on Nvidia cards.
When people switch to Linux they don’t do a lot of research beforehand. I, for one, didn’t know that Nvidia doesn’t work well with it until I had been using it for years.
To be fair, Nvidia supports their newer GPUs well enough, so you may not have any problems for a while. But once they decide to end support for a product line, it’s basically a death sentence for that hardware. That’s what happened to me recently with the 470 driver. Older GPU worked fine until a kernel update broke the driver. There’s nobody fixing it anymore, and they won’t open-source even obsolete drivers.
I JUST ran into this issue myself. I'm running Proxmox on an old laptop and wanted to use its 750M… which is one of those legacy cards now, which I guess means I'd need to downgrade the kernel to use it?
I’m not knowledgeable enough to know the risks or work I’d be looking at to get it working so for now, it’s on hiatus.
You might be able to use the Nouveau driver with the 750M. Performance won’t be great, but might be sufficient if it’s just for server admin.
It's a good way for people to learn about companies that are fully hostile to the Linux ecosystem.
Similar for me. For all the talk about what software Linux couldn't handle, I didn't learn that Linux is incompatible with Nvidia until AFTER I updated my GPU. I don't want to buy another GPU after less than a year, but Windows makes me want to do a sudoku in protest… but also my work and design software won't run properly on Linux, and all anybody can talk about is browsers and games.
I’m damned whether I switch or not.
Linux hates Nvidia
got that backwards
Linus openly hated Nvidia, but I suspect Nvidia started it
If you only suspect then you never heard the entire quote and only know the memes.
My point is they don't work together. I can believe Nvidia 'started' it, but it doesn't matter or help me solve my problem. I've decided I want to try Linux but I can't afford another card, so I'm doing what I can.
You somehow still learned it wrong, and I don't understand how any of that happened. Nvidia not working well with Linux is so widely known and talked about; I knew about it, and the actual reason (which is the reverse of what you think), for several years before switching. I feel like you must never have tried to look anything up or spent any time in a place like Lemmy or any forum with a Linux focus, and basically must have decided to keep yourself in some bubble of ignorance with no connection to learn anything.
This is a bad faith interpretation of what I said.
Nvidia doesn't tell me it doesn't work. Linux users do. When I first used Linux for coding all those years ago, my GPU wasn't relevant, nobody mentioned it during my coding bootcamp or computer science certification several years ago, and Ubuntu and Kubuntu both booted fine.
When I upgraded my GPU, I got Nvidia. It was available and I knew what to expect. Simple as.
Then as W10 (and W11) got increasingly intolerable, I came to Linux communities to learn about using Linux as a Windows replacement, looking into distros like Mint and Garuda, and behold: I came across users saying Linux has compatibility issues with Nvidia. Perhaps because it is 'so well known' most don't think to mention it; I learned about it from a random comment on a meme about gaming.
I also looked into tutorials on getting Affinity design software to work, and on which distros, and the best I could find was shit like "I finally got it to run so long as I don't [do these three basic functions]".
I don’t care who started it, I can already believe it’s the for-profit company sucking genAI’s cock. But that’s not relevant to moving forward. I care that it’s true and that’s the card I have, and I’m still searching for distros that will let me switch and actually work with work needs and not just browsing or games.
And I’m here now, aware that they don’t work, still learning how to get around it, because I did look up Linux. You’ve still done absolutely nothing to help address my problem, but you have convinced me this might be the wrong place to learn how to.
Nvidia’s poor Linux support has been a thing for decades.
If it has improved at all, it's only recently. And that only after high-profile Linux developers told Nvidia to get their shit together.
People buy Nvidia for different reasons, but not everyone faces any issues with it in Linux, and so they see no reason to change what they’re already familiar with.
Same. Refuse to use NVIDIA going forward for anything.
I just replaced my old 1060 with a Radeon RX 6600 myself.
Yeah, I stopped using Nvidia like 20 years ago. I think my last Nvidia card may have been a GeForce MX, then I switched to a Matrox card for a time before landing on ATI/AMD.
Back then AMD was only just starting their open source driver efforts so the “good” driver was still proprietary, but I stuck with them to support their efforts with my wallet. I’m glad I did because it’s been well over a decade since I had any GPU issues, and I no longer stress about whether the hardware I buy is going to work or not (so long as the Kernel is up to date).
deleted by creator
I successfully ran local Llama with llama.cpp and an old AMD GPU. I’m not sure why you think there’s no other option.
deleted by creator
Llama.cpp now supports Vulkan, so it doesn’t matter what card you’re using.
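Through the Python bindings it looks roughly like this; the Vulkan build flag is an assumption on my part (check the llama-cpp-python docs for the current spelling), and the model path is just whatever GGUF file you have locally:

```python
# Sketch: run a local GGUF model with llama-cpp-python on a Vulkan-capable GPU.
# Build/install with the Vulkan backend enabled, e.g. (flag name per current docs):
#   CMAKE_ARGS="-DGGML_VULKAN=on" pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(
    model_path="./models/some-model.Q4_K_M.gguf",  # any local GGUF file
    n_gpu_layers=-1,   # offload as many layers as possible to the GPU
    n_ctx=4096,
)

out = llm("Q: Name one upside of open source GPU drivers.\nA:", max_tokens=64)
print(out["choices"][0]["text"])
```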
deleted by creator
I mean… my 6700 XT doesn't have official ROCm support, but the ROCm driver works perfectly fine on it. The difference is AMD hasn't put the effort into testing ROCm on their consumer cards, and thus can't claim support for it.
deleted by creator
ML?
deleted by creator
Is that a huge deal for the average user?
deleted by creator
I had an old Nvidia GTX 970 in my previous machine when I switched to Linux and it was the source of 95% of my problems.
It died earlier this year so I finally upgraded to a new machine and put an Intel Arc B580 in it as a stopgap, in hopes that video card prices would regain some sanity eventually in a year or two. No problems whatsoever with it since then.
Now that AI is about to ruin the GPU market again I decided to bite the bullet and get myself an AMD RX 9070 XT before the prices go through the roof. I ain’t touching NVidia’s cards with a 10 foot pole. I might be able to sell my B580 for the same price I originally bought it for in a few months.
Sadly GPU passthrough only worked on Nvidia cards when I was setting up my server, so I had to get one of them :(
That’s why I’m not on bleeding edge.
Here is old man me trying to figure out what PASCAL code there is in the Linux codebase, and how NVIDIA gets to drop it.
Man that was my first thought and I didn’t even use it 😂
Same- Pascal was the first coding language I learned in high school. I was confused here.
Anything that starts with a 10 according to the very first sentence in the article.
As a proud owner of a 10xx I can tell you I bought it in 2016 and I think it was about a year old at that point.
He tried to warn y’all…

Who's this, and what speech/event was it, do you remember?
That’s Linus Torvalds, the guy who made the Linux kernel. I think this was some interview he did, but I’m not sure.
Fuck, what do I do when they inevitably discontinue support for 20xx? Just cry and accept that I no longer have a computer, as every component costs as much as a house? D:
Surely there's a way to keep the older driver on Linux; it's absurdly easy on Windows.
How, when the Linux kernel looks at you funny if you even mention kernel interface stability within a 100 km radius?
Keep using it, you don’t need them to support it to keep using it. All old driver versions still exist.
This bubble will pop sooner or later; the second-hand market is about to get flooded.
Start watching the second hand market. Most of my PC components are bought second hand, and at much cheaper than buying any of those components new.
None of these components are of course bleeding edge, but still sufficient to play any game I want.
I bought an AMD Radeon RX 5700 XT this summer for 1000 DKK (~€133 or ~$157).
Those are the GPUs they were selling — and a whole lot of people were buying — until about five years ago. Not something you’d expect to suddenly be unsupported. I guess Nvidia must be going broke or something, they can’t even afford to maintain their driver software any more.
Nvidia isn’t exactly broke…I thought they were the most valuable company in the world? Or the second, sometimes they trade places with Apple
Poor Nvidia… the AI bubble is going to burst, the gamer market has all kinds of reasons to hate them now, and all they’ll have to console themselves with is several trillion dollars.
I don’t get what needs support, exactly. Maybe I’m not yet fully awake, which tends to make me stupid. But the graphics card doesn’t change. The driver translates OS commands to GPU commands, so if the target is not moving, changes can only be forced by changes to the OS, which puts the responsibility on the Kernel devs. What am I missing?
The driver needs to interface with the OS kernel which does change, so the driver needs updates. The old Nvidia driver is not open source or free software, so nobody other than Nvidia themselves can practically or legally do it. Nvidia could of course change that if they don’t want to do even the bare minimum of maintenance.
The driver needs to interface with the OS kernel which does change, so the driver needs updates.
That’s a false implication. The OS just needs to keep the interface to the kernel stable, just like it has to with every other piece of hardware or software. You don’t just double the current you send over USB and expect cable manufacturers to adapt. As the consumer of the API (which the driver is from the kernel’s point of view) you deal with what you get and don’t make demands to the API provider.
People love to say Linux is great for old hardware. But not 10 series Nvidia cards apparently?
Device drivers are not like other software in at least one important way: They have access to and depend on kernel internals which are not visible to applications, and they need to be rebuilt when those change. Something as huge and complicated as a GPU driver depends on quite a lot of them. The kernel does not provide a stable binary interface for drivers so they will frequently need to be recompiled to work with new versions of linux, and then less frequently the source code also needs modification as things are changed, added to, and improved.
This is not unique to Linux, it’s pretty normal. But it is a deliberate choice that its developers made, and people generally seem to think it was a good one.
They have access to and depend on kernel internals
That sounds like a stupid idea to me. But what do I know? I live in the ivory tower of application development where APIs are well-defined and stable.
Thanks for explaining.
You're re-opening the microkernel vs monolithic kernel debate with that. For fun, you can read how Andrew S. Tanenbaum and Linus Torvalds debated the question in 1992 here: https://groups.google.com/g/comp.os.minix/c/wlhw16QWltI
I don’t generally disagree, but
You don’t just double the current you send over USB and expect cable manufacturers to adapt
That’s pretty much how we got to the point where USB is the universal charging standard: by progressively pushing the allowed current from the initially standardized 100 mA all the way to 5 A of today. A few of those pushes were just manufacturers winging it and pushing/pulling significantly more current than what was standardized, assuming the other side will adapt.
The default standard power limit is still the same as it ever was on each USB version. There’s negotiation that needs to happen to tell the device how much power is allowed, and if you go over, I think over current protection is part of the USB spec for safety reasons. There’s a bunch of different protocols, but USB always starts at 5V, and 0.1A for USB 2.0, and devices need to negotiate for more. (0.15A I think for USB 3.0 which has more conductors)
As an example, USB 2.0 can signal a charging port (5V / 1.5A max) by putting a 200 ohm resistor across the data pins.
The default standard power limit is still the same as it ever was on each USB version
Nah, the default power limit started with 100 mA or 500 mA for “high power devices”. There are very few devices out there today that limit the current to that amount.
It all began with non-spec host ports which just pushed however much current the circuitry could muster, rather than just the required 500 mA. Some had a proprietary way to signal just how much they're willing to push (this is why iPhones used to be very fussy about the charger you plug them into), but most cheap ones didn't. Then all the device manufacturers started pulling as much current as the host would provide, rather than limiting to 500 mA. USB-BC was mostly an attempt to standardize some of the existing usage, and USB-PD came much later.
A USB host providing more current than the device supports isn’t an issue though. A USB device simply won’t draw more than it needs. There’s no danger of dumping 5A into your 20 year old mouse because it defaults to a low power 100mA device. Even if the port can supply 10A / 5V or something silly, the current is limited by the voltage and load (the mouse).
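Right, the port only sets a ceiling; the device's own load decides the actual draw. A quick Ohm's-law illustration with made-up numbers:

```python
# A port "capable of" 10 A doesn't push 10 A into a light load.
V = 5.0           # USB bus voltage, volts
R_mouse = 250.0   # made-up effective load of an old low-power mouse, ohms

I = V / R_mouse   # current actually drawn
print(f"{I * 1000:.0f} mA")   # ~20 mA, regardless of what the port could supply
```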
Using 10-year-old hardware with 10-year-old drivers on a 10-year-old OS requires no further work.
The hardware doesn't change, but the OS does.
Well, it still worked until this update, so a few-weeks-old OS and driver were also fine. It's Arch, so expect it to break. It will probably be fixable; we are Linux users.
Pascal is coming up on 10 years old. You can’t expect companies to support things forever.
Make them open source the drivers so the community can do it then.
They started 9 years ago, but they remained popular into 2020 and according to wikipedia the last new pascal model was released in 2022. The 1080 and the 1060 are both still pretty high up on the Steam list of the most common GPUs.
What model came out in 2022? The newest I could find was the GT 1010 from 2021 (which is more of a video adapter than an actual graphics card) but that’s the exception. The bulk of them came out in 2016 and 2017 https://www.techpowerup.com/gpu-specs/?f=architecture_Pascal
Hate to break it to ya, but 2020 was 5 years ago. More than half of these GPUs' lifespan ago. Nvidia is a for-profit company, not your friend. You can't expect them to support every single product they've ever released forever. And they're still doing better than AMD in that regard.
You can’t expect them to support every single product they’ve ever released forever. And they’re still doing better than AMD in that regard.
If Nvidia had open-sourced the pre-GSP cards' drivers, at least there would be a chance of maintaining support. But Nvidia pulled the plug.
Intel’s and AMD’s drivers in the Mesa project will continue to receive support.
For example, just this week: Phoronix: "Linux 6.19's Significant ~30% Performance Boost For Old AMD Radeon GPUs". These are GCN1 GPUs from 13 years ago.
thanks to work by Valve
AMD did nothing to make their drivers better, Valve did.
AMD did nothing to make their drivers better, Valve did.
That’s literally the point of open source though, both AMD and Intel rely on open source drivers, so anybody can fix a flaw they encounter without having to rely on the company to “consider allocating resources towards a fix for legacy hardware”
Making them open to contributions was the first step, but ok I won’t engage in this petty tribalism.
The topic was about Nvidia's closed-source drivers.
Valve couldn’t do the same for pascal GPUs. Nobody but nvidia has the reclocking firmware, so even the reverse engineered nouveau NVK drivers are stuck at boot clock speeds.
If they’re going to release things under a proprietary license and send lawyers after individuals just trying to get their hardware to work, then yes, yes I can.
Don’t want to support it anymore? Fine. Open source it and let the community take over.
That's why I don't like closed-source proprietary software. They decide when to stop the support.
I wasted days of my life getting nVidia to work on Linux. Too much stress. Screw that. Better ways to spend time. If I can’t game, that’s OK too.
I switched from a 3080 to a 7900 xt. It’s one of the better decisions I’ve made even though on paper the performances are not too far away.
I’m told AMD works better with Linux, but I haven’t tried it myself.
AMD is and has been much more friendly towards Linux than Nvidia. I run mine in Proxmox, passing through to Linux and Windows gaming VMs. AMD has invested in open source drivers.
https://thetechylife.com/does-amd-support-linux/
https://arstechnica.com/information-technology/2012/06/linus-torvalds-says-f-k-you-to-nvidia/

AMD is plug and play on Linux. With my 7800XT there isn’t a driver to install. Only issue is that AMD doesn’t make anything that competes with the 5080/5090.
Only “issue” is that AMD doesn’t make anything that competes with the 5080/5090.
And do you really need the performance of a 5080? Certainly not that of a 5090.
My 9070 XT runs everything I need at perfectly acceptable rates on maximum settings. AAA games among them.
That’s such a bad way to look at it. I would’ve bought a 5090 if I could afford it because I want to hold onto the 5090 for almost a decade like I did with my 1080. Depending on prices, it doesn’t make sense to upgrade twice in 10 years because you bought a budget option, and then be stuck trying to sell a budget card. 5090s will hold their value for years to come. Good luck playing AAA titles maxed out in 5 years on a 7800XT.
Generally, you’ll get better results by spending half as much on GPUs twice as often. Games generally aren’t made expecting all their players to have a current-gen top-of-the-line card, so you don’t benefit much from having a top-of-the-line card at first, and then a couple of generations later, usually there’s a card that outperforms the previous top-of-the-line card that costs half as much as it did, so you end up with a better card in the long run.
My 7800XT can’t play Hogwarts Legacy without stuttering (on Linux). I’m really regretting not getting a 5080 at this point.
Yeah, I am looking at spending less than I did before though. But when will an under £200 card give like double the performance of a 2070? I don’t want to spend that much for +20%. Unless my current card dies there is little reason to upgrade.
Good luck playing AAA titles maxed out in 5 years on a 5080 too… 5090 isn’t even considered a consumer card anyway, it’s more like an enthusiast, collector’s item. It’s so expensive compared to its performance value.
You have to look at performance-to-price ratio. That’s the only metric that matters, and should determine how much you can sell it for when upgrading, and how often you upgrade.
I don’t want to play AAA games now, why would I want to with 5 more years of further enshitification?
Open source drivers are a major plus; I've had a much easier time than my partner on NVIDIA. I mean, I make both machines work, but the NVIDIA one has been a real pig at times.
I can’t believe they would do this to poor Borland. I guess I’ll just need to use an AMD GPU for my Turbo Pascal fun.