Just a week ago, we discussed Microsoft's efforts to enhance File Explorer performance by preloading the application at startup. Recent testing by Windows Latest suggests that this approach results in diminishing returns, as the application uses more RAM while offering only marginal performance improvements.
Well, it is a problem when it's consistently getting close, and then I get tickets that computers are performing poorly shortly after. You're on the cusp of page thrashing if you're running usage that high.
Your use case is not a typical one either, compared to what I'm dealing with. It sounds like maybe you have a few heavy hitters, but when you're constantly switching between MS Office, web pages, CAD, and things like that, it becomes very noticeable. If a machine has 15.6 GB used out of 16 GB, a single web page could trigger some thrashing.
For example, I regularly saw Palo Alto Panorama use around 1 GB for just a single web page 😅
Right, but that’s not a high memory problem, that’s a Windows is shit at managing memory problem.
If MS fixed that, you could easily run memory hot at >90% without issue.
It’s also a software developers are making poor products problem. Even back when I was on Windows, I swapped out MS Office for LibreOffice and then OnlyOffice. In both cases, my system performed better just by not running MS Office. That’s not a memory usage problem.
On my work laptop, which runs Windows, I removed the entire Adobe suite, which I don’t use for anything, and my overall system responsiveness increased. Again, not a memory issue, a poor programming issue.
Devs (the companies, not the individual programmers) know that users will throw more RAM at a problem, so it absolves them of the need to write better code. If Windows had a better memory manager, and Office and Adobe were more efficient, you wouldn’t need more RAM.
Also, just to clarify a point: right now, web browsers, the worst abusers of memory, are taking up 24 GB of RAM on my system.
Because I have no memory swapping issues, I keep many web browsers open, which most people can’t do if they’re on Windows, because it’s crap at memory management.
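If anyone wants to sanity-check a number like that on their own machine, here's a rough sketch of how I'd total it up. It assumes the third-party psutil package, the browser name list is just a guess you'd adjust, and summing RSS double-counts shared memory, so treat it as a ballpark:

```python
#!/usr/bin/env python3
# Rough sketch: total up the resident memory of browser processes.
# Assumes the third-party psutil package is installed (pip install psutil).
# Summing RSS double-counts memory shared between browser processes,
# so treat the result as an upper bound, not an exact figure.
import psutil

# Hypothetical list of process names to match; adjust for your browsers.
BROWSERS = ("firefox", "chrome", "chromium", "edge", "brave")

total_rss = 0
for proc in psutil.process_iter(["name", "memory_info"]):
    name = (proc.info["name"] or "").lower()
    mem = proc.info["memory_info"]
    if mem and any(b in name for b in BROWSERS):
        total_rss += mem.rss

print(f"Browser processes are holding roughly {total_rss / 2**30:.1f} GiB")
```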
So our list grows to: crappy memory management on Windows, crappy development of web browsers, crappy development of applications, and crappy web pages (as you say).
None of that is a low-memory problem, it’s all poor software development. Once RAM stopped being super expensive, developers (again, companies, not individuals) got lazy and stopped caring about efficiency.
We don’t need more RAM, we need better code. There is no reason anyone running normal usage should need that much RAM.
To make my point, I just SSHed into my wife’s Linux PC, where she never closes anything, and this is her memory usage with a bunch of browsers doing all the normal things she does, and multiple spreadsheets open in OnlyOffice.
Memory: Total: 16278284 Used: 6254884 Available: 10023400
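For reference, those look like the raw KiB figures from /proc/meminfo, with "Used" just being Total minus Available. A quick sketch (Linux only) that prints the same three numbers:

```python
#!/usr/bin/env python3
# Sketch (Linux only): print Total/Used/Available in KiB from /proc/meminfo.
# "Used" here is simply MemTotal - MemAvailable, which matches the output above.

def meminfo_kib():
    values = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            values[key] = int(rest.strip().split()[0])  # values are reported in kB
    return values

m = meminfo_kib()
total, available = m["MemTotal"], m["MemAvailable"]
print(f"Memory: Total: {total} Used: {total - available} Available: {available}")
```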
Edit: BTW, I do understand your point. You can’t fix any of that. My point is we need to put blame where blame is due. And it’s not that memory is low.
It’s not a Windows is shit at managing memory problem though. If you have 1 MB of RAM left and you open something, something has to happen: a process gets killed, an alert is generated, something gets moved to disk instead of RAM (paging), or the system locks up. This is the management piece: what to do when you’re out.
That is entirely a shit at managing memory problem.
If you have 1 MB of RAM left, firstly, your OS has not properly managed its resources. It should have reserved system RAM. Secondly, a good memory manager will have swapped out unused or low-priority pages.
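Linux, for what it's worth, exposes both of those things as tunables: vm.min_free_kbytes is the reserve the kernel keeps free, and vm.swappiness controls how eagerly cold pages get pushed to swap. A tiny sketch (Linux only) that just reads them:

```python
#!/usr/bin/env python3
# Sketch (Linux only): read the kernel's memory-reserve and swap-eagerness knobs.
# vm.min_free_kbytes - RAM the kernel keeps in reserve for critical allocations
# vm.swappiness      - higher values mean cold pages get swapped out more eagerly

def read_sysctl(path):
    with open(path) as f:
        return int(f.read().strip())

min_free_kb = read_sysctl("/proc/sys/vm/min_free_kbytes")
swappiness = read_sysctl("/proc/sys/vm/swappiness")

print(f"Reserved by the kernel: {min_free_kb / 1024:.0f} MiB (vm.min_free_kbytes)")
print(f"Swap eagerness:         {swappiness} (vm.swappiness)")
```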
And that’s not just a system issue. A well-developed piece of software will unload (or never load) the parts of itself that are not needed at runtime.
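In Python terms that's the lazy-import pattern, and the standard library documents a helper for exactly this. A minimal sketch, no third-party assumptions:

```python
import importlib.util
import sys

def lazy_import(name):
    # Register a module whose code isn't executed until the first attribute access.
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)
    return module

# Nothing heavy is loaded here...
json = lazy_import("json")
# ...the real import work only happens on first use.
print(json.dumps({"loaded": "on first use"}))
```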
I’m going to give you a great example I just read about today of bad programming practices. The install of Helldivers 2 has been reduced from 154 GB to 23 GB. That’s a reduction of 85%. This was driven by de-duplication of code. So, while this is a story about storage space, ask how many modules and functions were duplicated, and how many of those were loaded independently into RAM.
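The mechanism behind that kind of cleanup is usually just content hashing: fingerprint every file, keep one copy per fingerprint. A toy sketch of the detection step only, which has nothing to do with the studio's actual tooling:

```python
#!/usr/bin/env python3
# Toy sketch: find duplicate files by content hash and report the redundant bytes.
# Detection only; this is not how the actual game repack was done.
import hashlib
import sys
from collections import defaultdict
from pathlib import Path

def sha256_of(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

root = Path(sys.argv[1] if len(sys.argv) > 1 else ".")
by_hash = defaultdict(list)
for p in root.rglob("*"):
    if p.is_file():
        by_hash[sha256_of(p)].append(p)

wasted = sum(ps[0].stat().st_size * (len(ps) - 1)
             for ps in by_hash.values() if len(ps) > 1)
print(f"Redundant bytes a dedup pass could reclaim: {wasted / 2**30:.2f} GiB")
```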
Bad programming in one area, means bad programming in all areas.
With your 1 MB example, I would ask: if all of the devs who created all of the other programs on the system had written better, more efficient code, would you still need more RAM? The answer is no.