Which ones? Consoles just use AMD APUs, which have the same hardware features as AMD’s current desktop CPUs and GPUs. UE5 games run like crap on consoles too.
It literally says so in the article: hardware I/O controllers that handle (de)compression. I guess this is related to DirectStorage, but that doesn’t seem to take advantage of dedicated hardware on PC (because as far as I know, none exists), and apparently only a handful of games actually use it.
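(For context, a minimal sketch of what the PC path looks like, assuming the DirectStorage 1.1+ SDK, an already-created D3D12 device, and a hypothetical "assets.bin" and destination buffer. On PC the decompression runs on the GPU rather than on a dedicated I/O block, which is the gap being described.)

```cpp
// Minimal sketch of a DirectStorage read with GPU decompression (GDeflate).
// Assumes the DirectStorage 1.1+ SDK and an existing D3D12 device/buffer.
// "assets.bin", offsets, and sizes are hypothetical placeholders.
#include <cstdint>
#include <d3d12.h>
#include <dstorage.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void LoadCompressedAsset(ID3D12Device* device, ID3D12Resource* destBuffer,
                         uint32_t compressedSize, uint32_t uncompressedSize)
{
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets.bin", IID_PPV_ARGS(&file));

    DSTORAGE_QUEUE_DESC queueDesc = {};
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.Device     = device;

    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    // One request: read compressed bytes from disk and let the GPU
    // decompress them straight into a D3D12 buffer; no CPU-side inflate step.
    DSTORAGE_REQUEST request = {};
    request.Options.SourceType          = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType     = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat   = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Source.File.Source          = file.Get();
    request.Source.File.Offset          = 0;
    request.Source.File.Size            = compressedSize;
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;
    request.UncompressedSize            = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->Submit();
    // Real code would wait on a fence or status array before using destBuffer.
}
```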
They also have unified memory shared between the CPU and GPU (like Apple M-series laptops).
All of which is completely irrelevant to why games run like crap. Those things have zero impact on a game’s framerate; they only affect asset loading and streaming, and even then they do pretty much nothing from what I can see.
I’m not gonna say it’s just marketing, but it comes close imo. I personally benchmarked Ratchet and Clank: Rift Apart’s loading times between a PS5, an NVMe SSD and a SATA SSD. Literally no difference, save for the SATA one being a fraction of a second slower. And that was one of the games that was supposed to showcase what that technology can do!
(I know it doesn’t run on UE5, but it’s just an example)
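(Not how that comparison was actually run, which was timing in-game loads, but for a rough idea, here is a minimal sketch of how you could compare raw read times between two drives; the paths are placeholders, and real game loads also include decompression, deserialization, and GPU uploads, which is part of why drive speed alone barely moves the needle.)

```cpp
// Rough sketch: compare raw read time of the same file copied to two drives.
// Paths are placeholders.
#include <chrono>
#include <fstream>
#include <iostream>
#include <vector>

static double TimeRead(const char* path)
{
    auto start = std::chrono::steady_clock::now();

    // Read the whole file into memory.
    std::ifstream file(path, std::ios::binary);
    std::vector<char> data((std::istreambuf_iterator<char>(file)),
                           std::istreambuf_iterator<char>());

    auto end = std::chrono::steady_clock::now();
    std::cout << path << ": " << data.size() << " bytes read\n";
    return std::chrono::duration<double>(end - start).count();
}

int main()
{
    std::cout << "NVMe: " << TimeRead("D:/bench/level.pak") << " s\n";
    std::cout << "SATA: " << TimeRead("E:/bench/level.pak") << " s\n";
}
```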
UE5 runs like garbage on all platforms. You can load assets as fast as you want, but if the rendering pipeline is slow as hell, it doesn’t matter; games will still run like garbage.
My comment on a different post relates to this well:
I think a lot of the plugin tooling for Unreal promotes bad practices with asset management, GPU optimization, and memory management. What I’m trying to say is that it lets shitty/lazy developers and asset designers rely on over-expensive hardware to lift their unoptimized dogshit code, blueprints, models, and textures to acceptable modern fps/playability standards. This has been prevalent for a few years, but it’s especially egregious now: young designers with polygon and vertex counts that are out of control, extraneous surfaces and naked edges, uncompressed audio, unbaked lighting systems, memory leaks.
In my own experimenting with Unreal, I’ve found the priorities match those of developing DNNs and parametric CAD modelling applications at my day job: effective resource, memory, and parallelism management from the outset of a project is (or should be) axiomatic.
I think Unreal 5 runs exceptionally well when that’s the case. A lot of the time you can turn off all of the extra hardware acceleration and frame-generation AI crap if your logic systems and assets are designed well (see the sketch below).
I know this is a bit of an “old man yells at cloud” rant, but if you code and make models like ass, of course your game is gonna turn out like ass. And then those same people turn around and say “tHe EnGiNe SuCkS”.
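(To make the “turn it off” point concrete, here is a minimal sketch of disabling some of UE5’s heavier rendering features via console variables. Cvar names and valid values vary between engine versions, and in practice these usually go in DefaultEngine.ini or scalability/device profiles rather than being set from code like this.)

```cpp
// Minimal sketch of toggling some of UE5's heavier rendering features off
// via console variables. Cvar names/values can differ between engine
// versions; normally these live in DefaultEngine.ini or scalability
// profiles rather than being set from code.
#include "CoreMinimal.h"
#include "HAL/IConsoleManager.h"

static void SetCVarInt(const TCHAR* Name, int32 Value)
{
    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
    {
        CVar->Set(Value, ECVF_SetByCode);
    }
}

void DisableHeavyRenderingFeatures()
{
    SetCVarInt(TEXT("r.Nanite"), 0);                          // skip Nanite rendering
    SetCVarInt(TEXT("r.DynamicGlobalIlluminationMethod"), 0); // no Lumen GI
    SetCVarInt(TEXT("r.ReflectionMethod"), 0);                // no Lumen reflections
    SetCVarInt(TEXT("r.Shadow.Virtual.Enable"), 0);           // no virtual shadow maps
    SetCVarInt(TEXT("r.AntiAliasingMethod"), 2);              // plain TAA instead of TSR
}
```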
I can’t think of a single AAA game using UE5 that requires that level of performance due to its size and complexity and doesn’t use audio middleware, which by default compresses audio when generating soundbanks. I have no idea where this myth of “everyone uses uncompressed audio” came from, but it’s annoying and wrong most of the time, as most social media misinformation is. Maybe people think there’s real-time compression of audio at runtime in shipped games? That’s just not how anything works: audio files are pre-compressed into a perceptually near-lossless format before the game binaries are even compiled into a .exe for distribution, and there are usually separate compression settings per area of the game to squeeze less-critical audio into the smallest file size possible.
Source: literally generating Vorbis/WEM Opus files (for PlayStation) in Wwise as I type this.
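(To illustrate the “no runtime compression” point: a minimal sketch of the runtime side, assuming a recent Wwise SDK and a hypothetical bank name. The game just loads banks whose audio was already encoded when the banks were generated.)

```cpp
// Minimal sketch of the runtime side of Wwise soundbanks: the game only
// loads banks whose audio was already encoded (Vorbis / WEM Opus / ADPCM...)
// at soundbank-generation time; nothing is compressed at runtime.
// Assumes a recent Wwise SDK (the LoadBank signature has changed across
// versions) and a hypothetical bank name.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

bool LoadWeaponsBank()
{
    AkBankID bankID = AK_INVALID_BANK_ID;

    // "Weapons.bnk" is a placeholder; its contents were converted and
    // compressed when the bank was generated in the Wwise authoring tool.
    AKRESULT result = AK::SoundEngine::LoadBank("Weapons.bnk", bankID);
    return result == AK_Success;
}
```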
Admittedly, the audio argument was based on hearsay and my own light reading about the subject (in-game audio).
I’ve never dealt much with audio in Unreal, but it was my understanding that games began ballooning in size in the early 2010s because they shipped with uncompressed audio files. If memory serves, the biggest example that comes to mind was Titanfall (whether 1 or 2, I can’t recall). That of course wasn’t made in Unreal; my unfounded assumption was that the practice had carried over into audio systems design in Unreal 5. Clearly, I’m wrong.
My work is in DNN-powered CAD asset generation, both procedural and static, and I do actively experiment with Unreal environments. I still stand by my criticisms of young developers and designers when it comes to environment logic systems and asset creation and optimization.
Not true. It takes advantage of hardware features that are available on consoles but not on PC. That isn’t laziness.
UE5 console games didn’t seem super optimized to me either, lol.
Apologies for the unnecessary audio argument.