Hi
@Baal Netbeck ,
Haha, don’t worry, all feedback is constructive. I’ll try to respond to everything to explain the reasoning behind each point and my perspective, and I invite you to share your opinions, since there’s always a choice between A and B. Unfortunately, you can’t pick both at once, and I’ve tried to find the best balance possible.
Baal Netbeck wrote:
I can use 3DMark benchmarks with a lot more data to compare my system to.
Making it hard to sell in the first place.
More data to compare against, for now (it depends on you!) 😋
However, I think my benchmark offers a more reliable, straightforward, and user-friendly comparison system. You can even check the relative performance of any GPU (just by clicking on it) against others, interactively:
Regarding sales, there’s also a Free version (which will become even more complete). You don’t need to buy the full version unless you want full insight into your GPU, whenever and however you like, with all the conveniences. That being said, the full version (it sounds presumptuous, but if I don’t say it myself, who will?) is, in my opinion, better than what 3DMark or any other standalone benchmark offers:
- Less synthetic, more like an actual game.
- Faster and straight to the point.
- Provides more real-time info on hardware and performance.
- Technologically more advanced.
- Uses a real game engine. Not just any engine, but one that’s widely adopted and highly advanced: the only logical choice if you had to pick just one.
- The scene is richer and more detailed.
I might be forgetting some aspects, but I’d say it outperforms them in pretty much everything (except for database size, which depends on users like you, and web-based data, which I don’t think is necessary given everything else).
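To make the “relative performance” comparison I mentioned above concrete: conceptually, it’s just a normalization of each GPU’s score against whichever GPU you click on. Here’s a minimal Python sketch of the idea (the GPU names and scores are made-up placeholders, not real entries from my database):

```python
# Minimal sketch of the interactive "relative performance" idea:
# normalize every GPU's score against the one the user clicks on.
# The names and scores below are invented placeholders.

scores = {
    "GPU A": 120.0,  # average FPS in the benchmark scene
    "GPU B": 90.0,
    "GPU C": 60.0,
}

def relative_performance(baseline: str) -> dict[str, float]:
    """Return each GPU's performance as a percentage of the baseline GPU."""
    base = scores[baseline]
    return {gpu: 100.0 * fps / base for gpu, fps in scores.items()}

# Clicking "GPU B" in the chart would show something like:
# {'GPU A': 133.3, 'GPU B': 100.0, 'GPU C': 66.7}
print(relative_performance("GPU B"))
```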
Baal Netbeck wrote:
It is UE5, but every game is different. And gameplay is very different.
In a game you will most likely be running and interacting, with quick camera movements, camera shake from first-person walking or a third-person view, barely any camera jumps, effects like sparks, flashes, magic... world streaming when traveling... etc.
This benchmark features a slowly gliding/flying camera. It jumps between nearby positions; there are no effects, no sudden movements.
So I don't think it represents what to expect when playing a UE5 game.
I know that this is the way a lot of benchmarks are done. Even most "in-game benchmarks" do it this way.
That is why good reviews use "in-game testing" and rarely the integrated benchmarks.
I’d say this is the closest standalone benchmark to an actual game:
- It runs on a modern engine, one that’s widely used in the industry.
- It features enough assets to build a full-fledged game.
- It incorporates cutting-edge technologies like Nanite, Lumen, and MegaLights: the ones with the biggest impact on GPU performance, and the ones that will see increasing adoption in future games.
Of course, every Unreal Engine 5 game has its quirks, but this is the most general, common setup, making it the most representative and the easiest to extrapolate to real games. Other benchmarks don’t use modern engines, aren’t industry-standard, and lack the assets needed to simulate a complete game environment.
Now, about it being a controlled cinematic: yes, that’s true, but the impact on results is minimal, if any. This benchmark is designed to measure GPU performance, so visually you can get a very clear idea of how your GPU handles top-tier Unreal Engine graphics.
Sure, other games might be larger, have level-loading times, or feature anywhere from 1 to 1,000 NPCs, but that wouldn’t say much about GPU performance. It would mostly reflect CPU, SSD, and RAM limitations, which this benchmark isn’t meant to measure (for now, at least).
In fact, if a game stutters due to CPU or SSD loads, upgrading those components will only reduce the stutters, not eliminate them. But if you want to improve general smoothness whenever the game isn’t freezing up to load, the main component is usually the GPU (except in very CPU-demanding games). That’s precisely what this benchmark is designed to evaluate: how well your GPU can handle a high-fidelity Unreal Engine 5 environment in real time, looking at graphics alone.
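To illustrate what I mean by “general smoothness”: average FPS can look fine while stutter spikes ruin the experience, which is why reviewers also report percentile lows. A minimal Python sketch of the usual way frame times are summarized (the numbers are invented, and this is not necessarily the exact formula the benchmark uses internally):

```python
# Illustrative frame-time analysis: average FPS hides stutter,
# while percentile lows ("1% lows") expose it.
# A GPU upgrade lifts the 10 ms frames; the 100 ms spikes caused
# by CPU/SSD loads would remain as spikes.

frame_times_ms = [10.0] * 95 + [100.0] * 5  # mostly smooth, a few stutter spikes

def average_fps(times_ms: list[float]) -> float:
    return 1000.0 * len(times_ms) / sum(times_ms)

def percentile_low_fps(times_ms: list[float], pct: float = 1.0) -> float:
    """Average FPS of the slowest pct% of frames."""
    worst = sorted(times_ms, reverse=True)
    n = max(1, round(len(worst) * pct / 100))
    return 1000.0 * n / sum(worst[:n])

print(f"average FPS: {average_fps(frame_times_ms):.1f}")        # looks decent
print(f"1% low FPS:  {percentile_low_fps(frame_times_ms):.1f}")  # exposes the stutter
```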
Baal Netbeck wrote:
It could be a nice tool to test different options and observe the impact on visuals and performance. But having different options ruins the "compare to others" aspect of the benchmark... even fewer systems using the same options to compare to.
And I hate a blurry image.
This benchmark is blurry all over the place. And the depth-of-field effect goes on top of that.
Might be TAA + low FPS at work, but it looks like 320p upscaled to 1080p.
I tried to determine if the texture quality is any good, but I could not tell because of the blur.
I don't want to hate too much, but it is ugly as hell (my personal opinion; I know of some other users on Computerbase who love a blurry TAA image)... I would refuse to play a game looking like this.
I understand that last sentence, but I hope you find it useful as a benchmark nonetheless.
Here are some (I think!) sharp, gameplay-style shots of the focused subject, taken during the free walkthrough (the cinematic uses a longer focal length for dramatic and artistic purposes, so you should keep your eyes on the in-focus center of the screen):
It’s possible that an in-game options menu will be added later, for the walkthrough section (which also measures instantaneous FPS), so that the cinematic measurements are still taken under the same conditions, without too many options. That way you could compare the quality and performance differences yourself. But this would only be for the curious: with the “fixed” settings already in place, the relative performance difference between GPUs is already being measured, and with different options that difference would remain the same (assuming equal graphics settings, of course). In the end, this is a benchmark, not a customizable game, and its purpose is to measure the performance difference between GPUs. Objectively, I think it does that very well (although that doesn’t mean I won’t keep adding lots of extras over time, as I have been doing!).
Indeed, there are DOF and noise effects, which I thought added realism and believability to the image, making it more cinematic. In a “game” they have to be a bit more exaggerated than in movies, precisely because the base image will never be as sharp as a movie filmed in 4K. So, to make them noticeable and achieve a cinematic effect, they need to be slightly enhanced (also because some people know that real-time DOF is tricky, and I wanted to highlight the good result achieved). Removing the noise and DOF makes the image feel much more like CGI, and removing TAA results in visible pixels and jagged edges in modern games.
But as mentioned, I think it’s a matter of taste (it would be good to poll people in the thread to see whether they share your opinion on this), and this could be applied to future scenes, or the current one could be modified over time (as long as it doesn’t affect the benchmark’s performance, so as not to invalidate previous results) once I get a general opinion from all of you. In fact, I’ve already asked for general feedback from users on another well-known forum. In the end, I care more about what you all like than about what I like.
In any case, I understand there are opinions for all tastes, and if this were a game, there would of course be a very complete options menu to disable DOF and TAA, for example. But since the goal was to measure GPU performance using the same graphics settings, I went with the artistic choice that seemed most balanced, and there could only be one.
Here are some comparisons via a slider (click on it) with effects (DOF + grain + TAA) versus without effects (I’ve also included the latest comparison of TAA vs NoAA. Do you really notice a big difference? And, in real time, there are a lot of flickering pixels with NoAA):
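And if anyone wants to go beyond eyeballing the slider, here’s a rough Python sketch (the file names are placeholders; it needs Pillow and NumPy) that quantifies how different two of those captures really are. Keep in mind that a single still can’t show the temporal flicker of NoAA, which is exactly why it looks worse in motion than in screenshots:

```python
# Rough sketch: quantify the per-pixel difference between two screenshots,
# e.g. the TAA vs NoAA captures from the comparison slider.
# File names are placeholders; requires Pillow and NumPy.

import numpy as np
from PIL import Image

def mean_abs_diff(path_a: str, path_b: str) -> float:
    """Mean absolute per-channel difference (0 = identical, 255 = opposite)."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float32)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float32)
    assert a.shape == b.shape, "screenshots must have the same resolution"
    return float(np.abs(a - b).mean())

print(mean_abs_diff("taa.png", "noaa.png"))
```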
PS: Were you running the 1080p benchmark on a higher-resolution screen?
I’m looking forward to any additional comments in case there’s something else to dive into!
Thank you very much and best regards!