Copied/expanded from here:
Pwr = Power. Indicates performance is limited by the total power limit. (logged individually as 1)
Thrm = Thermal. Indicates performance is limited by the temperature limit. Root cause: insufficient cooling. (logged individually as 2)
vRel = Reliability. Indicates performance is limited by voltage reliability. Root cause: power supply performance. (logged individually as 4)
VOp = Operating. Indicates performance is limited by the maximum operating voltage (hardware limit). (logged individually as 8)
Util = Utilization. Indicates performance is limited by GPU utilization. (logged individually as 16)
Because of the way GPU Boost works, one or more of these "reasons" will always be active at any given time for your GPU. The only one to really worry about, if you see it, is Thermal (Thrm).
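Since the individual codes listed above are powers of two, a logged value that isn't one of them is just the sum of the reasons active at that moment. A minimal sketch of decoding such a value (Python, not part of GPU-Z; the flag table simply restates the codes above and assumes the log records their bitwise OR):

```python
# Assumed: GPU-Z logs the PerfCap Reason as the sum (bitwise OR) of the
# individual codes listed above. The names here just restate that list.
PERFCAP_FLAGS = {
    1: "Pwr (power limit)",
    2: "Thrm (thermal limit)",
    4: "vRel (voltage reliability)",
    8: "VOp (max operating voltage)",
    16: "Util (GPU utilization)",
}

def decode_perfcap(value: int) -> list[str]:
    """Return the list of PerfCap reasons encoded in a logged value."""
    return [name for bit, name in PERFCAP_FLAGS.items() if value & bit]

print(decode_perfcap(20))  # 16 + 4 -> vRel and Util both active
```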
Here's a plot of the PerfCap reason as I have Studio sitting there just doing an Iray preview. It goes from reason code 16 to 4. It changes over time, although you can also choose to have it show a Max value.
So I assume you'd have to plot the reason like this during the render and then review the plot when it's done to see whether performance was capped at any point, and if so, whether it was capped long enough to matter. Even then you don't really know how much it was capped without further research. If it capped by 18% for 12 seconds during a 10-minute render, and then the fans cranked up and ended the performance cap, do you disqualify the benchmark result?
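If you'd rather not eyeball a plot, a rough sketch of the same check done against a saved sensor log is below. The filename and the "PerfCap Reason" column name are assumptions, not something from GPU-Z's documentation, so adjust them to match however your log is actually exported:

```python
import csv

# Hypothetical post-render check: what fraction of logged samples had the
# Thermal bit (2) set in the PerfCap Reason column?
# "gpuz_log.csv" and the column name are assumptions -- edit to match your log.
THERMAL_BIT = 2

def thermal_cap_fraction(path: str, column: str = "PerfCap Reason") -> float:
    total = capped = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                reason = int(float(row[column].strip()))
            except (KeyError, ValueError):
                continue  # skip malformed or non-numeric rows
            total += 1
            if reason & THERMAL_BIT:
                capped += 1
    return capped / total if total else 0.0

print(f"Thermally capped for {thermal_cap_fraction('gpuz_log.csv'):.1%} of samples")
```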
Yeah, I think I'll pass on all of this PerfCap stuff. And can we even trust it?
OK, thanks. I can see that if I hold the mouse pointer over the PerfCap timeline, a tooltip will display the reason associated with the color. So far I've only seen "Idle" (grey color) and "VRel" (blue color). I assume all the different reasons have different colors with different tooltip text on the timeline?
See that post again for what the log-file numbers, as well as the colors, mean (in this case, that you have no serious issues).
You can also just leave the tool as is and simply watch for the color purple appearing during the hottest part of your render (the end). Not everything has to be complicated.
Yes. I'd try it again. And if it still happens, I'd make a note of it in my results. Then I'd look into getting better cooling for my GPU.
Yes. GPU-Z gets its data directly from the Nvidia driver itself.
Thanks for the color update...
Thread trimmed, please stop butting heads.
When people build charts from benchmarks, they tend to throw the anomalies out. Posting the worst case as the standard is a grave mistake and does everybody an injustice. I do not recall you ever once stating that chart was a worst-case chart, which misleads everybody who reads it. Did you post the worst case for all the GPUs on that list, then? The initial post in that thread is the only one posting that time; nearly everybody else was under 1 minute. Some people complained that the then-new 4.11 beta added 10 seconds, but considering they were already under 60 seconds with 4.10, even after 4.11 they were still faster than the opening post. Something is clearly wrong with the OP's settings, perhaps a bad driver. The 2nd post in that thread questioned why the OP was so slow; even back then people noticed that 1:20 time as strange.
The OP in that thread has a 2017 Ryzen 1700x with 8 cores / 16 threads and 64gb of DDR4 RAM. My machine has a 2014 i5-4690k with just 4 cores / 4 threads and slower DDR3 RAM. Also, I only had 16gb for my first tests back in the day, while I have 32gb now. Still, that is half the RAM, and it is slower, and my times are identical with both. We both use Samsung SSDs. There is no logical reason why my machine should beat his by so much if you believe that different specs can really affect render speeds that much. If these specs actually affected rendering, then my machine should be stomped by the Ryzen 1700x, not the other way around.
Additionally, this 1:20 time undermines the 2080ti arguments you have made. A single 2080ti can render this scene well under that time, providing a speed that exceeds that of 2x 1080tis, while also costing at least $200 less. However, if you go by the bench times I post, the single 2080ti can only match or come close to 2x 1080tis, not beat them. 20 seconds is a very wide margin given the scene only takes 60 or so to complete.
There are standards for benchmarking. You never take the outlier as the reference.
You run the test many times and take the mean. The number of runs helps to establish the error bars of the result.
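For example, a quick sketch of taking the mean and standard error over repeated runs (the times below are made-up placeholders, not real benchmark results):

```python
import statistics

# Placeholder run times in seconds -- not real benchmark data.
runs = [61.2, 59.8, 60.5, 62.0, 60.1]

mean = statistics.mean(runs)
# Standard error of the mean: sample std dev divided by sqrt(number of runs).
sem = statistics.stdev(runs) / len(runs) ** 0.5

print(f"{mean:.1f} s +/- {sem:.1f} s over {len(runs)} runs")
```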
Just treat the thread as what it is - people posting their times in the then-current version of DS using a common scene. It's moderately useful, especially looking at the recent posts, but certainly not systematic or definitive.