Comments
Good point about newer CPUs only supporting the spyware-ridden, memory-clogging Win10, and about PCIe lanes.
Intel CPUs are severely crippled on lanes; using M.2 SSDs is not an option if you're going to run several GPUs at high bandwidth as well.
But does bandwidth matter for rendering? Do we need it?
I think bandwidth only matters while the 'kernel' is generated for the render task and the scene data is sent to GPU RAM; after that, I think only minimal amounts of bandwidth are used. So slower bandwidth to the GPU might not be a bottleneck for more than a few seconds.
What I wrote above is based on what I know from other things, but I'm not sure it applies to rendering with Daz 3D. If anyone knows for sure, please make things clear! :)
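To put rough numbers on that (a back-of-the-envelope sketch; the 8 GB scene size and the PCIe 3.0 throughput figures below are my own assumptions, not numbers from this thread):

```python
# Rough estimate of the one-time scene upload to GPU RAM at different
# PCIe 3.0 link widths. Throughput values are approximate theoretical
# peaks; real-world transfers will be slower.

scene_size_gb = 8.0  # hypothetical scene + textures sent to the GPU

pcie3_throughput_gbs = {
    "x16": 15.8,  # ~0.985 GB/s per lane * 16 lanes
    "x8": 7.9,
    "x4": 3.9,
}

for lanes, gbs in pcie3_throughput_gbs.items():
    seconds = scene_size_gb / gbs
    print(f"PCIe 3.0 {lanes}: ~{seconds:.1f} s to upload {scene_size_gb:.0f} GB")
```

Even at x4 that only adds a couple of seconds to the start of the render, which fits the "not a bottleneck for more than a few seconds" guess above.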
I would go with the 8-core Threadripper.
Important bits:
...I'll stick with pre-Kaby Lake CPUs, as they allow me to work under W7. W10-only compatibility is not worth hamstringing the VRAM capacity of an expensive high-memory GPU card.
Again, unless you are going to do extreme multitasking, Daz will not use all those cores (unless the render falls back from the GPU to the CPU). If you are also working in Carrara, that is a different story, since Carrara natively only supports CPU rendering and will make use of up to 100 CPU cores/threads.
...oh, to have a 32-core Epyc workstation for that.
There are benefits to multi-core CPUs that aren't mentioned so much.
You can do more. I use multiple instances of Daz Studio, and the extra cores definitely help there; and when the scene won't fit on the card despite optimising, CPU it is.
Win 10 is totally worth it with a higher-resolution monitor. There is no real DPI scaling support in Win 7, and with a 4K monitor it sucks. Really badly.
The VRAM reservation kinda sucks; I hope there is a solution eventually, but otherwise I haven't run into any problems going from Win 7 to Win 10.
Keep in mind that those of us rendering with 2 or more high-end consumer or prosumer cards probably don't fall under the "most users" category.
+1
I wasn't aware that a 4K monitor image would look different under Win 7 compared to Win 10 - thanks for posting this, it's good to know.
I had put off upgrading to Windows 10, but I finally took the leap 10 months ago because under Windows 7 you can't take full advantage of VR. So for me it was a choice between losing some VRAM or losing some functionality of the Oculus. Ideally it would be nice to have different powerful PCs for different jobs, but most people's computers have to compromise somewhere. Agreed, though, Microsoft really does need to do something about this.
...yeah, if you also use your system for gaming, VR, and streaming films, then unfortunately W10 is the only option (doesn't 8.1 at least support 4K resolution?). For just production and rendering, I don't see it as all that crucial to wind up with the "idiotic-syncrasies" of W10 like force-fed updates, rubbish features like Cortana (which require registry hacks to disable completely), and the issue with reserving VRAM.
My rule of thumb: the OS should never get in the way of what you do.
From what I've heard, Nvidia's Quadro line supposedly doesn't suffer from the VRAM issue. However, how many of us can afford a GPU card that costs as much as, or more than, an entire system?
It's not that earlier versions of Windows can't output 4K resolutions. The issue lies in the OS's ability to scale the elements of your desktop and apps to make them a usable and readable size at that resolution.
...again, the only reason I see 4K as being useful is for gaming and viewing films. As I do neither on my workstation, W7 is fine. I also don't have 4K displays, so it's pretty moot.
Not sure if serious...
The average screen is HD (1920x1080); you can't see having 4 TIMES the pixels as being useful? All the apps that support DPI scaling are wonderful. Scale the text up and it is so sharp, clear, and easy to read. Like it's printed on quality paper.
And Daz Studio? On my HD monitor it feels like I'm trying to run it on my phone.
You're a young guy, right? It's an age thing. I run my 27" monitors at 2.5K (except for the drawing tablet). That's almost perfect for my eyes. Sometimes I need to scale the text up. While I shoot in 4K, I can't bear to preview it on anything smaller than 32". As you age, the eyes are one of the first things to go.
Not that young. Argh, tell me about it, my eyes are going too. I'm running Win 10 with two monitors, a 28" 4K and a 27" 2K. My DPI is scaled up quite a bit on the 4K screen, which is also the main screen. I scale web browsers and Word up quite a bit on top of that, so the text is large and very clear.
I thought I read somewhere that the whole 4K thing in movies, etc., is more than most people can even distinguish. I don't play games, and I've never tried 4K, but I'm also one of those people who wouldn't spend much more money to get it.
1920x1080 works fine for me. Does 4K really look that much better for games and such? Or maybe if you're real close on one of those huge monitors?
Yes. Monitors are a much different story. With TVs, depending on the size, the average viewing distance may be too far to distinguish between HD and UHD. But with a monitor, unless it's really, really small, you will be able to see individual pixels. On a 27" HD monitor that's only 81 pixels per inch. On a 21" monitor it's 104 PPI; better, but you should still be able to distinguish dots at monitor viewing distances.
For a 28" UHD monitor it's still only 158 PPI, so young 'uns should be able to see pixels, I guess; fortunately my eyes can't make them out at that dot pitch.
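For anyone who wants to check those figures: PPI is just the screen's pixel diagonal divided by its physical diagonal. A quick sketch (the small differences from the numbers above are just rounding):

```python
import math

# PPI = length of the pixel diagonal / diagonal screen size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" HD:  {ppi(1920, 1080, 27):.0f} PPI')  # ~82
print(f'21" HD:  {ppi(1920, 1080, 21):.0f} PPI')  # ~105
print(f'28" UHD: {ppi(3840, 2160, 28):.0f} PPI')  # ~157
```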
4K definitely has its advantages in certain situations. As scott762* says, it is mostly about screen size and viewing distance. On a 65" screen at 10 feet, you can clearly see the difference between 1080p and 4K. On a 100" projector screen, the difference is obvious. If I have a choice, I will choose 4K, but when viewing a 55" monitor from 10 feet away, there are diminishing returns.
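To put some hedged numbers on the size-and-distance point: using the common 1-arcminute 20/20 acuity rule of thumb (an assumption of mine, not a figure from this thread), you can estimate how big a single pixel looks from a given distance. On a 65" screen at 10 feet, a 1080p pixel sits right around that limit while a 4K pixel is well below it, which is probably why opinions differ on how visible the upgrade is at that distance:

```python
import math

# Angular size of one pixel (in arcminutes) at a given viewing distance,
# for a 16:9 screen. Compare against the ~1 arcminute 20/20 acuity rule
# of thumb (assumed here, not taken from the thread).

def pixel_arcmin(diag_in, horiz_px, distance_in, aspect=(16, 9)):
    w, h = aspect
    width_in = diag_in * w / math.hypot(w, h)  # physical screen width
    pitch_in = width_in / horiz_px             # width of a single pixel
    return math.degrees(math.atan2(pitch_in, distance_in)) * 60

dist_in = 10 * 12  # 10 feet in inches
print(f'65" 1080p at 10 ft: {pixel_arcmin(65, 1920, dist_in):.2f} arcmin per pixel')
print(f'65" 4K at 10 ft:    {pixel_arcmin(65, 3840, dist_in):.2f} arcmin per pixel')
```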
I thought they had said that even in a movie theater with the humongous screens, 4K is a bit of overkill 'cuz people can't really distinguish it. But hey, I could have misremembered.
It's interesting, though... what are all the monitor manufacturers gonna do when the pixel density hits the point where additional improvements make zero difference to mortals? I guess they'll have to come up with some other marketing scheme.
And speaking of CPUs, ever notice that the manufacturers seem to release new technologies on a fairly regular and steady basis? Which is good for their bottom line. Though I can't tell whether this "new technology" is something they just came up with, or whether it's a controlled release. Maybe they could release a 10 GHz, 256-core CPU tomorrow, but they don't because that would cut into their steady income for the next 3 decades...
I used to design home theaters, so I was constantly exposed to cutting-edge display technology. Back in 2009 or so, Sony was at CEDIA demoing Spiderman and a couple more of their movies on their prototype 8K projectors on a 170" screen. It was spectacular. It was hard for anyone who saw that demo to go back to their "regular" 1080p displays. 4K is not overkill for the right applications.
BTW, it may be true what you heard, that theater owners think 4K is overkill; they have to pay the costs of upgrading. But theater screens don't hold a candle to a properly set up home theater. They are usually poor quality and flawed, which is why I rarely see a movie in a public theater.
Yeah, you're probably right.
Though I did a quick search and found a CNET article (from 2015, so maybe it no longer applies) that said 4K TVs are "stupid", based on a number of factors. I didn't read it in detail, so maybe it's just a clickbait title, but anyway...
https://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/
EDIT: Looks like the point he's making is that for 4K to be really noticeable, you need a screen of 70 or 80 inches or more, and very few people can even fit something like that in a room.
To steer back on topic a bit, I just read an article about the Threadripper 1900X... dang. Rather than using just one die, it uses two, with only 4 cores enabled on each, connected by AMD's Infinity Fabric. Apparently they need to do that for the quad-channel memory support. The bad news in my mind is that this likely increases the latency for accessing memory on the other die. Still no reviews on it yet, so we'll have to see.
And all of that is somewhat irrelevant if it doesn't significantly affect render times or anything else we do with our computers. Personally, if the latency increases and that affects my render times by only 0.01%, I couldn't care less. Is memory latency a big issue in real life?
I don't think there is a universal answer; performance-wise, some apps are more impacted than others. For example, many games typically favor fast single-core speed and low memory latency.
Still, all those PCIe lanes should be a benefit.