What CPU to buy for DAZ3D, 1800x, TR 1950x, i7 5960X, i7 9600K


Comments

  • kyoto kid Posts: 41,854
    edited September 2017
    ...I was looking at a Broadwell 6-core i7 last night and its specs indicate support for 40 PCIe lanes. I can get 40 lanes with two old 8-core Sandy Bridge Xeons, which together would cost about $350.
    Post edited by kyoto kid on
  • Good point about newer CPUs only supporting the spyware- and memory-clogging Win10, and about PCIe lanes.

    Intel CPUs are severely crippled on lanes; using M.2 SSDs is not an option if you're going to run several GPUs as well with high bandwidth.

    But does bandwidth matter for rendering? Do we need it?

    I think bandwidth only matters while the 'kernel' is generated for the render task and the data is sent to GPU RAM; after that, I think only minimal amounts of bandwidth are used. So slower bandwidth to the GPU might not be a bottleneck for more than a few seconds.

    What I wrote above is what I think based on what I know from other things, but I'm not sure if it applies to rendering with DAZ Studio. If anyone knows for sure, please make things clear! :)
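
    Just to put rough numbers on the upload step (these are back-of-the-envelope assumptions, not measurements - scene size, PCIe generation and copy efficiency all vary):

        # Rough estimate of how long a scene upload to the GPU takes over PCIe.
        # Scene size and copy efficiency are assumptions for illustration only.

        def transfer_seconds(scene_gb, lanes, gb_per_lane=0.985, efficiency=0.8):
            """Seconds to copy scene_gb gigabytes over `lanes` PCIe 3.0 lanes.

            PCIe 3.0 is good for roughly 0.985 GB/s per lane on paper; real
            host-to-device copies usually reach only a fraction of that.
            """
            usable = lanes * gb_per_lane * efficiency  # GB/s actually achieved
            return scene_gb / usable

        for lanes in (16, 8, 4):
            # assume an 8 GB scene (textures + geometry) as a heavy-ish case
            print(f"x{lanes}: ~{transfer_seconds(8.0, lanes):.1f} s to upload 8 GB")

    Even at x4 that works out to a couple of seconds, which is why I suspect lane count mostly matters for the upload and not for the render itself - but again, corrections welcome from anyone who has measured it.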

  • JamesJAB Posts: 1,766

    I would go with the 8-core Threadripper.
    Important bits:

    • Quad-channel RAM
    • 64 PCIe lanes (see the lane-budget sketch below)
    • Cost (allows for a high-end GPU purchase)
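
    To make the lane point concrete, here's a minimal lane-budget sketch. The device widths are typical full-width values assumed for illustration, not a statement about any particular board (chipset lanes and x8 fallback complicate real builds):

        # Minimal PCIe lane budget: does everything fit at full width?
        # Device widths below are typical values assumed for illustration.

        devices = {
            "GPU #1": 16,
            "GPU #2": 16,
            "NVMe M.2 SSD": 4,
            "10GbE / capture card": 4,
        }

        wanted = sum(devices.values())
        print(f"Lanes wanted at full width: {wanted}")

        for platform, lanes in [("mainstream, 16 lanes", 16),
                                ("HEDT, 28 lanes", 28),
                                ("HEDT, 40 lanes", 40),
                                ("Threadripper, 64 lanes", 64)]:
            verdict = ("everything at full width" if lanes >= wanted
                       else "GPUs drop to x8 or devices share lanes")
            print(f"{platform}: {verdict}")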
  • kyoto kid Posts: 41,854

    ...I'll stick with pre-Kaby Lake CPUs as they allow me to work under W7.  W10-only compatibility is not worth hamstringing the VRAM capacity of an expensive high-memory GPU card.

    Again, unless you are going to do extreme multitasking, Daz will not use all those cores (unless the render dumps from the GPU to the CPU).  If you are also working in Carrara that is a different story, since Carrara natively only supports CPU rendering and will make use of up to 100 CPU cores/threads.

    ...oh to have a 32-core Epyc workstation for that.

  • nicstt Posts: 11,715

    There are benefits to multi-core CPUs that aren't mentioned so much.

    You can do more. I use multiple instances of Daz, and they definitely help there, and when the scene won't fit on the card, despite optimising, CPU it is.

  • Gator Posts: 1,319
    kyoto kid said:

    ...I'll stick with pre-Kaby Lake CPUs as they allow me to work under W7.  W10-only compatibility is not worth hamstringing the VRAM capacity of an expensive high-memory GPU card.

    Again, unless you are going to do extreme multitasking, Daz will not use all those cores (unless the render dumps from the GPU to the CPU).  If you are also working in Carrara that is a different story, since Carrara natively only supports CPU rendering and will make use of up to 100 CPU cores/threads.

    ...oh to have a 32-core Epyc workstation for that.

    Win 10 is totally worth it with a higher resolution monitor.  There is no real DPI scaling support in Win 7.  With a 4K resolution monitor, it sucks.  Really bad.

    It kinda sucks with the VRAM reservation; I hope there is a solution eventually, but I haven't run into a problem going from Win 7 to Win 10.

  • ebergerly Posts: 3,255
    JamesJAB said:

    I would go with the 8-core Threadripper.
    Important bits:

    • Quad-channel RAM
    • 64 PCIe lanes
    • Cost (allows for a high-end GPU purchase)

    FWIW, I think TechDeals' take was pretty much "don't bother with Threadripper" in favor of a Ryzen 7 1700 for most people. Again, it depends on your use, but I think if you get a 10% performance boost on some stuff and it costs $500, you might step back and re-evaluate.
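
    A trivial way to sanity-check that kind of trade-off (the dollar figure, the 10%, and the monthly hours below are placeholders, not benchmark results):

        # Toy "is X% more performance worth $Y?" check.
        # Every number here is a placeholder assumption, not a benchmark.

        extra_cost = 500            # assumed price difference in dollars
        speedup_pct = 10            # assumed performance gain in percent
        cpu_hours_per_month = 40    # assumed hours of CPU-bound work per month

        # a machine that is speedup_pct faster finishes the same work sooner
        hours_saved = cpu_hours_per_month * speedup_pct / (100 + speedup_pct)

        print(f"${extra_cost / speedup_pct:.0f} per percentage point of speedup")
        print(f"~{hours_saved:.1f} hours saved per month on that workload")

    Whether that's worth it obviously depends on how much of your month is actually spent waiting on the CPU.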
  • Gator Posts: 1,319
    ebergerly said:
    JamesJAB said:

    I would go with the 8-core Threadripper.
    Important bits:

    • Quad-channel RAM
    • 64 PCIe lanes
    • Cost (allows for a high-end GPU purchase)

     

    FWIW, I think TechDeals' take was pretty much "don't bother with Threadripper" in favor of a Ryzen 7 1700 for most people. Again, it depends on your use, but I think if you get a 10% performance boost on some stuff and it costs $500, you might step back and re-evaluate.

    Keep in mind, those of us rendering with 2 or more high-end consumer or prosumer cards probably don't fall under the "most users" category.  wink

  • nicstt Posts: 11,715
    ebergerly said:
    JamesJAB said:

    I would go with the 8-core Threadripper.
    Important bits:

    • Quad-channel RAM
    • 64 PCIe lanes
    • Cost (allows for a high-end GPU purchase)

     

    FWIW, I think TechDeals' take was pretty much "don't bother with Threadripper" in favor of a Ryzen 7 1700 for most people. Again, it depends on your use, but I think if you get a 10% performance boost on some stuff and it costs $500, you might step back and re-evaluate.

    Keep in mind, those of us rendering with 2 or more high-end consumer or prosumer cards probably don't fall under the "most users" category.  wink

    +1

  • Win 10 is totally worth it with a higher resolution monitor.  There is no real DPI scaling support in Win 7.  With a 4K resolution monitor, it sucks.  Really bad.

    It kinda sucks with the VRAM reservation; I hope there is a solution eventually, but I haven't run into a problem going from Win 7 to Win 10.

     

    I wasn't aware that a 4K monitor image would look different under Win 7 compared to Win 10 - thanks for posting this, it's good to know.

    I had put off upgrading to Windows 10, but I finally took the leap 10 months ago because under Windows 7 you can't take full advantage of VR.  So for me it's a choice between losing some VRAM or losing some functionality of the Oculus.  Ideally it would be nice to have different powerful PCs for different jobs, but most people's computers have to compromise somewhere.  Agreed though, Microsoft really does need to do something about this.

  • kyoto kid Posts: 41,854
    edited September 2017

    ...yeah, if you also use your system for gaming, VR, and streaming films, then unfortunately W10 is the only option (doesn't 8.1 at least support 4K resolution?).  For just production and rendering, I don't see it as all that crucial to wind up with the "idiotic-syncrasies" of W10 like force-fed updating, rubbish features like Cortana (which requires hacking the registry to totally disable), and the issue with reserving VRAM.

    My rule of thumb: the OS should never get in the way of what you do.

    From what I've heard, Nvidia's Quadro line supposedly doesn't suffer from the VRAM issue. However, how many of us can afford a GPU card that costs as much as, or more than, an entire system?

    Post edited by kyoto kid on
  • JamesJAB Posts: 1,766

    It's not that earlier versions of Windows can't output 4K resolutions.  The issue lies in the OS's ability to scale the elements of your desktop and apps to make them a usable and readable size at that resolution.

  • kyoto kid Posts: 41,854

    ...again, the only reason I see 4K as being useful is for gaming and viewing films. As I do neither on my workstation, W7 is fine. I also don't have 4K displays, so it's pretty moot.

  • Gator Posts: 1,319
    kyoto kid said:

    ...again, the only reason I see 4K as being useful is for gaming and viewing films. As I do neither on my workstation, W7 is fine. I also don't have 4K displays, so it's pretty moot.

    Not sure if serious...

    The average screen is HD (1920x1080); you can't see having 4 TIMES the pixels as being useful?  All the apps that support DPI scaling are wonderful.  Scale text up and it is so sharp & clear and easy to read.  Like it's printed on quality paper.

    And Daz Studio?  On my HD monitor it feels like I'm trying to run it on my phone.  laugh

  • drzap Posts: 795
    kyoto kid said:

    ...again, the only reason I see 4K as being useful is for gaming and viewing films. As I do neither on my workstation, W7 is fine. I also don't have 4K displays, so it's pretty moot.

    Not sure if serious...

    The average screen is HD (1920x1080); you can't see having 4 TIMES the pixels as being useful?  All the apps that support DPI scaling are wonderful.  Scale text up and it is so sharp & clear and easy to read.  Like it's printed on quality paper.

    And Daz Studio?  On my HD monitor it feels like I'm trying to run it on my phone.  laugh

    You're a young guy, right?  It's an age thing.  I run my 27" monitors at 2.5K (except for the drawing tablet). That's almost perfect for my eyes.  Sometimes I need to scale the text up.  While I shoot in 4K, I can't bear to preview it on anything smaller than 32".  As you age, the eyes are one of the first things to go.

  • Gator Posts: 1,319
    drzap said:
    kyoto kid said:

    ...again, the only reason I see 4K as being useful is for gaming and viewing films. As I do neither on my workstation, W7 is fine. I also don't have 4K displays, so it's pretty moot.

    Not sure if serious...

    The average screen is HD (1920x1080); you can't see having 4 TIMES the pixels as being useful?  All the apps that support DPI scaling are wonderful.  Scale text up and it is so sharp & clear and easy to read.  Like it's printed on quality paper.

    And Daz Studio?  On my HD monitor it feels like I'm trying to run it on my phone.  laugh

    You're a young guy, right?  It's an age thing.  I run my 27" monitors at 2.5K (except for the drawing tablet). That's almost perfect for my eyes.  Sometimes I need to scale the text up.  While I shoot in 4K, I can't bear to preview it on anything smaller than 32".  As you age, the eyes are one of the first things to go.

    Not that young.  Argh, tell me about it, my eyes are going too.  I'm running Win 10 with two monitors, a 28" 4K and a 27" 2K.  My DPI is scaled up quite a bit on the 4K screen, which is also the main screen.  I scale web browsers and Word up quite a bit again, so the text is large and very clear.

  • kyoto kid Posts: 41,854
    edited September 2017
    ...old eyes here, and 1920 x 1080 is fine enough. For one, do Daz and Carrara even support 4K? Second, I am primarily producing images for Net-based publishing and printing. The average person who reads or downloads a comic or novel is at best likely running at a lower resolution than 4K (particularly on a tablet or Kindle), and for printing purposes, display resolution is meaningless.
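
    What matters for print is the render's pixel dimensions against the target size and DPI, not the display it was composed on - a quick sketch, assuming a common 300 DPI print target:

        # Largest print a render supports at a given DPI. The 300 DPI target
        # is a common print assumption, not a rule.

        def max_print_size(width_px, height_px, dpi=300):
            """Largest print, in inches, this render supports at the given DPI."""
            return width_px / dpi, height_px / dpi

        for w, h in [(1920, 1080), (3840, 2160)]:
            pw, ph = max_print_size(w, h)
            print(f"A {w}x{h} render covers about {pw:.1f} x {ph:.1f} inches at 300 DPI")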
    Post edited by kyoto kid on
  • ebergerly Posts: 3,255

    I thought I read somewhere that the whole 4K thing in movies, etc., is more than most people can even distinguish. I don't play games, and I've never tried 4K, but I'm also one of those people who wouldn't spend much more money to get 4K.

    1920x1080 works fine for me. Does 4K really look that much better for games and such? Or maybe if you're really close on one of those huge monitors?

  • Gator Posts: 1,319
    ebergerly said:

    I thought I read somewhere that the whole 4K thing in movies, etc., is more than most people can even distinguish. I don't play games, and I've never tried 4K, but I'm also one of those people who wouldn't spend much more money to get 4K.

    1920x1080 works fine for me. Does 4K really look that much better for games and such? Or maybe if you're really close on one of those huge monitors?

    Yes.  Monitors are a much different story.  With TVs, depending on size, the average viewing distance may be too far to distinguish between HD and UHD.  But with a monitor, unless it's really, really small, you will be able to see individual pixels.  On a 27" HD monitor that's only about 82 pixels per inch.  On a 21" monitor it's about 105 PPI, better, but you should still be able to distinguish dots at monitor viewing distances.

    For a 28" UHD monitor it's still only about 157 PPI; young 'uns should be able to see pixels, I guess, but fortunately my eyes can't make them out at that dot pitch.  laugh
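
    Those pixel-count and PPI figures are easy to check with the usual diagonal formula (quick sketch, values rounded):

        # Quick check of the pixel-count and pixels-per-inch numbers above.
        import math

        def ppi(width_px, height_px, diagonal_inches):
            """Pixels per inch for a display of given resolution and diagonal."""
            return math.hypot(width_px, height_px) / diagonal_inches

        hd, uhd = (1920, 1080), (3840, 2160)
        print(f"UHD has {uhd[0] * uhd[1] / (hd[0] * hd[1]):.0f}x the pixels of HD")

        for label, (w, h), size in [('27in HD', hd, 27),
                                    ('21in HD', hd, 21),
                                    ('28in UHD', uhd, 28)]:
            print(f"{label}: {ppi(w, h, size):.1f} PPI")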

  • drzap Posts: 795
    edited September 2017

    4K definitely has its advantages in certain situations.  As scott762* says, it is mostly about screen size and viewing distance.  On a 65" screen @ 10', you can clearly see the difference between 1080p and 4K.  On a 100" projector screen, the difference is obvious.  If I have a choice I will choose 4K, but if viewing a 55" monitor from 10 feet away, there are diminishing returns.

    Post edited by drzap on
  • ebergerly Posts: 3,255

    I thought they had said that even in a movie theater with the humongous screens, 4K is a bit of overkill cuz people can't really distinguish. But hey, I could have misremembered.

    It's interesting though... what are all the monitor manufacturers gonna do when pixel density hits the point that additional improvements make zero difference to mortals? I guess they'll have to come up with some other marketing scheme.

    And speaking of CPUs, ever notice that the manufacturers seem to release new technologies on a fairly regular and steady basis? Which is good for their bottom line. I can't help wondering whether this "new technology" is something they just came up with, or whether it's a controlled release. Maybe they could release a 10 GHz, 256-core CPU tomorrow, but they don't because that would cut their steady income for the next 3 decades... smiley

  • drzap Posts: 795
    edited September 2017
    ebergerly said:

    I thought they had said that even in a movie theater with the humongous screens, 4K is a bit of overkill cuz people can't really distinguish. But hey, I could have misremembered.

     

    I used to design home theaters, so I was constantly exposed to cutting-edge display technology.  Back in 2009 or so, Sony was at CEDIA demoing Spider-Man and a couple more of their movies on their prototype 8K projectors on a 170" screen.  It was spectacular.  It was hard for anyone who saw that demo to go back to their "regular" 1080p displays.  4K is not overkill for the right applications.

    Btw, it may be true what you heard, that theater owners think 4K is overkill.  They have to pay the costs of upgrading.  But theater screens don't hold a candle to a properly set up home theater.  They are usually poor quality and flawed, which is why I rarely see a movie in a public theater.

    Post edited by drzap on
  • ebergerly Posts: 3,255
    edited September 2017

    Yeah, you're probably right. 

    Though I did a quick search and found a CNET article (from 2015, so maybe it no longer applies) that said 4K TVs are "stupid", based on a number of factors. I didn't read it in detail, so maybe it's just a clickbait title, but anyway...

    https://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupid/

    EDIT: Looks like the point he's making is that for 4K to be really noticeable, you need a monitor greater than like 70 or 80 inches. And his point is that very few people can even fit something like that in a room.
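
    That roughly checks out against the common one-arcminute (~20/20) acuity rule of thumb, for what it's worth. Very simplified model - it ignores contrast, content, and how good your eyes actually are:

        # At what screen size does 4K start to matter if the eye resolves
        # about one arcminute? Simplified rule of thumb, not gospel.
        import math

        ARCMIN = math.radians(1 / 60)  # one arcminute, in radians

        def min_diagonal_for_4k_benefit(distance_ft, base_res=(1920, 1080)):
            """Smallest diagonal (inches) at which a 1080p pixel still spans
            more than one arcminute at this distance, i.e. where stepping up
            to 4K is plausibly visible."""
            pixel_pitch = distance_ft * 12 * math.tan(ARCMIN)  # inches
            return pixel_pitch * math.hypot(*base_res)

        for feet in (6, 8, 10):
            size = min_diagonal_for_4k_benefit(feet)
            print(f"At {feet} ft, 4K starts to matter above roughly {size:.0f} inches")

    At a typical 9-10 foot couch distance that lands right around the 70-80 inch mark, though a monitor at arm's length is a different story.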

    Post edited by ebergerly on
  • Gator Posts: 1,319

    To steer back on topic a bit, I just read an article about the Threadripper 1900X... dang.  Rather than using just one die, it uses two with only 4 cores enabled on each, connected with AMD's Infinity Fabric.  Apparently they need to do that for the quad-channel memory support.  The bad news in my mind is that it likely increases the latency for accessing memory on the other die.  Still no reviews on it yet, so we'll have to see.

  • ebergerly Posts: 3,255

    To steer back on topic a bit, I just read an article about the Threadripper 1900X... dang.  Rather than using just one die, it uses two with only 4 cores enabled on each, connected with AMD's Infinity Fabric.  Apparently they need to do that for the quad-channel memory support.  The bad news in my mind is that it likely increases the latency for accessing memory on the other die.  Still no reviews on it yet, so we'll have to see.

    And all of that is somewhat irrelevant if it doesn't significantly affect render times or anything else we do with our computers. Personally, if the latency increases and that affects my render times by only 0.01%, I couldn't care less. Is memory latency a big issue in real life? 
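
    A crude way to frame it (every number below is a made-up placeholder to show how the pieces multiply, not a Threadripper measurement):

        # Amdahl-style guess at how much extra cross-die memory latency could
        # stretch a CPU render. All inputs are assumed placeholders.

        def render_time_increase(mem_stall_fraction, remote_fraction, latency_penalty):
            """Fractional increase in total render time.

            mem_stall_fraction -- share of render time actually stalled on memory
            remote_fraction    -- share of those accesses that land on the other die
            latency_penalty    -- extra latency of a remote access (0.5 = +50%)
            """
            return mem_stall_fraction * remote_fraction * latency_penalty

        bump = render_time_increase(mem_stall_fraction=0.15,
                                    remote_fraction=0.5,
                                    latency_penalty=0.5)
        print(f"~{bump:.1%} longer render under these assumptions")

    Unless the memory-stalled share is large, the cross-die penalty gets diluted pretty quickly.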

  • Gator Posts: 1,319
    ebergerly said:

    To steer back on topic a bit, I just read an article about the Threadripper 1900X... dang.  Rather than using just one die, it uses two with only 4 cores enabled on each, connected with AMD's Infinity Fabric.  Apparently they need to do that for the quad-channel memory support.  The bad news in my mind is that it likely increases the latency for accessing memory on the other die.  Still no reviews on it yet, so we'll have to see.

    And all of that is somewhat irrelevant if it doesn't significantly affect render times or anything else we do with our computers. Personally, if the latency increases and that affects my render times by only 0.01%, I couldn't care less. Is memory latency a big issue in real life? 

    I don't think there is a universal answer; performance-wise, some apps are more impacted than others.  For example, many games typically favor fast single-core speed and low memory latency.

    Still, all those PCIe lanes should be a benefit.
