Just got my new i7-6950X (10-core CPU) and 1080 Ti

124 Comments

  • StratDragon Posts: 3,273

    Bob, I think you need to get a second opinion on this CPU, which I will magnanimously give you if you send one my way 'cus that's just the kind of guy I am.

     

     

     

  • Bobvan Posts: 2,653
    edited May 2017

    I have no skill to build a machine either; like I said, the last one is 4 years old and still going, retired from rendering and going to my wife. I'm not smarter or better than anybody, I was just in a position where I was able to do it. My wife said, "You're building a new machine, just get it like that, you won't regret it after." I have to give PA_ThePhilosophe props for posting, since I caught his post just as I walked in the door after having ordered the machine. They had mentioned it, but I did not fully get that it would be relevant until I read his posting...

    Post edited by Bobvan on
  • Bobvan Posts: 2,653

    Bob, I think you need to get a second opinion on this CPU, which I will magnanimously give you if you send one my way 'cus that's just the kind of guy I am.

    Long time....:)

     

     

     

     

  • kyoto kid Posts: 41,847
    ghosty12 said:
    kyoto kid said:
     

    The decrease in video RAM is a minor impact that will affect less than 5% of the renders you'd ever consider doing, and even for those there's an easy workaround: simply do compositing.

    You don't see the real problem. Buying a video card with 11 GB and ending up with 8.5 means you lost a few thousand dollars just because of the OS. Given the current state of hardware, 8 or 11 GB for rendering is really small. That's why CPU rendering is still used in motion film. I don't think you can tell Pixar to go compositing with their assets. Memory is the biggest limiting factor, and ending up with as much memory as the previous generation of hardware with your shiny new toy is at best frustrating.

    I can only advise a dual (or triple) boot (I have a Linux install). A new disk is not expensive.

    ...part of the reasoning behind my dual 8-core Xeon, 128GB quad channel DDR3 render beast, which will be driven by W7 Pro. Not many scenes, even of the epic level I create, are going to go into swap mode.  Yeah, it will have a 1080 Ti, which, if its memory is exceeded, won't suffer as much given the backup horsepower this system has.

    The other reason for it is I am looking at rendering in large scale format for gallery quality prints.

    Sooner or later you will have to upgrade your OS, since official extended support for Win 7 ends in 2020, which is only 2.5 years away and that is not that long.. This is probably why there is so much talk of how having 2 cards in one system is useful: one for display, the other for rendering.. And soon enough it will be possible to have 2 full x16 slots for video cards, as at the moment having 2 cards drops those slots to x8..

    ...if the system is a standalone (no connection to the Net) it will not matter. I have already disabled updating on my current W7 workstation, as last October MS rolled all security updates into one monthly bundle so you are no longer able to pick and choose which files you need or want. Hence, if one of the files in the bundle has a bug, you are stuck with it until the next month's patch when they (hopefully) fix it. As it is, my workstation is already offline, as I no longer have a hardwired connection and the system does not have wireless (by design). Yeah, it's a pain to install new content and I can no longer install new betas, but that's a small price.
  • kyoto kid Posts: 41,847
    Bobvan said:

     Is my CPU not crazy fast as well?

    Comparatively speaking, CPU is glacial compared to GPU rendering.. so it will do in a worst-case scenario... but will be reaaaaaaaaaaaaaaaaaally slow (like hours and hours to render)

    ...not as bad as Luxrender where CPU render times are often measured in days instead of hours.
  • kyoto kid Posts: 41,847
    edited May 2017
    nicstt said:
     

     

    nicstt said:
    kyoto kid said:
    Bobvan said:

    It has Win 10.. I was not planning on changing it; I always liked it on my 2 machines and it runs better than my old Win 7 laptop..

    ...W10 Home is a PITA when it comes to updating and GPU memory reserving.  Again, under W10 that 1080 Ti will only allow you to use about 8.6 GB for rendering because of the VRAM the OS reserves.  Get an OEM copy of W7 Pro, as W7 has an almost negligible footprint on a GPU card's VRAM.  If you have scenes that exceed the available VRAM, Iray will dump to slower CPU mode and all those CUDA cores will be worthless.

    I've posted in other threads that I don't usually have RAM reserved on the 980 Ti, which isn't used for video but only for rendering.

    Well, 3MB is reserved, which isn't really an issue.

    I use W10 Pro.

    Opening up Studio (4.9.3.166 beta) ups it from 3MB to 50ish MB.

    That's interesting. I guess you have another GFX card used for the display. Can you detail your hardware and software setup as well as the driver versions used?

    I do, I use a 970 for display.

    I have W10 locked down tight, loads of stuff disabled; I check for updates (for example) when my AV suite tells me there are some, if I haven't done so already.

    I also have the Nvidia Control Panel Compute option turned on, although this doesn't seem to make a difference.

    I'm using the latest version of W10, but I had Windows performing the same before the upgrade. If all goes well, in a few days I'll update my disk image to the new version.

    Nvidia Driver: 378.78

    Windows 10 Pro Version 1703, build: 15063.296

    i7 4770K (Haswell), 16GB RAM, Gigabyte Z87X UD5H

    Edit:

    added image

    ...yes, but you have W10 Pro, which offers more options (like deferring updates and being able to completely disable and even remove nuisance "features"), which Home Edition doesn't permit. If I had accepted the "free" upgrade offer last year, I'd be stuck with Home Edition.
    Post edited by kyoto kid on
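
    For anyone who wants to check the reserved/used VRAM figures being discussed above on their own card, here is a minimal sketch. It assumes the nvidia-ml-py package (NVIDIA's Python bindings for NVML) is installed; the same numbers can also be read with the nvidia-smi tool that ships with the driver.

        # pip install nvidia-ml-py   (provides the "pynvml" module)
        import pynvml

        pynvml.nvmlInit()
        for i in range(pynvml.nvmlDeviceGetCount()):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            name = pynvml.nvmlDeviceGetName(handle)
            if isinstance(name, bytes):          # older bindings return bytes
                name = name.decode()
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # values are in bytes
            print(f"{name}: {mem.used / 2**20:.0f} MB used, "
                  f"{mem.free / 2**20:.0f} MB free of {mem.total / 2**20:.0f} MB")
        pynvml.nvmlShutdown()
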
  • Ghosty12 Posts: 2,080
    kyoto kid said:
    ghosty12 said:
    kyoto kid said:
     

    The decrease in video RAM is a minor impact that will affect less than 5% of the renders you'd ever consider doing, and even for those there's an easy workaround: simply do compositing.

    You don't see the real problem. Buying a video card with 11 GB and ending up with 8.5 means you lost a few thousand dollars just because of the OS. Given the current state of hardware, 8 or 11 GB for rendering is really small. That's why CPU rendering is still used in motion film. I don't think you can tell Pixar to go compositing with their assets. Memory is the biggest limiting factor, and ending up with as much memory as the previous generation of hardware with your shiny new toy is at best frustrating.

    I can only advise a dual (or triple) boot (I have a Linux install). A new disk is not expensive.

    ...part of the reasoning behind my dual 8-core Xeon, 128GB quad channel DDR3 render beast, which will be driven by W7 Pro. Not many scenes, even of the epic level I create, are going to go into swap mode.  Yeah, it will have a 1080 Ti, which, if its memory is exceeded, won't suffer as much given the backup horsepower this system has.

    The other reason for it is I am looking at rendering in large scale format for gallery quality prints.

    Sooner or later you will have to upgrade your OS, since official extended support for Win 7 ends in 2020, which is only 2.5 years away and that is not that long.. This is probably why there is so much talk of how having 2 cards in one system is useful: one for display, the other for rendering.. And soon enough it will be possible to have 2 full x16 slots for video cards, as at the moment having 2 cards drops those slots to x8..

    ...if the system is a standalone (no connection to the Net) it will not matter. I have already disabled updating on my current W7 workstation, as last October MS rolled all security updates into one monthly bundle so you are no longer able to pick and choose which files you need or want. Hence, if one of the files in the bundle has a bug, you are stuck with it until the next month's patch when they (hopefully) fix it. As it is, my workstation is already offline, as I no longer have a hardwired connection and the system does not have wireless (by design). Yeah, it's a pain to install new content and I can no longer install new betas, but that's a small price.

    Ahh k, well if the system is isolated from the outside world then yeah, there is no need for a newer OS. As for why I went to 10, my version of Win 7 is the Home version and as such I am limited to 16 GB of RAM, which is a right pain in the rear..

  • Subtropic Pixel Posts: 2,388

    There's been a lot said in these 4 pages of posts, and I think one of the most important points is being lost in the forest.  So, just to emphasize the main one that I see:  Building any new machine with Windows 7 (or even Windows 8.1) is not the best path.

    Kyoto Kid has a special situation.  Given his processing requirements, available funding, and available spare time, he has decided to accept the risks of doing it his way.  That's his choice.  But I will say that he probably has to spend more time than he realizes just to make all this work and to keep it all working; babying his environment along the way.  Even if that's not the case, he most likely has to accept some fairly significant drawbacks. 

    For example, keeping that computer off the internet and accepting the idea that you can't use it for multiple purposes (researching for the piece you're working on, for example) or that you will never be able to upgrade it.  I could never accept such limitations in this age of modern computing.  I didn't even do that in 1998 when I first started using Cubase to compose and record music, and back then we didn't even have auto-updating systems and the WWW still had a lot of gaps in usability and information availability.

    Everybody should also note that some of Intel's and AMD's newest processors won't boot with obsolete versions of the Windows OS.  Welcome to the new world.  They'll work with Linux, etc; but Windows 10 will be the only Windows that will work.  Maybe you can find a way to circumvent that, but then you'll have to accept even more limitations in your system, as well as risks that one day your circumvention may be eliminated.  If you're using your computer to put food on your family's table, that would be an unacceptable risk, I would think.  In my opinion, making DAZ Studio "Linuxable" is not the solution either, though I won't go into that here because it's outside of the scope of my point.

    In the end, every thinking person has to ask himself or herself if all this trouble, effort, and reduced functionality really is an efficient/sensible use of their time, just to keep their computing system running under old precepts. 

    Maybe the answer is "yes" for some.  But not for me, and probably not for anybody who is using their computer for mission critical work, and almost definitely not for any Generation Z person (whose oldest are now in their early 20s) getting started in this activity, whether for work or play.

    Kyoto Kid, I'm not picking on you.  But I'm working with more and more younger people in my day job and I've observed a lot of big differences in how they think, how they work, and what they expect from their computing systems.  The common thread I see is that living with self-made computing limitations (such as making a computer not able to go online) is not desirable to them.  I've had to rethink many of my own preconceptions in the last 6 months or so because of this; hence this digression of a post.

    Thank you all for your consideration of this.

  • Bobvan Posts: 2,653
    edited May 2017

    Just like I am being told by some that adding the 980 to run my display could possibly lead to conflicts.. It does not really bother me since I can do other things with the ROG laptop.

    Post edited by Bobvan on
  • nicstt Posts: 11,715
    kyoto kid said:
    nicstt said:
     

     

    nicstt said:
    kyoto kid said:
    Bobvan said:

    It has Win 10.. I was not planning on changing it; I always liked it on my 2 machines and it runs better than my old Win 7 laptop..

    ...W10 Home is a PITA when it comes to updating and GPU memory reserving.  Again, under W10 that 1080 Ti will only allow you to use about 8.6 GB for rendering because of the VRAM the OS reserves.  Get an OEM copy of W7 Pro, as W7 has an almost negligible footprint on a GPU card's VRAM.  If you have scenes that exceed the available VRAM, Iray will dump to slower CPU mode and all those CUDA cores will be worthless.

    I've posted in other threads that I don't usually have RAM reserved on the 980 Ti, which isn't used for video but only for rendering.

    Well, 3MB is reserved, which isn't really an issue.

    I use W10 Pro.

    Opening up Studio (4.9.3.166 beta) ups it from 3MB to 50ish MB.

    That's interesting. I guess you have another GFX card used for the display. Can you detail your hardware and software setup as well as the driver versions used?

    I do, I use a 970 for display.

    I have W10 locked down tight, loads of stuff disabled; I check for updates (for example) when my AV suite tells me there are some, if I haven't done so already.

    I also have the Nvidia Control Panel Compute option turned on, although this doesn't seem to make a difference.

    I'm using the latest version of W10, but I had Windows performing the same before the upgrade. If all goes well, in a few days I'll update my disk image to the new version.

    Nvidia Driver: 378.78

    Windows 10 Pro Version 1703, build: 15063.296

    i7 4770K (Haswell), 16GB RAM, Gigabyte Z87X UD5H

    Edit:

    added image

     

    ...yes, but you have W10 Pro, which offers more options (like deferring updates and being able to completely disable and even remove nuisance "features"), which Home Edition doesn't permit. If I had accepted the "free" upgrade offer last year, I'd be stuck with Home Edition.

    The method I use for controlling updates worked when I tried W10 Home on my laptop; I removed W10 from the laptop for other reasons.

     

  • kyoto kid Posts: 41,847
    edited May 2017

    ...being a little "old school" (I began working with computers in the days of punched cards and punched tape), true, I most likely am an exception to the mainstream.  I'm used to "patching" things together and making them work.  Back in the late 80s a friend and I actually managed to network a couple of Mac 512s together (which was thought couldn't be done). In my old job of multimedia development I reverse engineered the process for formatting typesetting tapes.  Once the programme was run (usually in batch over a number of files) it converted all the typesetting codes to make the text on the screen appear almost exactly as it was in the hard copy publication.  Only a single cleanup/QC pass was required. Previously this was all laboriously done by hand and required multiple QC passes.  I am used to starting with little and having to make things work.

    True, budget is one of my major constraints, so I need to make the best of what I can afford. For our purposes there is little difference between DDR3 and DDR4 (DDR3 can also be configured in quad channel mode).  Two Sandy Bridge 8-core Xeons (a total of 32 CPU threads) cost less than a single 10-core i7 (and far less than the forthcoming 12-core i9 series).  Yeah, they may not be quite as fast (turbo to 3.5GHz), but Xeon architecture is more efficient for heavy computational operations such as rendering.  DirectX 12 may be great if you are a gamer; I am not.

    The sole purpose of the system I am designing will be to set up and render scenes in large format output.  I still have my production workstation, as well as two notebooks, one of which I use for Net access.  So for the "other" computing purposes, I am covered (this way I can be working on a scene, rendering another, while answering emails, performing research, or watching a film without disrupting any of the tasks I am performing).

    W7 works just fine for my needs.  I don't need an OS to tell me the weather, time of day, the daily stock quotes, headlines, that there's an incoming call, or what color of shirt I need to wear that day.  Others may or do need that, and for them, that's fine.  For my needs, a computer is a tool, a machine for getting tasks done.

    Moving to Linux would still be the best solution, but agreed, Daz and most other small software companies do not have the development resources to rewrite and support yet another OS version of their software so it may take some creative solutions on my end.

    If I had the income to support it, I would probably consider more SOTA hardware, most likely moving to 8.1 (with a third party desktop utility to get rid of those ugly tiles), which offers better memory support (512 GB for the Pro version).  All MS really needed to do was improve the kernel and leave pretty much everything else alone instead of forcing the notion of "experiences" on us and treating power users like they knew nothing about maintaining a secure system.  Again, an OS is a set of instruction codes that allow a computer to interact with and run software.  That is all it should be. If people want fancy bells & whistles, those should be made available in the MS app store, not integrated into the base OS.

    Post edited by kyoto kid on
  • Subtropic Pixel Posts: 2,388
    edited May 2017
    kyoto kid said:

    ...an OS is a set of instruction codes that allow a computer to interact with and run software.  That is all it should be. If people want fancy bells & whistles, those should be made available in the MS app store, not integrated into the base OS.

    Your definition of an OS is factually correct, of course. 

    But the world has changed its perception of the meaning of what an OS is.  It has changed from the instruction-set meaning that you say it is to more of a "platform indicator".  Maybe the language will go back, but probably not in the working lifetime of anybody older than 40. 

    Folks in their 40s and above can decide to resist the evolution of syntax and they'd be factually correct in doing so.  But does it help you communicate with those younger than you?  Will that get you to a point of understanding of the world around you, as perceived by the common, nontechnical person, or even the young person versed in new technologies without having ever compiled a COBOL or C++ program in their life?

    To a lot of people Windows contains everything on their C drive.  Because they need Windows and they need all the stuff they installed on their C drive.  So THAT'S Windows to them.  Are you right when you say their C drive is more than just Windows?  Yes.  Are they wrong when they say no, their whole C drive is Windows, because it's THEIR Windows?  No. 

    More than being right, what we need is a meeting of the minds, a common understanding of what things mean and how they work in the real world.  Actually, we kind of have that already.  You mostly know what I mean when I say "I'm on Windows", versus "I'm on Mac".

    Next week, I'll be giving a lecture on how Virtual Storage has been present since the 8 bit machines of the 1960s, including a dramatic story on the disk access methods used for it.  Afterward, we'll have a cookie break for the signing of an online petition to finally make Global Virtual Storage Day (December 32nd) an official paid holiday for all.  Anybody interested?

    Post edited by Subtropic Pixel on
  • kyoto kid Posts: 41,847
    edited May 2017
    ...again, for my purposes, a computer is little more than a tool to accomplish tasks with, just as paints, brushes, and pencils are for working in traditional art forms. This is why I approach the subject of hardware, OS, and software from the perspective I do. I would rather communicate and tell a story through the images I create, just as I did when I could still paint and draw. How I achieve the end result, whether I use "legacy" or SOTA hardware/platforms, is immaterial. A hammer does the same job as a pneumatic nail gun, maybe not as fast, but just as good in skilled hands.
    Post edited by kyoto kid on
  • artphobe Posts: 97

    Congratz on the system... can you please do a test

    Run a GPU only render (do mention how many GPUs are running)

    Run a CPU + GPU render

    Report how much time was shaved off with the 20 threads being used too

    I want to see how much of an impact the CPU has, considering 16+ thread systems will be the norm pretty soon in enthusiast machines.

  • Bobvan Posts: 2,653

    Sorry, how do I actually measure? All I can tell you is that something like this used to sometimes take over 12 hours and now took only a bit over 2?

    (attached render screenshot: 69.png)
  • boisselazon Posts: 458

    Well, on the dark side, here is the bad news: the i9-7980XE is just about out there, coming soon and not so much more expensive, with an amazing 18 cores (36 threads)....

  • Bobvan Posts: 2,653
    edited June 2017

    Well, on the dark side, here is the bad news: the i9-7980XE is just about out there, coming soon and not so much more expensive, with an amazing 18 cores (36 threads)....

    There is always something.. Like I mentioned, I was already having my system built when I saw the post on this processor. Glad I went with it.  I get to crank out lots more renders :)  I still get to use my ROG Laptop to build and set up scenes as well..

     

    BTW I had originally put in a 4 yr old SSD drive but then changed it to a newer one leaving the old one as storage. Runs a bit faster and not as hot..

    Post edited by Bobvan on
  • artphobe Posts: 97
    edited June 2017

    Run that render (smaller size maybe, so it completes in 10 mins).

    When the render is done, move your mouse to the taskbar and alternate between the render window and the main DAZ Studio window. When you move to the DAZ Studio window, it will show the time it took to complete the render.

    Basically, run the render twice: once with GPU only and once with GPU+CPU.

    I just want to see the impact of the CPU (if any) when it is used in rendering alongside your GPU.
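
    If switching between windows is fiddly, the render time can also be read from DAZ Studio's log file. A minimal sketch follows; the default path and the exact wording of the timing line ("Total Rendering Time") are assumptions, so check Help > Troubleshooting > View Log File in DAZ Studio for the real location and phrasing.

        # Minimal sketch: pull render-time lines out of the DAZ Studio log.
        # Both the default path and the "Total Rendering Time" phrase are assumptions;
        # verify them via Help > Troubleshooting > View Log File in DAZ Studio.
        import os

        log_path = os.path.expandvars(r"%APPDATA%\DAZ 3D\Studio4\log.txt")

        with open(log_path, errors="ignore") as f:
            times = [line.strip() for line in f if "Total Rendering Time" in line]

        # Print the most recent renders so the GPU-only and GPU+CPU runs can be compared.
        for line in times[-5:]:
            print(line)
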

    Post edited by artphobe on
  • staigerman Posts: 236
    edited June 2017
    Bobvan said:

    Man this thing flies!

    Hi Bob, how fast is it with Puppy Ray on Howler 11? If you don't have it I'd like to trade you an NFR copy in exchange for a few test renders. Or if you don't mind, try the free demo?  Contact me please. I'm curious if the 1080 does 'Final Render' quality (35 render passes) in realtime (30 fps) on a 1280x720 scene with refractive waterplane enabled, cast shadows enabled (quality=8 for soft shadows), bump map enabled (Rocky or other), and the fog at level 6-9 for peeking into the farther reaches with additional tilings of the surface patch. 

     

    I heard of the new generations of i7 last summer and have been eagerly awaiting their arrival. 10 cores with hyperthreading, that's 20 logical cores. Howler can go up to 64. You can still improve :-)

    Where did you buy it, did you assemble it yourself? I want something with that chip on a laptop eventually.

    Anyway, I am very very VERY(!) jealous :-)

    Let me know if you can spare some time to test render Puppy Ray?

    added: ignore my questions about config and where you got it, I just read the other details in this thread.

    -Philip

     

    Post edited by staigerman on
  • staigerman Posts: 236
    JamesJAB said:

    On some setups with high end GPUs or multiple GPUs, adding the CPU into the mix can slow render times.  With your 20 CPU threads and quad channel memory (hopefully you are running 4 matching sticks) that should be fast enough to give your render times a boost.  It will be up to you to decide if there is enough of a boost to justify the added electricity cost from maxing out a 140W CPU.

    I get a pretty decent boost adding my dual Xeon X5570 CPUs (quad core i7 based xeons) in with my GTX 1060

     

    That's interesting to know that it's best to have 4 memory banks loaded, thanks. So 4x8GB = 32GB is better than 2x16GB, also 32GB?  Is that by a small margin or by a big difference? If all else is the same (same clock speed for the 2x16 option vs. the 4x8), are we looking at an increased rendering speed potential of more than 10%? 20-30%? Depends?

    Is it also beneficial when using the GPU, or is that precisely when it's the most beneficial?
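
    For a rough sense of scale on the dual- vs. quad-channel question (theoretical peak bandwidth only, not a rendering benchmark): populating all four channels doubles the raw memory bandwidth at the same stick speed, but CPU rendering is largely compute-bound, so the real-world gain is usually much smaller than the raw ratio. A back-of-the-envelope sketch, using DDR3-1600 purely as an example figure:

        # Theoretical peak bandwidth = channels * transfer rate (MT/s) * 8 bytes per transfer.
        # DDR3-1600 is used here only as an example figure.
        def peak_bandwidth_gb_s(channels, mt_per_s, bytes_per_transfer=8):
            return channels * mt_per_s * bytes_per_transfer / 1000  # decimal GB/s

        dual = peak_bandwidth_gb_s(2, 1600)   # 2 x 16 GB -> only 2 channels populated
        quad = peak_bandwidth_gb_s(4, 1600)   # 4 x 8 GB  -> all 4 channels populated
        print(f"dual channel: {dual:.1f} GB/s, quad channel: {quad:.1f} GB/s")
        # -> dual channel: 25.6 GB/s, quad channel: 51.2 GB/s
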

     

  • Bobvan Posts: 2,653

    I'm fine with my workflow. I'm not sure what Puppy Ray on Howler 11 is; I can run a few test renders, NP.

  • staigerman Posts: 236
    Bobvan said:

    I'm fine with my workflow. I'm not sure what Puppy Ray on Howler 11 is; I can run a few test renders, NP.

    oops, sorry, I assumed everyone knew by now. lol. Well, there's v10 at Daz but I'm working on getting v11 here too.

    Puppy Ray is a ray casting rendering filter in Dogwaffle's PD Howler and PD Artist (without animation).

    http://www.thebest3d.com/puppyray/

    http://www.thebest3d.com/howler/11/new-in-version-11-puppyray.html

     

     

     

    you can try the demo here:

    http://www.thebest3d.com/dogwaffle/demozone

    Wait, let me try to PM you for more options

     

     

     

  • Bobvan Posts: 2,653
    edited June 2017

    Looks interesting; just busy with projects at the moment. I may get into this more with you if you don't mind waiting a bit.

    Post edited by Bobvan on
  • Takeo.Kensei Posts: 1,303

     

    ghosty12 said:
    Bobvan said:

     Is my CPU not crazy fast as well?

    Comparatively speaking, CPU is glacial compared to GPU rendering.. so it will do in a worst-case scenario... but will be reaaaaaaaaaaaaaaaaaally slow (like hours and hours to render)

    That is because Iray was designed mainly for video cards and not the rest of the system; if Nvidia did a better implementation of hybrid/combined CPU/GPU rendering (not likely to happen) then it might be a different story.. Since, when you think about it, placing the bulk of the workload on one component that is not used to it will shorten its lifespan quickly, like how they say to never ever completely fill an SSD as it severely shortens its lifespan..

    I'm not sure how much nVidia could do - a CPU has far fewer cores than a GPU, although they can do more, so for tasks that fit with the way a GPU works the GPU will always be faster given well-developed code.

    There are many problems to cope with in hybrid rendering, but Nvidia has tools to address them (unified memory, GPUDirect, DirectGMA, etc.); you also need other hardware, though (Tesla, Quadro, InfiniBand, NVLink...).

    With consumer hardware, the bottleneck will be the CPU. In hybrid mode, the CPU has two jobs: rendering and distributing resources to the GPUs, and CPUs are usually not powerful enough to do both; you also have memory bandwidth, speed and latency problems, double transfers, etc. That leads to hybrid rendering being slower than pure GPU rendering, as seen in the Iray benchmark thread. Given the hardware price for consumer GPUs, my POV is that it's better to stick to pure GPU rendering provided everything fits into GPU memory (that's another story).

    These technologies are available in very expensive hardware like the Quadro VCA, and the main focus there is efficiency/low latency for network rendering rather than hybrid rendering.

    If you look at the OpenCL side (Luxrender), there is also barely any development for hybrid rendering. That was a target years ago when GPUs weren't as powerful as today, but now there is really only a marginal advantage to developing that further.

    nicstt said:
     

     

    nicstt said:
    kyoto kid said:
    Bobvan said:

    It has Win 10.. I was not planning on changing it; I always liked it on my 2 machines and it runs better than my old Win 7 laptop..

    ...W10 Home is a PITA when it comes to updating and GPU memory reserving.  Again, under W10 that 1080 Ti will only allow you to use about 8.6 GB for rendering because of the VRAM the OS reserves.  Get an OEM copy of W7 Pro, as W7 has an almost negligible footprint on a GPU card's VRAM.  If you have scenes that exceed the available VRAM, Iray will dump to slower CPU mode and all those CUDA cores will be worthless.

    I've posted in other threads that I don't usually have RAM reserved on the 980 Ti, which isn't used for video but only for rendering.

    Well, 3MB is reserved, which isn't really an issue.

    I use W10 Pro.

    Opening up Studio (4.9.3.166 beta) ups it from 3MB to 50ish MB.

    That's interesting. I guess you have another GFX card used for the display. Can you detail your hardware and software setup as well as the driver versions used?

    I do, I use a 970 for display.

    I have W10 locked down tight, loads of stuff disabled; I check for updates (for example) when my AV suite tells me there are some, if I haven't done so already.

    I also have the Nvidia Control Panel Compute option turned on, although this doesn't seem to make a difference.

    I'm using the latest version of W10, but I had Windows performing the same before the upgrade. If all goes well, in a few days I'll update my disk image to the new version.

    Nvidia Driver: 378.78

    Windows 10 Pro Version 1703, build: 15063.296

    i7 4770K (Haswell), 16GB RAM, Gigabyte Z87X UD5H

    Edit:

    added image

    Thanks. I've seen another option in the Nvidia control panel that makes a big difference in Win7 but can't remember which one.

     

    Bobvan said:

     Is my CPU not crazy fast as well?

    Comparatively speaking, CPU is glacial compared to GPU rendering.. so it will do in a worst-case scenario... but will be reaaaaaaaaaaaaaaaaaally slow (like hours and hours to render)

    Agreed. Most consumer CPUs will only give 0.2 to 0.5 TFLOPS of rendering power. Even the lowest $70 Nvidia GeForce 10 series card (GT 1030) will get you almost 1 TFLOP of calculation power. For comparison, Intel is only now about to release the first teraflop consumer CPU: http://www.zdnet.com/article/intel-unveils-monster-18-core-core-i9-first-teraflop-speed-consumer-pc-chip/

    For rendering, GPUs have the price-per-performance advantage as well as the raw power edge (12 TFLOPS with the Titan Xp).
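
    For anyone wondering where those TFLOPS figures come from, the usual back-of-the-envelope formula for peak FP32 throughput is CUDA cores x 2 FLOPs per clock (one fused multiply-add) x boost clock. The core counts and clocks below are the commonly published reference specs, used here only as an illustration; CPU peak figures depend on vector width and are not directly comparable.

        # Rough peak FP32 throughput: cores * 2 FLOPs per clock (FMA) * boost clock (GHz) / 1000.
        # Reference core counts and boost clocks; treat the results as approximate.
        def peak_tflops(cuda_cores, boost_ghz):
            return cuda_cores * 2 * boost_ghz / 1000

        print(f"GT 1030:      ~{peak_tflops(384, 1.47):.1f} TFLOPS")   # ~1.1
        print(f"GTX 1080 Ti:  ~{peak_tflops(3584, 1.58):.1f} TFLOPS")  # ~11.3
        print(f"Titan Xp:     ~{peak_tflops(3840, 1.58):.1f} TFLOPS")  # ~12.1
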

     

    Everybody should also note that some of Intel's and AMD's newest processors won't boot with obsolete versions of the Windows OS.  Welcome to the new world.  They'll work with Linux, etc; but Windows 10 will be the only Windows that will work.  Maybe you can find a way to circumvent that, but then you'll have to accept even more limitations in your system, as well as risks that one day your circumvention may be eliminated.  If you're using your computer to put food on your family's table, that would be an unacceptable risk, I would think.  In my opinion, making DAZ Studio "Linuxable" is not the solution either, though I won't go into that here because it's outside of the scope of my point.

    I think you should carefully read the article you link to: not (officially) supported hardware is different from not booting. Windows 7 (unofficially) works with Ryzen CPUs and I have no doubt it also (unofficially) works with Kaby Lake CPUs. And from a lot of threads that you can easily find with Google, so far many people see more drawbacks and limitations with Windows 10 than with Windows 7. If Windows 10 were so good there wouldn't be any debate. If Windows 10 were so good, MS wouldn't have to block updates or use other not-so-nice tricks to force Windows 10 adoption.

    I also find that you exaggerate the risks, effort, etc. of continuing to use Windows 7 (or any older Windows version). Any new OS (I include Linux) means, in my case, at least one month of work to get it running the way I want vs. a few hours with Win 7.  If you are that paranoid, you can make a whole-system backup every week or even every day automatically. No need to disconnect from the internet.

    The only big functionality reduction I can see is that you can't use DX12 on Win7. Unless you are a professional gamer (and even then), there are very few Win 10 features that are a 'must have' right now.

    You also forget that the current software ecosystem is still Win7 compatible, and few are using Win 10's hypothetical advanced features (again, DX12 for gaming and... what else?).

     

    Inside the DAZ community the only questions are:

    - does DS work fine on Win 7? Answer is Yes

    - does DS perform better on Windows 10 than Win 7 ? Answer is No

    - On which OS do you have more drawbacks using DS ? Answer is Windows 10 (Locked GPU memory for many users, automatic reboot during renders...)

    OSes are the base layer to access final functionality/features. They are not the primary goal. If an OS "upgrade" degrades your current use, that is not what I call an upgrade, unless it is a conscious choice.

    If your primary goal really were security and updates, then Linux is more secure and you can get updates for any new HW. Some people are running DS on Linux, at the price of other drawbacks that they are aware of.

    The OS choice can be extended to Linux, MacOS, BSD and any other if it fits your needs. Knowing each one's advantages and drawbacks and making a well-informed choice is better than a forced update.

    In the end, every thinking person has to ask himself or herself if all this trouble, effort, and reduced functionality really is an efficient/sensible use of their time, just to keep their computing system running under old precepts. 

    Maybe the answer is "yes" for some.  But not for me, and probably not for anybody who is using their computer for mission critical work, and almost definitely not for any Generation Z person (whose oldest are now in their early 20s) getting started in this activity, whether for work or play.

    Kyoto Kid, I'm not picking on you.  But I'm working with more and more younger people in my day job and I've observed a lot of big differences in how they think, how they work, and what they expect from their computing systems.  The common thread I see is that living with self-made computing limitations (such as making a computer not able to go online) is not desirable to them.  I've had to rethink many of my own preconceptions in the last 6 months or so because of this; hence this digression of a post.

    I'm not picking on you either, but I don't like the MS propaganda you seem to agree with. I do understand their economic position, but if they wanted wider Windows 10 adoption, they should have done it better. It is as simple as that. Windows 2000, XP and Windows 7 in their time received well-deserved praise and adoption since MS did a good job back then. If Windows 10 doesn't get it despite the free upgrade, there is a reason, and that is not just a matter of 'old precepts' or anything like old vs. young.

    I personally also work with lots of 'young' people and I don't know how you work with them, but you seem to privilege 'exclusive thinking'. In my POV confrontation is good, because that's how you can advance in the world, but not exclusive thinking (the MS way with Win 10). What counts is to reach a consensus by way of compromises (not propaganda and bad tricks).

    If you don't see Win 10 drawbacks, good for you, but any potential new Win 10 user should know in advance which disadvantages/problems he/she will get.

    What I advise is to run Win 7 and Win 10 in parallel; that is, in my POV, the best choice (disks are cheap). You don't have to exclude one or the other at the beginning.

  • Bobvan Posts: 2,653

    To a layman like me: I don't need 15 to 24 hours like I did from time to time anymore..

  • kyoto kid Posts: 41,847

    Takeo.Kensei:

    ...good points.  Pretty much all the hype I have seen is "W7/8.1 will not work on Ryzen/Kaby Lake", so it's nice to know that it can be done. I am unconcerned with receiving MS updates for W7 since October of last year, when they rolled all security updates into one bundle to prevent users from "cherry picking".  Hence, I disabled updating altogether. My system runs fine, is still clean, and made it through the "WannaCry" attacks with no trouble (thanks to the AV software and firewall setup I have).

    As to running Daz on Linux: currently the only way to do so requires using Wine, which, as I have been reading on another thread, can be very hit or miss (as well as very frustrating). Some of the ongoing issues include unstable control response, plugins not working properly (or at all), and getting Smart Content to work (I don't use the latter).  The only real fix would be full native support for Linux, which, as I mentioned earlier, most likely will not happen given that Daz is a small company and their development team has its hands full just maintaining, testing, and updating their flagship Studio programme.  Also, plugin vendors would need to provide Linux versions as well, and many of them are individual creators.

    A Linux version would be nice but it just isn't going to happen.

  • Bobvan Posts: 2,653
    edited June 2017

    OK, ran the SickleYield benchmark scene: 1 min 59 seconds GPU, 1 min 53 seconds CPU plus GPU.

     

    Only thing is there is still quite a bit of noise for my taste..

    (attached benchmark render: t 1.png)
    Post edited by Bobvan on
  • JamesJAB Posts: 1,766
    edited June 2017
    Bobvan said:

    OK, ran the SickleYield benchmark scene: 1 min 59 seconds GPU, 1 min 53 seconds CPU plus GPU.

     

    Only thing is there is still quite a bit of noise for my taste..

    That benchmark is not meant to make a pretty render; it's so you can compare render performance between different hardware setups.

    And so you can quickly see, like you found, that adding your CPU into the mix only shaved off 6 seconds from the render.

    And that your GPU did the render in less than half the time that my GTX 1060 did it in.
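
    Putting the two timings above into numbers (just the arithmetic on the figures Bobvan posted):

        # GPU only: 1 min 59 s; CPU plus GPU: 1 min 53 s (from the benchmark result above).
        gpu_only = 1 * 60 + 59      # 119 s
        gpu_plus_cpu = 1 * 60 + 53  # 113 s
        saved = gpu_only - gpu_plus_cpu
        print(f"adding the CPU saved {saved} s, about {100 * saved / gpu_only:.0f}% less render time")
        # -> adding the CPU saved 6 s, about 5% less render time
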

    Post edited by JamesJAB on
  • Bobvan Posts: 2,653
    edited June 2017
    JamesJAB said:
    Bobvan said:

    OK, ran the SickleYield benchmark scene: 1 min 59 seconds GPU, 1 min 53 seconds CPU plus GPU.

     

    Only thing is there is still quite a bit of noise for my taste..

    That benchamark is not meant to make a pretty render, It's so you can compare render performance between different hardware setups.

    And so you can quickly see like you found that adding your CPU into the mix only shaved off 6 seconds from the render.

    And that your GPU did the render in less than half the time that my GTX 1060 did it in.

    Good point, it does go faster. I think I have cranked out something like 90 renders since I got it, or something around that.. and that's with the machine off at night and while I am at work..

    Post edited by Bobvan on
  • AJ2112 Posts: 1,417
    Bobvan said:

    OK, ran the SickleYield benchmark scene: 1 min 59 seconds GPU, 1 min 53 seconds CPU plus GPU.

     

    Only thing is there is still quite a bit of noise for my taste..

    Freakin awesome, Bob!!  Your system is smokin fast!  People have to do what's best for them, regardless of opinions.  Enjoy your new system, bud!
