Comments
Hi Bob, typing on my Xbox controller, lol. In my 50's too. Card just arrived. Kit's a beauty, friend.
Will reply more later.
Thanks, I hope the tinnitus gets better.. Either way, you got a nice rig out of it..
I'm not sure about Win 7, multi cards, etc. I'm not that techy. I'm just happy that it cut down on time quite a bit. It's awesome to see the knowledge some of you have..
I have the Home edition of Win 10, and while it will auto-download any updates, you can turn off the auto shutdown and restart bit.. Also, Win 7 mainstream support ended 2 years ago and extended support ends in 2020..
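For anyone who wants to try turning that off on their own box, here's a minimal Python sketch that sets the documented `NoAutoRebootWithLoggedOnUsers` Windows Update policy value. Whether the Home edition actually honours policy keys is an assumption on my part, so treat it as an experiment, and it has to run as administrator:

```python
# Sketch: ask Windows Update not to auto-reboot while someone is logged
# on, via the documented NoAutoRebootWithLoggedOnUsers policy value.
# Assumption: Home edition may simply ignore policy keys, so treat this
# as an experiment. Must be run from an elevated (administrator) Python.
import winreg

KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate\AU"

# Create (or open) the Windows Update policy key under HKLM.
key = winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                         winreg.KEY_SET_VALUE)
# 1 = do not automatically reboot when a user is logged on.
winreg.SetValueEx(key, "NoAutoRebootWithLoggedOnUsers", 0,
                  winreg.REG_DWORD, 1)
winreg.CloseKey(key)
print("policy value set; Windows Update re-reads policy on its next check")
```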
OT: That sounds like one beast of a machine. I too hope to get a new computer sometime this year, planning on going all out on it too, to a point that is.. Deciding on whether to go with a Skylake CPU or a Kaby Lake CPU.. :)
I'm enjoying it, and at the end of the day I think that's what matters.. I am fortunate to be able to afford it; that is not lost on me..
And that is the main thing in all this: that you are enjoying the new beastly computer.. :) Just looking at the specs now, I can see why the extra cooling; that CPU has a rather toasty TDP.. :) On that note, I had a look at what is coming, and there is a supposed 32-core behemoth coming from Intel called Skylake Purley. It looks to be a Xeon processor and will likely cost the GDP of a small country to buy.. :)
Other speculation is that it will come in an LGA 3647 socket and will support 6-channel DDR4. Now that would be one computer to have, though given what the price will likely be, well, not likely to happen anytime soon.. lol
LOL, it does get pricey, don't it. But like my boss said when I told him I was able to do it by selling some of my drums: swapped one toy for another.
Lol, yeah, it can be like that. Did some more digging and found some more info on what is coming, and got a surprise: there is supposedly an all-new Core i9 on the way.. These guys never let up with the new stuff; it makes it so frustrating.. You just buy a shiny new computer and they up the ante by releasing awesome new stuff.. lol
That's one thing I figured: the next processor must always be coming.. I think phones are the worst; TVs too. My 1-year-old nano crystal Samsung 4K is already outdone by OLED and QLED..
Finally challenged the beast with something that needs longer: the inside of a cave with a dark flame emission on a plane. A few hours running and it's looking pretty good, something that would have taken 12 to 15 hours prior..
...part of the reasoning behind my dual 8-core Xeon, 128GB quad-channel DDR3 render beast, which will be driven by W7 Pro. Not many scenes, even of the epic level I create, are going to go into swap mode. Yeah, it will have a 1080 Ti, which, if its memory is exceeded, won't suffer as much given the backup horsepower this system has.
The other reason for it is I am looking at rendering in large scale format for gallery quality prints.
...AMD's forthcoming 32-core (64-thread) Epyc will support 128 PCIe 3.0 lanes and 8 memory channels per socket, so a dual-socket system will support 16 memory channels. Granted, it is designed with data centres in mind, so only Linux will run on it. However, if a future version of Daz supports Iray networked rendering, that would be the ultimate beast.
Time to get that Megabucks lotto ticket.
I'd like to see how it would handle Carrara, as Carrara will make use of as many CPU cores/threads as you can throw at it. That could be one fun little "light show", with 128 coloured tiles flashing across the screen for a few minutes.
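Just to illustrate the idea, here's a toy Python sketch of that tile-per-core pattern: not Carrara's actual code, just the general shape of handing one bucket to each worker:

```python
# Toy sketch of bucket/tile rendering spread across all CPU cores, in the
# spirit of Carrara's per-core tiles. Illustrative only, not Carrara code.
from multiprocessing import Pool, cpu_count

WIDTH, HEIGHT, TILE = 512, 512, 64  # hypothetical image and tile sizes

def render_tile(origin):
    """'Render' one tile; here we just shade each pixel by position."""
    x0, y0 = origin
    return [((x0 + x) ^ (y0 + y)) & 0xFF
            for y in range(TILE) for x in range(TILE)]

if __name__ == "__main__":
    tiles = [(x, y) for y in range(0, HEIGHT, TILE)
                    for x in range(0, WIDTH, TILE)]
    # Each worker grabs a tile at a time; on a 128-thread box, 128
    # buckets light up at once, hence the "light show" effect.
    with Pool(cpu_count()) as pool:
        shaded = pool.map(render_tile, tiles)
    print(f"rendered {len(shaded)} tiles using {cpu_count()} worker processes")
```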
Sooner or later you will have to upgrade your OS, since official extended support for Win 7 ends in 2020, which is 2.5 years away, and that is not that long.. This is probably why there is so much talk of how having 2 cards in one system is useful: one for display, the other for rendering.. And soon enough it will be possible to have 2 full x16 slots for video cards, as at the moment having 2 cards drops those slots to x8..
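If anyone is curious what their slots are actually running at, the nvidia-smi tool that ships with the NVIDIA driver can report current versus maximum PCIe link width. A small Python wrapper, for example:

```python
# Report the PCIe link width each GPU is currently running at versus the
# maximum the slot supports, via nvidia-smi's query interface.
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=index,name,pcie.link.width.current,pcie.link.width.max",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True).stdout

for line in out.strip().splitlines():
    idx, name, cur, mx = [f.strip() for f in line.split(",")]
    print(f"GPU {idx} ({name}): running x{cur}, slot maximum x{mx}")
```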
OK, let me see if I am getting this. Win 10 is taking 3GB out of my card's 11GB. If my scene exceeds 8GB it will use my CPU. Is my CPU not crazy fast as well? Sorry, as previously mentioned, I am not as technical as some of you :)
I believe the lost 3GB is a potential loss, and will vary depending on what other display options you have set, etc. As such the loss is likely to vary from box to box, and you can only be sure how much VRAM is available by testing it yourself on your own machine. I would be interested in hearing the results that you get, as I myself want to buy a 1080 Ti, and I would prefer to hear what people are seeing who actually own this graphics card rather than theories from those who do not.
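One easy way to test it yourself: ask nvidia-smi (installed with the NVIDIA driver) how much memory is used and free while your normal desktop and Daz session are up. A small Python wrapper along these lines:

```python
# Check real VRAM headroom with nvidia-smi (installed with the NVIDIA
# driver). Run it with your usual desktop and Daz Studio session open,
# since the Windows reservation varies from box to box.
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,memory.total,memory.used,memory.free",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True).stdout

for line in out.strip().splitlines():
    name, total, used, free = [f.strip() for f in line.split(",")]
    print(f"{name}: {used} used of {total}, {free} left for rendering")
```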
All I know is that what used to take longer, like the 15 to 20 hour range, now takes 3 to 4. Otherwise it's 30 minutes or less. Huge improvement all the same..
Which indicates that the better the hardware one has, the quicker things become, as you yourself have seen. And given that you have pretty much the top of the pile in hardware at the moment, there are many, many factors we are looking at.. The OS is one part of it; the others are how much RAM you have, the speed of your hard drives/SSDs, the quality of your video card, and so on, and of course the most important part, the quality of the motherboard, without which none of the rest of the components would work..
Doing a Google search, the only hits I seem to find deal with system RAM on all OSes from Win 7 to 10..
Yeah, I would like to see that as well, as at the moment I am not finding too many Google hits on video card memory being reserved.. Did find a way to limit the amount of reserved system RAM that any Win OS uses..
Comparatively speaking, CPU rendering is glacial compared to GPU rendering.. so it will do in a worst-case scenario... but it will be reaaaaally slow (like hours and hours to render).
That is because Iray was designed mainly for video cards and not the rest of the system; if Nvidia did a better implementation of hybrid/combined CPU/GPU rendering (not likely to happen), then it might be a different story.. When you think about it, placing the bulk of the workload on one component that is not used to it will shorten its lifespan quickly, like how they say to never completely fill an SSD as it severely shortens its lifespan..
My SSD sits mostly empty. I keep this machine pretty bare bones and put all the stuff on my ROG laptop.. My previous system ran for 4 years; just had to replace 1 minor fan.. A few thousand renders.. Another example: in the attached render, the lines in the woman's face took like 13 hours to clear up back then, and I cheated a bit with PS tools to smooth them out. This is no longer the case..
This is why I reckon the better the overall hardware one has, the better and faster the renders happen, as you are finding out now and as I hope to be able to do soon..
I'm not sure how much Nvidia could do - a CPU has far fewer cores than a GPU, although each core can do more, so for tasks that fit the way a GPU works, the GPU will always be faster given well-developed code.
I'm getting some ideas. In any case, 12 to 20 hours seems to be a thing of the past. Even more so compared to when I used Reality / Lux. Perhaps I will blow the dust off Reality to see what it can do with this rig; we'll see.. I'm fine with Iray. Requirements I am sure will change as Studio and content advance, and will most likely demand more resources. Nothing is forever. So may as well just enjoy...
I suppose it's more a case of having a better implementation of Iray that is able to use the system's main memory more effectively, rather than having to rely solely on the graphics card's memory for GPU rendering.. So while the GPU does the work, with some help from the CPU the GPU could take full advantage of all available memory in the system, allowing people to have more complex scenes..
Will be interesting to see what Reality / Lux run luck under a system like that.. :)
I'm still on the fence due to all the glitches and workarounds. I don't miss them..
Just reading my post that you quoted, and I am trying to work out why I wrote "luck" in my reply, as it does not make one bit of sense.. lol But about Reality/Lux: unfortunately it is buggy, which is a pain, as they are pretty good at what they do.. :(
I do, I use a 970 for display.
I have W10 locked down tight, loads of stuff disabled; I check for updates (for example) when my AV suite tells me there are some, if I haven't already.
I also have the Nvidia Control Panel's Compute option turned on, although this doesn't seem to make a difference.
I'm using the latest version of W10, but Windows performed the same before the upgrade. If all goes well, in a few days I'll update my disk image to the new version.
Nvidia Driver: 378.78
Windows 10 Pro Version 1703, build: 15063.296
i7 4770k (Haswell), 16GB RAM, Gigabyte Z87X UD5H
Edit:
added image
I'm looking back at some of my Lux stuff using R4 and really liked the results, so I may give it a try.. Will keep you posted if / when I do..
Found the USA site:
http://www.ncixus.com/go/?ncix-pc#start&CFID=25329&CFTOKEN=87B9FBE4-8D02-4608-988D5B8CB88D18DD
Cool, mine is local, 5 mins from my place..
It's nice that shops like NCIX US still exist.
My Carrara rendering needs are a lil different:
I don't need to spend money on an uber graphics card, internet WiFi and Bluetooth;
CPU speed and RAM are what I'd rather spend my savings on, and the cooling.
I'm scared to build it myself because of the goop on the CPU for the heatsink.
Doing it wrong = a $2k chip up in smoke.