Just got my new i7-6950X (10-core CPU) and GTX 1080 Ti

135 Comments

  • AJ2112 Posts: 1,417
    edited May 2017
    Bobvan said:
    Awesomefb said:

    Bob, what type of drums were you jammin' on? I'm a drummer too, have a 9-piece Tama Granstar kit and a Roland TD-25 E kit. I'll be off the computer installing the new GTX 1080, will be back this evening.

    I'm in my 50s now. I used to do it more pro years ago and sold a DW Collector's kit. I also have a Roland TD-50; depending on how the tinnitus resolves itself or not, it may go too, will see.. Barely slept in the past month, I'm surprised I can still function.. I was not really using the A kit that much; will be using the machine a lot more..

    Hi Bob, typing on my Xbox controller, lol. In my 50s too. Card just arrived. Kits are a beauty, friend, yes. Will reply more later.

    Post edited by AJ2112 on
  • Bobvan Posts: 2,653
    edited May 2017
    Awesomefb said:
    Bobvan said:
    Awesomefb said:

    Bob, what type of drums were you jammin' on? I'm a drummer too, have a 9-piece Tama Granstar kit and a Roland TD-25 E kit. I'll be off the computer installing the new GTX 1080, will be back this evening.

    I'm in my 50s now. I used to do it more pro years ago and sold a DW Collector's kit. I also have a Roland TD-50; depending on how the tinnitus resolves itself or not, it may go too, will see.. Barely slept in the past month, I'm surprised I can still function.. I was not really using the A kit that much; will be using the machine a lot more..

    Hi Bob, typing on my Xbox controller, lol. In my 50s too. Card just arrived. Kits are a beauty, friend, yes. Will reply more later.

    Thanks, I hope the tinnitus gets better.. Either way, got a nice rig out of it..

     

    Post edited by Bobvan on
  • Bobvan Posts: 2,653
    edited May 2017

    I'm not sure about Win 7, multiple cards, etc. I'm not that techy. I'm just happy that it cut down on render time quite a bit. It's awesome to see the knowledge some of you have..

    Post edited by Bobvan on
  • Ghosty12 Posts: 2,080
    edited May 2017
    kyoto kid said:

    ...well, for one, as you have only two memory sticks you cannot take advantage of quad-channel mode, which requires 4 sticks minimum.

    No SSD? That would have sped up boot and programme loading.  My next design uses two SSDs, one for the boot drive and the other for my library drive.  The HDDs I included are only used for backup/storage.

    I would suggest digging up the extra $139 for a W7 Pro OEM as, like I mentioned, W10 reserves memory, leaving you with just under 9 GB available for rendering.  Also, W10 Home pretty much puts your system in MS's control with regard to updating.  You could be working on a scene or rendering and *pow*, everything is lost because the system auto-reboots, and if it is a major update (not just a monthly patch) it could be unusable for an hour or more.

    I have the Home edition of Win 10 and, while it will auto-download any updates, you can turn off the auto shutdown and restart bit.. Also, Win 7 mainstream support ended 2 years ago and extended support ends in 2020..

    OT: That sounds like one beast of a machine. I too hope to get a new computer sometime this year, planning on going all out on it too, to a point that is.. Deciding whether to go with a Skylake CPU or a Kaby Lake CPU.. :)

    Post edited by Ghosty12 on
  • Bobvan Posts: 2,653

    I'm enjoying it, and at the end of the day I think that's what matters.. I am fortunate to be able to afford it; that is not lost on me..

  • Ghosty12 Posts: 2,080
    edited May 2017

    And that is the main thing in all this: that you are enjoying the new beastly computer.. :) Just looking at the specs now, I can see why the extra cooling; that CPU has a rather toasty TDP.. :) On that note, I had a look at what is coming, and there is a supposed 32-core behemoth on the way from Intel called Skylake Purley; it looks to be a Xeon processor and will likely cost the GDP of a small country to buy.. :)

    Other speculation is that it will come in an LGA 3647 socket and will support 6-channel DDR4. Now that would be one computer to have, though given what the price will likely be, well, it's not likely to happen anytime soon.. lol

    Post edited by Ghosty12 on
  • Bobvan Posts: 2,653
    edited May 2017

    LOL, it does get pricey, don't it. But like my boss said when I told him I was able to do it due to selling some of my drums: switched one toy for another.

    Post edited by Bobvan on
  • Ghosty12 Posts: 2,080
    edited May 2017

    Lol, yeah, it can be like that. Did some more digging and found some more info on what is coming, and got a surprise: there is supposedly an all-new Core i9 on the way..  These guys never let up with the new stuff, which makes it so frustrating..  You just buy a shiny new computer and they up the ante by releasing awesome new stuff.. lol

    Post edited by Ghosty12 on
  • Bobvan Posts: 2,653
    edited May 2017

    That's one thing I figured: the next processor must be coming.. I think phones are the worst, TVs etc. too; my 1-year-old nano crystal Samsung 4K is already outdone by OLED & QLED..

     

    Finally challenged the beast to need longer: the inside of a cave with a dark flame emission on a plane. A few hours running and it's looking pretty good, something that would have taken 12 to 15 hours prior..

    Post edited by Bobvan on
  • kyoto kid Posts: 41,847
     

    The decrease in video RAM is a minor impact that will affect less than 5% of the renders you'd ever consider doing, and even for those there is an easy workaround: simply do compositing.

    You don't see the real problem. Buying a video card with 11 GB and ending up with 8.5 means you lost a few thousand dollars just because of the OS. Given the current state of hardware, 8 or 11 GB for rendering is really small. That's why CPU rendering is still used in motion pictures; I don't think you can tell Pixar to go compositing with their assets. Memory is the biggest limiting factor, and ending up with as much memory as previous-generation hardware on your shiny new toy is at best frustrating.

    I can only advise a dual (or triple) boot (I have a Linux install). A new disk is not expensive.

    ...part of the reasoning behind my dual 8-core Xeon, 128 GB quad-channel DDR3 render beast, which will be driven by W7 Pro. Not many scenes, even of the epic level I create, are going to go into swap mode.  Yeah, it will have a 1080 Ti, which, if its memory is exceeded, won't suffer as much given the backup horsepower this system has.

    The other reason for it is that I am looking at rendering in large-scale format for gallery-quality prints.

  • kyoto kid Posts: 41,847
    ghosty12 said:

    And that is the main thing in all this: that you are enjoying the new beastly computer.. :) Just looking at the specs now, I can see why the extra cooling; that CPU has a rather toasty TDP.. :) On that note, I had a look at what is coming, and there is a supposed 32-core behemoth on the way from Intel called Skylake Purley; it looks to be a Xeon processor and will likely cost the GDP of a small country to buy.. :)

    Other speculation is that it will come in an LGA 3647 socket and will support 6-channel DDR4. Now that would be one computer to have, though given what the price will likely be, well, it's not likely to happen anytime soon.. lol

    ...AMD's forthcoming 32-core (64-thread) Epyc will support 128 PCIe 3.0 lanes and 8 memory channels per socket, so a dual-socket system will support 16 memory channels. Granted, it is designed with data centres in mind, so only Linux will run on it. However, if a future version of DAZ Studio supports Iray networked rendering, that would be the ultimate beast.

    Time to get that Megabucks lotto ticket.

    I'd like to see how it would handle Carrara, as Carrara will make use of as many CPU cores/threads as you can throw at it.  That could be one fun little "light show" with 128 coloured tiles flashing across the screen for a few minutes.

  • Ghosty12 Posts: 2,080
    kyoto kid said:
     

    The decrease in video RAM is a minor impact that will affect less than 5% of the renders you'd ever consider doing, and even for those there is an easy workaround: simply do compositing.

    You don't see the real problem. Buying a video card with 11 GB and ending up with 8.5 means you lost a few thousand dollars just because of the OS. Given the current state of hardware, 8 or 11 GB for rendering is really small. That's why CPU rendering is still used in motion pictures; I don't think you can tell Pixar to go compositing with their assets. Memory is the biggest limiting factor, and ending up with as much memory as previous-generation hardware on your shiny new toy is at best frustrating.

    I can only advise a dual (or triple) boot (I have a Linux install). A new disk is not expensive.

    ...part of the reasoning behind my dual 8-core Xeon, 128 GB quad-channel DDR3 render beast, which will be driven by W7 Pro. Not many scenes, even of the epic level I create, are going to go into swap mode.  Yeah, it will have a 1080 Ti, which, if its memory is exceeded, won't suffer as much given the backup horsepower this system has.

    The other reason for it is that I am looking at rendering in large-scale format for gallery-quality prints.

    Sooner or later you will have to upgrade your OS, since official extended support for Win 7 ends in 2020, which is 2.5 years away and that is not that long.. This is probably why there is so much talk of having 2 cards in one system being useful, one for display and the other for rendering.. And soon enough it will be possible to have 2 full x16 slots for video cards, whereas at the moment having 2 cards drops those slots to x8..

  • Bobvan Posts: 2,653
    edited May 2017

    Ok, let me see if I am getting this. Win 10 is taking 3 out of the 11 GB on my card. If my scene exceeds 8 GB it will use my CPU. Is my CPU not crazy fast as well? Sorry, as previously mentioned I am not as technical as some of you :)

    Post edited by Bobvan on
  • Havos Posts: 5,574
    Bobvan said:

    Ok, let me see if I am getting this. Win 10 is taking 3 out of the 11 GB on my card. If my scene exceeds 8 GB it will use my CPU. Is my CPU not crazy fast as well? Sorry, as previously mentioned I am not as technical as some of you :)

    I believe the lost 3 GB is a potential loss, and will vary depending on what other display options you have set etc. As such the loss is likely to vary from box to box, and you can only be sure how much VRAM is available by testing it yourself on your own machine. I would be interested in hearing the results that you get, as I myself want to buy a 1080 Ti, and I would prefer to hear what people who actually own this graphics card are seeing rather than theories from those who do not. If you want to check it yourself, see the rough sketch below.
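
    For anyone wanting to test this on their own box rather than go by theory, a quick way is to ask the NVIDIA driver directly. The snippet below is only a rough sketch, assuming the Python pynvml bindings for NVIDIA's management library are installed (the package is distributed under names like nvidia-ml-py or pynvml, so treat the install step as an assumption); it just prints total/used/free VRAM for the first GPU as the driver reports it.

    # Rough sketch: print how much VRAM the driver reports as free on GPU 0.
    # Assumes the NVIDIA driver plus the pynvml Python bindings are installed.
    import pynvml

    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU, e.g. the 1080 Ti
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):                     # older bindings return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # values are in bytes
        gib = 1024 ** 3
        print(f"{name}: total {mem.total / gib:.2f} GiB, "
              f"used {mem.used / gib:.2f} GiB, free {mem.free / gib:.2f} GiB")
    finally:
        pynvml.nvmlShutdown()

    Running it once with everything closed and again with Studio open (or mid-render) should show how much of the 11 GB Windows and other apps are actually holding back on that particular machine.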

  • Bobvan Posts: 2,653
    edited May 2017

    All I know is that what used to take 15 to 20 hours now takes 3 to 4. Otherwise it's 30 minutes or less. Huge improvement all the same..

    Post edited by Bobvan on
  • Ghosty12 Posts: 2,080
    edited May 2017
    Bobvan said:

    All I know is that what used to take 15 to 20 hours now takes 3 to 4. Otherwise it's 30 minutes or less. Huge improvement all the same..

    Which indicates that the better the hardware one has, the quicker things become, as you yourself have seen. And given that you have pretty much top-of-the-pile hardware at the moment, there are many, many factors in play.. The OS is one part of it; the others are how much RAM you have, the speed of your hard drives/SSDs, the quality of your video card and so on, and of course the most important part, the quality of the motherboard, without which none of the rest of the components would work..

    Doing a Google search, the only hits I seem to find deal with system RAM on all OSes from Win 7 to 10..

     

    Havos said:
    Bobvan said:

    Ok, let me see if I am getting this. Win 10 is taking 3 out of the 11 GB on my card. If my scene exceeds 8 GB it will use my CPU. Is my CPU not crazy fast as well? Sorry, as previously mentioned I am not as technical as some of you :)

    I believe the lost 3 GB is a potential loss, and will vary depending on what other display options you have set etc. As such the loss is likely to vary from box to box, and you can only be sure how much VRAM is available by testing it yourself on your own machine. I would be interested in hearing the results that you get, as I myself want to buy a 1080 Ti, and I would prefer to hear what people who actually own this graphics card are seeing rather than theories from those who do not.


    Yeah, I would like to see that as well, as at the moment I am not finding too many Google hits on video card memory being reserved..  Did find a way to limit the amount of reserved system RAM that any Windows OS uses..

    Post edited by Ghosty12 on
  • Daz Jack Tomalin Posts: 13,811
    edited May 2017
    Bobvan said:

     Is my CPU not crazy fast as well?

    Comparatively speaking, CPU is glacial compared to GPU rendering.. so it will do in a worst-case scenario... but will be reaaaaaaaaaaaaaaaaaally slow (like hours and hours to render)

    Post edited by Daz Jack Tomalin on
  • Ghosty12 Posts: 2,080
    edited May 2017
    Bobvan said:

     Is my CPU not crazy fast as well?

    Comparatively speaking, CPU is glacial compared to GPU rendering.. so it will do in a worst-case scenario... but will be reaaaaaaaaaaaaaaaaaally slow (like hours and hours to render)

    That is because Iray was designed mainly for video cards and not the rest of the system; if Nvidia did a better implementation of hybrid/combined CPU/GPU rendering (not likely to happen) then it might be a different story.. Since, when you think about it, placing the bulk of the workload on one component that is not used to it will shorten its lifespan quickly, like how they say to never completely fill an SSD as it severely shortens its lifespan..

    Post edited by Ghosty12 on
  • Bobvan Posts: 2,653
    edited May 2017

    My SSD sits mostly empty. I keep this machine pretty bare bones and put all the stuff on my ROG laptop.. My previous system ran for 4 years, just had to replace 1 minor fan.. A few thousand renders.. Another example: in the attached render, the lines in the woman's face took like 13 hours to clear up back then, and I cheated a bit with PS tools to smooth them out. This is no longer the case..

    Untitled.jpg
    266 x 106 - 26K
    510_by_bobvan-db1kk2b.jpg
    1860 x 950 - 400K
    Post edited by Bobvan on
  • Ghosty12 Posts: 2,080
    Bobvan said:

    My SSD sits mostly empty. I keep this machine pretty bare bones and put all the stuff on my ROG laptop.. My previous system ran for 4 years, just had to replace 1 minor fan.. A few thousand renders.. Another example: in the attached render, the lines in the woman's face took like 13 hours to clear up back then, and I cheated a bit with PS tools to smooth them out. This is no longer the case..

    This is why I reckon the better the overall hardware one has, the better and faster renders happen, as you are finding out now and as I hope to be able to do soon..

  • Richard Haseltine Posts: 107,932
    ghosty12 said:
    Bobvan said:

     Is my CPU not crazy fast as well?

    Comparatively speaking, CPU is glacial compared to GPU rendering.. so it will do in a worst-case scenario... but will be reaaaaaaaaaaaaaaaaaally slow (like hours and hours to render)

    That is because Iray was designed mainly for video cards and not the rest of the system; if Nvidia did a better implementation of hybrid/combined CPU/GPU rendering (not likely to happen) then it might be a different story.. Since, when you think about it, placing the bulk of the workload on one component that is not used to it will shorten its lifespan quickly, like how they say to never completely fill an SSD as it severely shortens its lifespan..

    I'm not sure how much nVidia could do - a CPU has far fewer cores than a GPU, although they can do more, so for tasks that fit with the way a GPU works the GPU will always be faster given well-developed code.

  • Bobvan Posts: 2,653
    edited May 2017
    ghosty12 said:
    Bobvan said:

    My SSD sits mostly empty. I keep this machine pretty bare bones and put all the stuff on my ROG laptop.. My previous system ran for 4 years, just had to replace 1 minor fan.. A few thousand renders.. Another example: in the attached render, the lines in the woman's face took like 13 hours to clear up back then, and I cheated a bit with PS tools to smooth them out. This is no longer the case..

    This is why I reckon the better the overall hardware one has, the better and faster renders happen, as you are finding out now and as I hope to be able to do soon..

    I'm getting some ideas. In any case, 12-to-20-hour renders seem to be a thing of the past. Even more so compared to when I used Reality / Lux. Perhaps I will blow the dust off Reality to see what it can do with this rig, will see.. I'm fine with Iray. Requirements I am sure will change when Studio and content advance and will most likely demand more resources. Nothing is forever. So may as well just enjoy...

    Post edited by Bobvan on
  • Ghosty12 Posts: 2,080
    ghosty12 said:
    Bobvan said:

     Is my CPU not crazy fast as well?

    Comparatively speaking, CPU is glacial compared to GPU rendering.. so it will do in a worst-case scenario... but will be reaaaaaaaaaaaaaaaaaally slow (like hours and hours to render)

    That is because Iray was designed mainly for video cards and not the rest of the system; if Nvidia did a better implementation of hybrid/combined CPU/GPU rendering (not likely to happen) then it might be a different story.. Since, when you think about it, placing the bulk of the workload on one component that is not used to it will shorten its lifespan quickly, like how they say to never completely fill an SSD as it severely shortens its lifespan..

    I'm not sure how much nVidia could do - a CPU has far fewer cores than a GPU, although they can do more, so for tasks that fit with the way a GPU works the GPU will always be faster given well-developed code.

    I suppose it's more a case of having a better implementation of Iray that can use the system's main memory more effectively, rather than having to rely solely on the graphics card's memory for GPU rendering.. So while the GPU does the work, with some help from the CPU it could take full advantage of all available memory in the system to allow people to have more complex scenes..

    Bobvan said:
    ghosty12 said:
    Bobvan said:

    My SSD sits mostly empty. I keep this machine pretty bare bones and put all the stuff on my ROG laptop.. My previous system ran for 4 years, just had to replace 1 minor fan.. A few thousand renders.. Another example: in the attached render, the lines in the woman's face took like 13 hours to clear up back then, and I cheated a bit with PS tools to smooth them out. This is no longer the case..

    This is why I reckon the better the overall hardware one has, the better and faster renders happen, as you are finding out now and as I hope to be able to do soon..

    I'm getting some ideas. In any case, 12-to-20-hour renders seem to be a thing of the past. Even more so compared to when I used Reality / Lux. Perhaps I will blow the dust off Reality to see what it can do with this rig, will see.. I'm fine with Iray. Requirements I am sure will change when Studio and content advance and will most likely demand more resources. Nothing is forever. So may as well just enjoy...

    Will be interesting to see what Reality / Lux run luck under a system like that.. :)

  • Bobvan Posts: 2,653
    ghosty12 said:
    ghosty12 said:
    Bobvan said:

     Is my CPU not crazy fast as well?

    Comparatively speaking, CPU is glacial compared to GPU rendering.. so it will do in a worst-case scenario... but will be reaaaaaaaaaaaaaaaaaally slow (like hours and hours to render)

    That is because Iray was designed mainly for video cards and not the rest of the system; if Nvidia did a better implementation of hybrid/combined CPU/GPU rendering (not likely to happen) then it might be a different story.. Since, when you think about it, placing the bulk of the workload on one component that is not used to it will shorten its lifespan quickly, like how they say to never completely fill an SSD as it severely shortens its lifespan..

    I'm not sure how much nVidia could do - a CPU has far fewer cores than a GPU, although they can do more, so for tasks that fit with the way a GPU works the GPU will always be faster given well-developed code.

    I suppose it's more a case of having a better implementation of Iray that can use the system's main memory more effectively, rather than having to rely solely on the graphics card's memory for GPU rendering.. So while the GPU does the work, with some help from the CPU it could take full advantage of all available memory in the system to allow people to have more complex scenes..

    Bobvan said:
    ghosty12 said:
    Bobvan said:

    My SSD sits mostly empty. I keep this machine pretty bare bones and put all the stuff on my ROG laptop.. My previous system ran for 4 years, just had to replace 1 minor fan.. A few thousand renders.. Another example: in the attached render, the lines in the woman's face took like 13 hours to clear up back then, and I cheated a bit with PS tools to smooth them out. This is no longer the case..

    This is why I reckon the better the overall hardware one has, the better and faster renders happen, as you are finding out now and as I hope to be able to do soon..

    I'm getting some ideas. In any case, 12-to-20-hour renders seem to be a thing of the past. Even more so compared to when I used Reality / Lux. Perhaps I will blow the dust off Reality to see what it can do with this rig, will see.. I'm fine with Iray. Requirements I am sure will change when Studio and content advance and will most likely demand more resources. Nothing is forever. So may as well just enjoy...

    Will be interesting to see what Reality / Lux run luck under a system like that.. :)

    I'm still on the fence due to all the glitches and workarounds. I don't miss them..

     

  • Ghosty12 Posts: 2,080
    edited May 2017
    Bobvan said:
    ghosty12 said:
    ghosty12 said:
    Bobvan said:

     Is my CPU not crazy fast as well?

    Comparatively speaking, CPU is glacial compared to GPU rendering.. so it will do in a worst-case scenario... but will be reaaaaaaaaaaaaaaaaaally slow (like hours and hours to render)

    That is because Iray was designed mainly for video cards and not the rest of the system; if Nvidia did a better implementation of hybrid/combined CPU/GPU rendering (not likely to happen) then it might be a different story.. Since, when you think about it, placing the bulk of the workload on one component that is not used to it will shorten its lifespan quickly, like how they say to never completely fill an SSD as it severely shortens its lifespan..

    I'm not sure how much nVidia could do - a CPU has far fewer cores than a GPU, although they can do more, so for tasks that fit with the way a GPU works the GPU will always be faster given well-developed code.

    I suppose it's more a case of having a better implementation of Iray that can use the system's main memory more effectively, rather than having to rely solely on the graphics card's memory for GPU rendering.. So while the GPU does the work, with some help from the CPU it could take full advantage of all available memory in the system to allow people to have more complex scenes..

    Bobvan said:
    ghosty12 said:
    Bobvan said:

    My SSD sits mostly empty. I keep this machine pretty bare bones and put all the stuff on my ROG laptop.. My previous system ran for 4 years, just had to replace 1 minor fan.. A few thousand renders.. Another example: in the attached render, the lines in the woman's face took like 13 hours to clear up back then, and I cheated a bit with PS tools to smooth them out. This is no longer the case..

    This is why I reckon the better the overall hardware one has, the better and faster renders happen, as you are finding out now and as I hope to be able to do soon..

    I'm getting some ideas. In any case, 12-to-20-hour renders seem to be a thing of the past. Even more so compared to when I used Reality / Lux. Perhaps I will blow the dust off Reality to see what it can do with this rig, will see.. I'm fine with Iray. Requirements I am sure will change when Studio and content advance and will most likely demand more resources. Nothing is forever. So may as well just enjoy...

    Will be interesting to see what Reality / Lux run luck under a system like that.. :)

    I'm still on the fence due to all the glitches and workarounds. I don't miss them..

     

    Just reading my post that you quoted, and I am trying to work out why I wrote "luck" in my reply, as it does not make one bit of sense.. lol  But about Reality/Lux, unfortunately it is buggy, which is a pain as they are pretty good at what they do.. :(

    Post edited by Ghosty12 on
  • nicstt Posts: 11,715
    edited May 2017
     

     

    nicstt said:
    kyoto kid said:
    Bobvan said:

    It has Win 10.. was not planning on changing, I always liked it on my 2 machines and it runs better than my old 7 laptop..

    ...W10 Home is a PITA when it comes to updating and GPU memory reserving.  Again, under W10 that 1080 Ti will only allow you to use about 8.6 GB for rendering because of the VRAM the OS reserves.  Get an OEM copy of W7 Pro, as W7 has an almost negligible footprint on a GPU card's VRAM.  If you have scenes that exceed the available VRAM, Iray will dump to slower CPU mode and all those CUDA cores will be worthless.

    I've posted in other threads that I don't usually have RAM reserved on the 980 Ti, which isn't used for video, only rendering.

    Well, 3MB is reserved, which isn't really an issue.

    I use W10 pro.

    Opening up Studio (4.9.3.166 beta) ups it from 3MB to 50ish MB

    That's interesting. I guess you have another GFX card used for the display. Can you detail your hardware and software setup as well as the driver versions used?

     

    I do, I use a 970 for display.

    I have W10 locked down tight, loads of stuff disabled; I check for updates (for example) when my AV suite tells me there are some, if I haven't already done so anyway.

    I also have the Nvidia Control Panel Compute option turned on, although this doesn't seem to make a difference.

    I'm using the latest version of W10, but I had Windows performing the same before the upgrade. If all goes well, in a few days I'll update my disk image to the new version.

    Nvidia Driver: 378.78

    Windows 10 Pro Version 1703, build: 15063.296

    i7 4770K (Haswell), 16 GB RAM, Gigabyte Z87X-UD5H

    Edit:

    added image

    gpuz.jpg
    1216 x 790 - 161K
    Post edited by nicstt on
  • Bobvan Posts: 2,653
    edited May 2017

    I'm looking back at some of my Lux stuff done with R4 and really liked the results, so I may give it a try.. Will keep you posted if / when I do..

    Post edited by Bobvan on
  • Mistara Posts: 38,675
    Bobvan said:

    Custom made at NCIX Computers: Asus board, extra cooling, high-end parts

    Found the USA site:

    http://www.ncixus.com/go/?ncix-pc#start&CFID=25329&CFTOKEN=87B9FBE4-8D02-4608-988D5B8CB88D18DD

     

  • Bobvan Posts: 2,653

    Cool, mine is local, 5 mins from my place..

  • Mistara Posts: 38,675

    It's nice that shops like ncixus still exist.

    My Carrara rendering needs are a lil different:
    I don't need to spend money on an uber graphics card, internet wifi and bluetooth;
    CPU speed and RAM are what I'd rather spend my savings on, and the cooling.

    I'm scared to build it myself 'cause of the goop on the CPU for the heatsink.
    Doing it wrong = a $2k chip up in smoke.
