28 Core Intel HEDT Processor coming late this year... or not...

tj_1ca9500b Posts: 2,047
edited June 2018 in The Commons

Intel has a 28-core HEDT processor that'll be working its way into the product pipeline late this year; they showcased it at Computex.

https://wccftech.com/intel-28-core-x-hedt-cpu-8th-gen-8-core-updates-2018/

It won't be here until around the end of the year, and will probably be pricey, but I'm sure some people will want it...

Edit: added 'or not' to title based on subsequent info.


Comments

  • One thing: this won't make DS render any faster, as it's GPU-based and not CPU-based. I've got an i9 with 20 threads, and my old system renders faster because of its 2x GTX 980 Ti compared to a single GTX 1080 Ti. So just warning people to put the money into the graphics cards and not the CPU!

  • Ghosty12 Posts: 1,955

    One thing: this won't make DS render any faster, as it's GPU-based and not CPU-based. I've got an i9 with 20 threads, and my old system renders faster because of its 2x GTX 980 Ti compared to a single GTX 1080 Ti. So just warning people to put the money into the graphics cards and not the CPU!

    That all depends on whether you're working with 3DL, which some people, myself included, still do. Also, one other thing: Iray does not like SLI, and rumor has it that SLI might be on the way out. And telling folks not to put thought into the CPU is not a good idea, as you still need a half-decent CPU to do everything else.

  • nicstt Posts: 11,714
    edited June 2018

    And if it won't fit in the card's RAM, then your expensive Nvidia offering is useless. Then it's either a slower render on the CPU or spending TIME optimising; I have a Threadripper, so I mostly let the CPU do the render, as time spent optimising is often time lost.

  • AlienRenders Posts: 789

    Seems Intel is doing the same thing they did when Threadripper came out. AMD had 16 cores, and Intel came out with an 18-core CPU. Now that Threadripper 2 will have 24 cores, Intel must top it with 28 cores. The thing is, there's a rumour that AMD will have a Threadripper variant with 32 cores, supposed to come out in August.

  • OZ-84 Posts: 128

    I own a 16-core Threadripper...

    As much as I like the idea of a reasonably priced 28-core CPU, I don't feel it makes any sense for Iray renders. I would bet that even if this were a 50+ core part, it couldn't compare to a 1080 Ti.

    Iray is so built around GPU rendering that it really makes no sense to buy something more powerful than a 4-core.

  • Taoz Posts: 9,714
    edited June 2018

    Assuming all other things are equal, how much faster will a 16-core CPU be compared to an 8-core, if the software can utilize 16 cores?

  • Daz Jack Tomalin Posts: 13,085
    Taoz said:

    Assuming all other things are equal, how much faster will a 16-core CPU be compared to an 8-core, if the software can utilize 16 cores?

    Well, if by equal you mean the core speed... then twice as fast.

    However, in reality more cores usually mean lower clock speeds per core... but it would certainly be faster, I think (rough scaling sketch at the end of this post).

    That new 28-core CPU, what caught my eye is the insane 5 GHz across all the cores... the Cinebench score is better than my 2x 22-core Xeon rig... very nice!
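
    A minimal sketch of that 8-core vs. 16-core scaling question, assuming Amdahl's law and a guessed parallel fraction for the renderer (the 95% figure is an assumption for illustration, not a measurement):

```python
# Minimal sketch: how much faster 16 cores might be vs. 8, all else equal.
# Assumes Amdahl's law; the parallel fraction p is a guess, not a benchmark.
def speedup(p, cores):
    """Amdahl's law speedup over a single core, for parallel fraction p."""
    return 1.0 / ((1.0 - p) + p / cores)

p = 0.95  # assumed: 95% of the render parallelizes cleanly
gain = speedup(p, 16) / speedup(p, 8)
print(f"16 cores vs 8 cores: ~{gain:.2f}x")  # ~1.54x, not 2x, before any clock-speed drop
```

    The closer the parallel fraction is to 100%, the closer the gain gets to 2x; lower per-core clocks on the higher-core-count part shrink it further.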

  • Taoz Posts: 9,714
    Taoz said:

    Assuming all other things are equal, how much faster will a 16-core CPU be compared to an 8-core, if the software can utilize 16 cores?

    Well, if by equal you mean the core speed... then twice as fast.

    However, in reality more cores usually mean lower clock speeds per core... but it would certainly be faster, I think.

    Yes, I'd definitely expect it to be faster, but if it's only like 25% faster I'm not sure it's worth it.

    That new 28-core CPU, what caught my eye is the insane 5 GHz across all the cores... the Cinebench score is better than my 2x 22-core Xeon rig... very nice!

    Yes, that sounds like it's worth it - depending on the price of course.

  • Daz Jack Tomalin Posts: 13,085
    edited June 2018
    Taoz said:
    Yes, that sounds like it's worth it - depending on the price of course.

    Oh, it'll be insane... but then, you always pay a premium for that cutting-edge stuff. But let's see what the next-gen Xeons do... because if one is good, two's better :)

  • Taoz Posts: 9,714
    Taoz said:
    Yes, that sounds like it's worth it - depending on the price of course.

    Oh, it'll be insane... but then, you always pay a premium for that cutting-edge stuff. But let's see what the next-gen Xeons do... because if one is good, two's better :)

    Wait a few years, then they'll probably be using them in mobile phones. :)

  • Ghosty12 Posts: 1,955
    edited June 2018

    One other thing everyone has to consider: now that cryptocurrency mining is the big thing, I would not be surprised to see the new GPUs that are due sometime in the second half of this year skyrocket in price when the miners get hold of them.

    But the one thing is that you want a relatively decent CPU, as well as other components, so that there is no bottlenecking in the system. And on a completely different note, Intel have put Optane components onto a DDR4 form-factor PCB that slides right into DDR4 RAM slots; supposed sizes are 256 GB and 512 GB.

    Taoz said:
    Yes, that sounds like it's worth it - depending on the price of course.

    Oh, it'll be insane... but then, you always pay a premium for that cutting-edge stuff. But let's see what the next-gen Xeons do... because if one is good, two's better :)


    If you watch Linus Tech Tips videos on YouTube, he has done some reviews on some of the latest Xeons out there; he says that one of his main gripes with them is the heatsinks being a PITA to install.

  • WendyLuvsCatz Posts: 37,712
    edited June 2018

    It depends on the clock speed too.

    I like the idea of all those render tiles in Carrara, but if it is no faster, it's rather pointless.

    I know, for example, that going from an i5 to an i7 and doubling my threads was no faster.
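
    One possible reason, assuming the i5-to-i7 jump added hyperthreaded logical threads rather than extra physical cores (my assumption, not something stated above): two threads on one core share its execution resources, so throughput gains are usually well under 2x. A minimal Python sketch (assumes the psutil package is installed) just to show the distinction:

```python
# Sketch: physical cores vs. logical threads (requires: pip install psutil).
# Hyperthreading raises the logical count, but threads sharing a core add
# far less rendering throughput than an equal number of physical cores.
import psutil

print("physical cores:", psutil.cpu_count(logical=False))
print("logical threads:", psutil.cpu_count(logical=True))
```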

  • Daz Jack Tomalin Posts: 13,085

    I missed the release of the Xeon Platinum 8180 - that's the same 28 cores but at 2.5 GHz... and it retails around $10k... so yeah, this is looking to be somewhat of a publicity stunt to me.

    Let alone how much heat that must be dumping out...

  • Ghosty12 Posts: 1,955

    Came across this video from Linus Tech Tips, showing what he reckoned was a 28-core processor. The interesting thing is that the board used for this "classified" CPU looks to be the same as some of the newer-model Xeon boards: if you look at the upper-left corner area of the board, you can see a large empty area where, on the WS Sage boards, the second Xeon CPU would go.

  • Mistara Posts: 38,675

    28 cores?!!

  • tj_1ca9500b Posts: 2,047

    Yeah, the more I learn about this 28-core Intel processor, the less impressed I am with it, and the more it reads like a publicity stunt to distract us from more 'realistic' options. I'd imagine that AMD could get the upcoming 32-core Threadripper to 5 GHz with unconventional cooling as well if they wanted to. Heck, with LN2 the Ryzen 2700X has been pushed to 6 GHz already, so by comparison, 5 GHz on Threadripper with exotic cooling methods doesn't seem that far-fetched.

    The 28-core Intel production model will likely be much slower than 5 GHz, or so expensive that only a very tiny fraction of the market will even consider it, especially if there are more affordable and more 'conventional' options out there. But it's nice to see both companies pushing the core counts, at least.

    For 3Delight and Blender users, yeah more cores good!

  • ebergerly Posts: 3,255

    For 3Delight and Blender users, yeah more cores good!

    Blender uses the GPU I believe. Either NVIDIA or AMD.

    Honestly, I can't understand the need for a 32-core CPU, other than maybe for a few remaining apps. My Ryzen 8-core, 16-thread CPU sits pretty much idle 99% of the time, and certainly never uses all threads (at least intentionally). Heck, when you have virtually thousands of threads in a GPU, a CPU with 64 threads doesn't seem too impressive (to me at least). I'd rather spend my $$ on GPUs and buy a low-end CPU. But I suppose AMD and Intel have to stir up excitement for the next new thing.

  • kyoto kid Posts: 40,515

    Yeah, the more I learn about this 28-core Intel processor, the less impressed I am with it, and the more it reads like a publicity stunt to distract us from more 'realistic' options. I'd imagine that AMD could get the upcoming 32-core Threadripper to 5 GHz with unconventional cooling as well if they wanted to. Heck, with LN2 the Ryzen 2700X has been pushed to 6 GHz already, so by comparison, 5 GHz on Threadripper with exotic cooling methods doesn't seem that far-fetched.

    The 28-core Intel production model will likely be much slower than 5 GHz, or so expensive that only a very tiny fraction of the market will even consider it, especially if there are more affordable and more 'conventional' options out there. But it's nice to see both companies pushing the core counts, at least.

    For 3Delight and Blender users, yeah more cores good!

    ...Carrara as well. 

    Also, as mentioned, if in Daz Iray your scene exceeds your VRAM and it dumps to a low core/thread-count CPU, it's time to go do the laundry, mow the lawn, and wash the car.

  • nonesuch00 Posts: 17,890

    Those that make animations could definitely do with 32, 64, 128, 256 cores... at some point they'd reach the core count needed to do 16K 3-hour animations in a reasonable amount of time. I could be wrong, though, in guessing that TV, monitor, or theatre screen size would never exceed 16K resolution. I don't think movie theatres use digital screens anyway, but digital projectors instead.

  • kyoto kid Posts: 40,515

    ..a little more on this beast from Tom's.

    Yeah, that is an external cooling unit that by itself consumes about 1.7 kW on top of the 1 kW for the CPU, all drawing 20 amps. The electric bill alone to run this would be staggering (rough estimate below).

    https://www.tomshardware.com/news/intel-28-core-processor-5ghz-motherboard,37213.html
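
    A back-of-the-envelope sketch of that electric bill; the ~2.7 kW combined draw comes from the figures above, while the electricity rate and hours per day are assumptions for illustration:

```python
# Rough running-cost estimate for the demo setup (chiller + CPU).
# Power figures are from the Tom's Hardware article; the rate and daily
# usage below are assumptions, not reported numbers.
total_kw = 1.7 + 1.0        # chiller (~1.7 kW) plus CPU (~1 kW)
rate_per_kwh = 0.13         # assumed electricity price, USD per kWh
hours_per_day = 8           # assumed daily render/bench time
monthly_cost = total_kw * hours_per_day * 30 * rate_per_kwh
print(f"~${monthly_cost:.0f} per month")  # about $84/month under these assumptions
```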

  • Hurdy3D Posts: 1,037
    ghosty12 said:

     

    That all depends on whether you're working with 3DL, which some people, myself included, still do. Also, one other thing: Iray does not like SLI, and rumor has it that SLI might be on the way out. And telling folks not to put thought into the CPU is not a good idea, as you still need a half-decent CPU to do everything else.

    Never heard that SLI is on the way out, or that Nvidia's Iray doesn't like it. Does this mean that it is not a good idea to buy two GeForce cards because Iray will no longer support multiple GeForce cards in the future? Are there any sources for these rumors?

  • Ghosty12 Posts: 1,955
    edited June 2018
    gerster said:
    ghosty12 said:

     

    That all depends on whether you're working with 3DL, which some people, myself included, still do. Also, one other thing: Iray does not like SLI, and rumor has it that SLI might be on the way out. And telling folks not to put thought into the CPU is not a good idea, as you still need a half-decent CPU to do everything else.

    Never heard that SLI is on the way out, or that Nvidia's Iray doesn't like it. Does this mean that it is not a good idea to buy two GeForce cards because Iray will no longer support multiple GeForce cards in the future? Are there any sources for these rumors?

    It is true that Nvidia are doing away with SLI, to a point: 3- and 4-way SLI have been quietly done away with, and Nvidia are only supporting 2-way SLI.

    My understanding of it all is that while SLI was great for those that had the money for multiple GPUs, there were not that many applications and games (games mainly) that had SLI support. And anything more than two GPUs was really overkill in a lot of cases. And for those of us using Iray, from my understanding Iray won't work with SLI enabled. You would think that Nvidia would have added SLI support for Iray, since they are the ones that created Iray.

    The articles below are old, so this has been known about for a while.

    https://www.pcworld.com/article/3082708/components-graphics/nvidia-quietly-kills-3-and-4-way-sli-support-for-geforce-gtx-10-series-graphics-cards.html

    https://www.guru3d.com/news_story/nvidia_drops_3_and_4_way_sli_mode_starting_with_geforce_1000_series.html

  • kyoto kid Posts: 40,515
    edited June 2018

    ...that would be a pain as well as cut into their sales.  The whole idea behind multiple cards in gaming rigs is improved frame rate, and for CG rendering, shorter render times.

    Maybe, despite my thoughts, they are going to move to NVLink, though at $900 per pair of widgets to take full advantage, it would be a hard sell, and could again bring GTX cards into direct competition with their more expensive pro-grade ones (imagine having, say, two 12 GB 1180s linked so their VRAM would pool, giving you 24 GB to work with).

    Then again Apple is talking about ditching OpenGL and OpenCL.

  • Ghosty12 Posts: 1,955

    Well, after some more digging around, I found that you can still have 3- and 4-way SLI with the Pascal cards, but (and it's a big but) Nvidia are not officially supporting it: you will need special LED-lit SLI bridges to do so, and you have to download a special enthusiast key to get it to work. And even then there is no guarantee that it will all work.

    https://www.pcworld.com/article/3071332/hardware/its-true-nvidias-geforce-gtx-1080-officially-supports-only-2-way-sli-setups.html

  • Daz Jack Tomalin Posts: 13,085
    edited June 2018

    You don't need SLI to use multiple GPUs in Iray.

    In mine, I do have two cards SLI'd when I want to play a game in 4K (on the rare occasion I get the chance), but otherwise it's disabled, and Iray uses all 4 quite happily.

  • kyoto kid Posts: 40,515

    ...actually, as I have come to understand it, running in SLI while rendering does little if anything to enhance performance, and I believe Nvidia mentions it is best to disable it for that purpose.

  • nonesuch00 Posts: 17,890
    kyoto kid said:

    ...that would be a pain as well as cut into their sales.  The whole idea behind multiple cards in gaming rigs is improved frame rate, and for CG rendering, shorter render times.

    Maybe, despite my thoughts, they are going to move to NVLink, though at $900 per pair of widgets to take full advantage, it would be a hard sell, and could again bring GTX cards into direct competition with their more expensive pro-grade ones (imagine having, say, two 12 GB 1180s linked so their VRAM would pool, giving you 24 GB to work with).

    Then again Apple is talking about ditching OpenGL and OpenCL.

    LOL, Apple is trying to ditch OpenGL & OpenCL. I guess they are loath to see all those cross-platform games Unity, UE4, and the like have coming at them; it levels the playing field with all sorts of cheap $50 Android devices. They did start Metal specifically to try and differentiate themselves from Android and Windows devices more strongly, but Unity & the like have already added Metal support. So since the game engine makers are committed to cross-platform support for Apple devices, Apple's decision pushes them farther and farther from ever competing again with Microsoft in the desktop computer graphics production tools market. Not that they have in a long time anyway.

  • nicstt Posts: 11,714
    OZ-84 said:

    I own a 16-core Threadripper...

    As much as I like the idea of a reasonably priced 28-core CPU, I don't feel it makes any sense for Iray renders. I would bet that even if this were a 50+ core part, it couldn't compare to a 1080 Ti.

    Iray is so built around GPU rendering that it really makes no sense to buy something more powerful than a 4-core.

    I waited ages before going above 4 cores; I regret not taking the plunge earlier (other than I'm glad I didn't support the rip-off prices Intel charged before they had competition again).

    16 cores is what I have now; I am considering the 32-core, and if I can just drop it in, then it's a certainty.

     

    kyoto kid said:

    ...actually, as I have come to understand it, running in SLI while rendering does little if anything to enhance performance, and I believe Nvidia mentions it is best to disable it for that purpose.

    SLI (at best) will not impact renders in any way at all; it is likely to negatively affect rendering performance; as you say, this is from Nvidia.

  • kyoto kid Posts: 40,515
    kyoto kid said:

    ...that would be a pain as well as cut into their sales.  The whole idea behind multiple cards in gaming rigs is improved frame rate, and for CG rendering, shorter render times.

    Maybe, despite my thoughts, they are going to move to NVLink, though at $900 per pair of widgets to take full advantage, it would be a hard sell, and could again bring GTX cards into direct competition with their more expensive pro-grade ones (imagine having, say, two 12 GB 1180s linked so their VRAM would pool, giving you 24 GB to work with).

    Then again Apple is talking about ditching OpenGL and OpenCL.

    LOL, Apple is trying to ditch OpenGL & OpenCL. I guess they are loath to see all those cross-platform games Unity, UE4, and the like have coming at them; it levels the playing field with all sorts of cheap $50 Android devices. They did start Metal specifically to try and differentiate themselves from Android and Windows devices more strongly, but Unity & the like have already added Metal support. So since the game engine makers are committed to cross-platform support for Apple devices, Apple's decision pushes them farther and farther from ever competing again with Microsoft in the desktop computer graphics production tools market. Not that they have in a long time anyway.

    ...yeah, back in the days of Pixel Paint and early Photoshop, the Mac was the way to go for creative artists. Not any more.

  • nicstt Posts: 11,714

    I see Macs as fashion accessories; I used to love them for their innovation, and going their own way. As Kyoto kid said, "Not any more."
