What new Laptop for 3D Modelling, Rendering, Graphics and CAD?

Comments

  • BattleMatrix Posts: 3
    edited March 2019

    creativecortex_354c96ff23,

    Good luck in finding the laptop (custom or pre-built) you need to get your work done. It's good to see the various options in this thread for picking out what you need/want, and some pricing points that you can use as a guide.

    I'm personally not the type to advise for or against something just because of personal experiences. I know it's "to each their own", since we can all have good or bad experiences with hardware, software and people in general, and it's left up to the individual, where possible, to decide what is BEST for him or her.

    At the end of the day it's all about what YOU WANT. As long as you can get a product that suits your needs, you're in a good place.

    Let us know what your final decision is :) It's good to know what people like to buy or stick with when getting a new laptop/desktop.

    Post edited by BattleMatrix on
  • Takeo.Kensei Posts: 1,303
    edited March 2019

    Thanks, I appreciate all the perspectives; some things I would not have thought of. I tend to use my laptop as my main machine: work at work, take it home, work at home. I'm doing that now on my current 17", and I'm not sure how I would feel about shrinking down for creative work, but I'll have a think. I'm not rendering Toy Story 3, just visuals of products.

    If you're more keen on a 17" laptop you can do it the other way:

    Get the other ASUS https://www.laptopsdirect.co.uk/asus-rog-core-i7-8750h-16gb-1tb-256gb-ssd-gtx1070-8gb-17.3-inch-gfull-hd-1-gl703gs-ee071t/version.asp

    and if you need it, swap the SSD for a bigger one like this: https://www.laptopsdirect.co.uk/crucial-mx500-1tb-m.2-ssd-ct1000mx500ssd4/version.asp

    You may have to clone the original OS, or reinstall Windows and reactivate your Windows licence, but that shouldn't be a big deal.

    * Edited to link to an M.2 SSD

    Post edited by Takeo.Kensei on
  • ebergerly Posts: 3,255
    edited March 2019
     

    One issue is your storage. If at all possible, I would swap out that HDD. SSDs generate a lot less heat and have far fewer moving parts. I would bump the SSD to 1TB even if that meant foregoing the HDD altogether.

    Heat dissipation is going to be your biggest issue. Make sure the rig has a decent number of fans, particularly a dedicated GPU fan. Most of the "gaming" laptops will have this.

    I suppose we could get into a long science project comparing actual power usage and heat differences between HDDs and SSDs, but at the end of the day I think it’s pretty much irrelevant.

    Keep in mind we’re talking about a laptop that’s using a CPU that draws maybe 30-50 watts, and a GPU that draws something like 80-115 watts (for a mobile GTX 1070, according to NVIDIA). And the power difference between an SSD and an HDD might practically be only in the range of something like 5 watts. Does anyone really care about something that minor?

    That entire power usage difference can be erased by running the GPU a little longer during a render, or just by manufacturing tolerances in the GPU components. Maybe the GPU is out of spec and is actually drawing 85-120 watts... are you going to return it and find one that’s only drawing 80-115? And the heating effect of 5 watts on your laptop is as negligible as having your room temperature 1 degree higher. Does anyone really frantically shut down their renders as fast as possible, or lower their room temperature as much as possible, just to cut down a tiny bit of power consumption and/or heat generation? Do you tweak and optimize your OS settings to save 5 watts of power?
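To put numbers like that in context, here's a quick back-of-envelope sketch (the wattages are the rough midpoints of the figures quoted above, not measurements, and the 5 W SSD-vs-HDD delta is the assumed value from this discussion):

```python
# Rough midpoints from the figures quoted above (assumptions, not
# measurements): laptop GPU ~100 W, laptop CPU ~40 W.
def power_share(delta_watts, component_watts):
    """Fraction of total draw that a small power difference represents."""
    total = sum(component_watts)
    return delta_watts / total

system = [100, 40]  # GPU + CPU, watts
share = power_share(5, system)
print(f"A 5 W difference is about {share:.0%} of the ~{sum(system)} W CPU+GPU draw")
# → about 4%
```

In other words, the drive's contribution is down in the noise compared to the CPU and GPU.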

    SSDs are nice if you want fast response for the OS and other things. If SSDs can help a lot with your video-editing speed and that’s important to you, then fine, get an SSD. But if you don’t care about that, then HDDs are fine, just like they’ve been fine for decades.

    Just because something might be “better” in some aspects, doesn’t mean that’s important or even relevant to any individual user. IMO, instead of worrying about intricate details of equipment specs, it’s far more useful to actually monitor your equipment as you use it, and note any trends that show your equipment might be starting to overheat. And make sure you’ve cleaned the dust bunnies out, and that the fans are working and not blocked, etc.    

    Post edited by ebergerly on
  • Jason Galterio Posts: 2,562
    ebergerly said:
     

    One issue is your storage. If at all possible, I would swap out that HDD. SSDs generate a lot less heat and have far fewer moving parts. I would bump the SSD to 1TB even if that meant foregoing the HDD altogether.

    Heat dissipation is going to be your biggest issue. Make sure the rig has a decent number of fans, particularly a dedicated GPU fan. Most of the "gaming" laptops will have this.

    I suppose we could get into a long science project comparing actual power usage and heat differences between HDDs and SSDs, but at the end of the day I think it’s pretty much irrelevant.

    Keep in mind we’re talking about a laptop that’s using a CPU that draws maybe 30-50 watts, and a GPU that draws something like 80-115 watts (for a mobile GTX 1070, according to NVIDIA). And the power difference between an SSD and an HDD might practically be only in the range of something like 5 watts. Does anyone really care about something that minor?

    That entire power usage difference can be erased by running the GPU a little longer during a render, or just by manufacturing tolerances in the GPU components. Maybe the GPU is out of spec and is actually drawing 85-120 watts... are you going to return it and find one that’s only drawing 80-115? And the heating effect of 5 watts on your laptop is as negligible as having your room temperature 1 degree higher. Does anyone really frantically shut down their renders as fast as possible, or lower their room temperature as much as possible, just to cut down a tiny bit of power consumption and/or heat generation? Do you tweak and optimize your OS settings to save 5 watts of power?

    SSDs are nice if you want fast response for the OS and other things. If SSDs can help a lot with your video-editing speed and that’s important to you, then fine, get an SSD. But if you don’t care about that, then HDDs are fine, just like they’ve been fine for decades.

    Just because something might be “better” in some aspects, doesn’t mean that’s important or even relevant to any individual user. IMO, instead of worrying about intricate details of equipment specs, it’s far more useful to actually monitor your equipment as you use it, and note any trends that show your equipment might be starting to overheat. And make sure you’ve cleaned the dust bunnies out, and that the fans are working and not blocked, etc.    

    I am not entirely sure what you were getting at here, but admittedly I may have been overstating things when I said that an SSD would generate a lot less heat.

    That being said, my laptop has a GeForce GTX 980M, which was one of the highest-end GPUs you could get in a laptop at the time. The thing draws 120W all by itself. When rendering with DS and pounding on the GPU continuously, unlike what a typical game would do, it generates a lot of heat. I can't imagine that more recent GPUs have improved on this tendency that much.

    My preference for SSDs was threefold:
    1. Less thrashing of moving parts may increase the overall life of the laptop.
    2. My DS library takes up 1.3 TB. When I moved it to SSDs I saw a considerable uptick in response time within DS.
    3. Personal preference, from being on a trip when my HDD decided it had been on one trip too many and collapsed.

    I am not really sure if that answers your questions.

  • ebergerly Posts: 3,255
    edited March 2019
     

    That being said, my laptop has a GeForce GTX 980M, which was one of the highest-end GPUs you could get in a laptop at the time. The thing draws 120W all by itself. When rendering with DS and pounding on the GPU continuously, unlike what a typical game would do, it generates a lot of heat. I can't imagine that more recent GPUs have improved on this tendency that much.

    My point has always been that saying things like "pounding continuously" and "a lot of heat" might sound important and meaningful, but in practice is somewhat meaningless. It assumes that GPUs and other components are only designed for less than 100% utilization. That's just not factually correct. They are designed to operate within a specified temperature range, below any damaging temperatures. And anyone who has monitored their GPUs while doing continuous rendering knows that if the fans are working as designed, the GPU components will stay well below any damaging temperatures, even if they run continuously 24/7. They're designed to protect themselves: the BIOS and other controls make sure that the temperature stays in the safe range. The only time you might have a problem is if the cooling system can't or doesn't operate as designed, and you get into some sort of "thermal runaway". 

    For some reason many don't like to believe that, and prefer to believe that they'll melt if you "pound them continuously". They won't. Unless of course you are doing something dumb to operate them outside of their design range (like overclocking, or blocking the cooling with dust, etc.). Or if you bought a piece of junk in the first place. 

    I just encourage people to step back and look at all of this in context, and not fall victim to all of the pervasive tech paranoia. Look at the real numbers and put them in context. Less power draw by an SSD, while technically true, is pretty much irrelevant. Power supply efficiency, while technically a factor, is pretty much irrelevant. "Continuous pounding" of your electronic components is usually irrelevant unless you're doing something wrong. The list goes on and on.   

     

    Post edited by ebergerly on
  • Jason Galterio Posts: 2,562

    I still fail to understand your point...

    I've had laptops have a "thermal event" and shut down. Not my current laptop, but previous laptops while using DS. And this was due to the render process: it had nothing to do with dust or overclocking. It was simply because the cooling on that laptop was not up to the GPU being under constant load. That's why I moved to a "gaming" laptop designed for that kind of use.

    My current laptop is equipped to throttle down the GPU if the voltage is insufficient (as with the incorrect power brick that the OEM tried to convince me was sufficient), as well as when the GPU reaches a certain temperature. Those are facts.

    That said, I really don't understand the issue you have with my previous messages, so I will just back away from this thread now.

     

  • ebergerly Posts: 3,255
    edited March 2019

    My current laptop is equipped to throttle down the GPU if the voltage is insufficient (as with the incorrect power brick that the OEM tried to convince me was sufficient), as well as when the GPU reaches a certain temperature. Those are facts.

    Yes, those are facts. And virtually every CPU and GPU has the same or similar protective features implemented in the BIOS and drivers, like I said. Engineers include "fan curves" and speed/frequency throttling for that very purpose, to make sure the components don't get damaged if something goes wrong. But that doesn't mean that it WILL go wrong every time you render. Power supplies usually include protective devices to shut off power if the PS gets overloaded. But that doesn't mean your power supply will get overloaded. I've had many computers over decades and never had a PS shut down. It's there just in case. 

    BTW, of course there are exceptions. If you buy a cheap piece of junk that was poorly designed, then you're more likely to have problems. Or maybe you buy a device that, by chance, has a bad component. Or maybe you did something unwittingly to cause a problem. Or maybe it was a quality company that had a bad design for whatever reason. 

    But those are exceptions, and not reflective of what you can usually expect.   

    Post edited by ebergerly on
  • edited March 2019

    Firstly, thanks for all your input folks. I know it's a bit wishy washy watching someone else make a purchasing decision, and you've all been a great help.

    On balance I think I'm likely to settle here (but only if I call them and knock some off the price; they don't include sales tax in their prices, so with the rest of the spec the same it could otherwise be an expensive Thunderbolt port):

    https://www.dell.com/en-uk/work/shop/laptops/dell-g3-17-gaming-laptop/spd/g-series-17-3779-laptop/bn37720 

    The guts are nearly identical to the Asus, but this one comes with a Thunderbolt port, allowing me to connect more things. So I might chuck in a couple of displays.

    I'll order on Monday, last chance to tell me I'm a fool :-D 

    Post edited by creativecortex_354c96ff23 on
  • ebergerly Posts: 3,255

    Nice machine. In fact I've had pretty much a desktop version of that for a few years (a Dell with an i7, 16GB RAM, and a GTX 1060) as one of my backup computers, and it's great. And like most of the big names, Dells are generally well designed, so as long as you take care of it you should be fine. Personally I'd download some hardware-monitoring software (like CPUID HWMonitor, GPU-Z, etc.) and check temps occasionally. It's just good practice, IMO, though I can't recall ever having a thermal problem in all my years using computers. And of course make sure you keep the dust bunnies out. 

  • DustRider Posts: 2,880

    I use a laptop for almost all my work, and your latest pick looks like it should serve your needs well, based on what you have been using. My only real concern is that it looks like a "thin" laptop in the photos. If it is, thin laptops do tend to be a bit less efficient at heat dissipation under the heavy use of rendering (thin notebooks don't have as much room for heat sinks). I've always avoided thin laptops for this reason (and they have more limited drive capacities). Maybe thin notebooks have improved in heat dissipation in the last 2-3 years?

  • ebergerly Posts: 3,255
    edited March 2019
    DustRider said:

    I use a laptop for almost all my work, and your latest pick looks like it should serve your needs well, based on what you have been using. My only real concern is that it looks like a "thin" laptop in the photos. If it is, thin laptops do tend to be a bit less efficient at heat dissipation under the heavy use of rendering (thin notebooks don't have as much room for heat sinks). I've always avoided thin laptops for this reason (and they have more limited drive capacities). Maybe thin notebooks have improved in heat dissipation in the last 2-3 years?

    There seems to be a fairly pervasive belief in the tech community that the engineers who design computers just sit around with a 6-pack of beer and toss a coin to decide what to throw into their latest laptop. :)

    In the real world they check the specs of the devices they're planning to use, design a cooling system for the particular enclosure, test it in the lab, design fan curves and other controls to maintain proper temperatures when it's operating, and so on. 

    Aside from that, in recent years there has been a strong push for engineers to develop components with higher efficiency (i.e., that use less power and produce less heat), driven by the desire for smaller and smaller yet more powerful devices. Which is why you can even find a GTX 1060 that will work in a laptop. 

    Now of course there are the vendors who sell cheap junk, but for the brand name sellers it's a pretty good guess that their devices will do fine as long as you operate it as designed and don't do something dumb. 

    You can verify this for yourself by taking your laptop, doing a continuous 24/7 render, while measuring the important temperatures to see that they don't stray into dangerous territory. And if they do stray, then instead of immediately blaming the manufacturer or the fact that it's a laptop, first check to make sure your cooling system is working okay and isn't blocked/plugged, your BIOS and drivers are updated (for fan and other cooling system controls), etc. I did that on a certified piece of junk laptop and the temperatures over a 24 hour period with continuous Blender rendering stayed rock solid in the normal range:

    https://www.daz3d.com/forums/discussion/212106/ot-laptop-render-benchmark-results/p1
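As a practical follow-up to the monitoring suggestion above, here is a minimal sketch of reading GPU temperatures with `nvidia-smi` (assuming an NVIDIA GPU and `nvidia-smi` on the PATH; the parser is split out so it can be tried against canned output without a GPU):

```python
import subprocess

def parse_temps(csv_text):
    """Parse `temperature.gpu` CSV output: one integer per line, in degrees C."""
    return [int(line.strip()) for line in csv_text.splitlines() if line.strip()]

def read_gpu_temps():
    # Query current GPU core temperature(s) via nvidia-smi's CSV output.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_temps(out)

# Example with canned output from a hypothetical two-GPU machine:
print(parse_temps("67\n71\n"))  # → [67, 71]
```

Calling `read_gpu_temps()` in a loop during a long render and logging the values gives you exactly the kind of trend data the post recommends watching.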

    Post edited by ebergerly on
  • Takeo.Kensei Posts: 1,303

    Firstly, thanks for all your input folks. I know it's a bit wishy washy watching someone else make a purchasing decision, and you've all been a great help.

    On balance I think I'm likely to settle here (but only if I call them and knock some off the price; they don't include sales tax in their prices, so with the rest of the spec the same it could otherwise be an expensive Thunderbolt port):

    https://www.dell.com/en-uk/work/shop/laptops/dell-g3-17-gaming-laptop/spd/g-series-17-3779-laptop/bn37720 

    The guts are nearly identical to the Asus, but this one comes with a Thunderbolt port, allowing me to connect more things. So I might chuck in a couple of displays.

    I'll order on Monday, last chance to tell me I'm a fool :-D 

    You're a fool....you may cause the end of the world by doing that.

    But I think you made the right choice for your needs. I don't think there are cheaper 17" laptops with Thunderbolt 3, and that may be good to have in the future, and I'm not just thinking of displays.

     

    ebergerly said:
    DustRider said:

    I use a laptop for almost all my work, and your latest pick looks like it should serve your needs well, based on what you have been using. My only real concern is that it looks like a "thin" laptop in the photos. If it is, thin laptops do tend to be a bit less efficient at heat dissipation under the heavy use of rendering (thin notebooks don't have as much room for heat sinks). I've always avoided thin laptops for this reason (and they have more limited drive capacities). Maybe thin notebooks have improved in heat dissipation in the last 2-3 years?

    There seems to be a fairly pervasive belief in the tech community that the engineers who design computers just sit around with a 6-pack of beer and toss a coin to decide what to throw into their latest laptop. :)

    In the real world they check the specs of the devices they're planning to use, design a cooling system for the particular enclosure, test it in the lab, design fan curves and other controls to maintain proper temperatures when it's operating, and so on. 

    Aside from that, in recent years there has been a strong push for engineers to develop components with higher efficiency (i.e., that use less power and produce less heat), driven by the desire for smaller and smaller yet more powerful devices. Which is why you can even find a GTX 1060 that will work in a laptop. 

    Now of course there are the vendors who sell cheap junk, but for the brand name sellers it's a pretty good guess that their devices will do fine as long as you operate it as designed and don't do something dumb. 

    You can verify this for yourself by taking your laptop, doing a continuous 24/7 render, while measuring the important temperatures to see that they don't stray into dangerous territory. And if they do stray, then instead of immediately blaming the manufacturer or the fact that it's a laptop, first check to make sure your cooling system is working okay and isn't blocked/plugged, your BIOS and drivers are updated (for fan and other cooling system controls), etc. I did that on a certified piece of junk laptop and the temperatures over a 24 hour period with continuous Blender rendering stayed rock solid in the normal range:

    https://www.daz3d.com/forums/discussion/212106/ot-laptop-render-benchmark-results/p1

    I don't think there has been that much progress. It's not really like you think.

    First, I didn't go through all of your linked thread, but there is a big difference between a Core i3 and an i7 with 6 cores.

    And even if progress has been made in heat dissipation, you have to pay the price to get it. For a low-priced gaming laptop, something has to be sacrificed, and I'd guess one of those things will be thermal dissipation capacity.

    The consequence is that the GPU and CPU will both throttle a lot.

    The second consequence is that there is no way any of these parts will be used at full capacity: first, because the machine doesn't supply enough electrical power to run them at full speed, and also because the components that generate the most heat are not allowed to exceed a certain heat limit.

    So in the end you don't need better thermal dissipation; you just limit the maximum heat generated.

     

    A little trick which is a direct consequence of all this: it seems that some people are undervolting their CPU and/or GPU in order to get a bit better performance. The logic is that by undervolting you generate less heat, which somehow prevents too much throttling.

  • ebergerly Posts: 3,255
    edited March 2019

    So are you saying that the OP's new laptop will "throttle a lot" and "there is no way any of those will be used at full capacity"? Do you have any actual data to support any of that? Sounds like you're implying that an i7 with 6 cores in a laptop is going to overheat and/or throttle. So do the designers just throw an i7 into a laptop knowing full well it will throttle under full utilization? 

    Post edited by ebergerly on
  • Takeo.Kensei Posts: 1,303

    Sorry, but I'm not going to search for data because that's time consuming, and I'm not sure anybody has collected such data anyway. Google is your friend.

    And a bit of logic can help. Since Pascal, Nvidia's mobile GPUs are almost the same as their desktop counterparts. Since you can already get throttling with the desktop versions, which have much better cooling systems, there is no way a notebook version won't throttle. If some magical dissipation system existed, you'd see a version of it on desktops instead of systems with triple fans, and people wouldn't go through the hassle of mounting water-cooled systems.

  • nicstt Posts: 11,715

    A laptop, except a stupidly expensive one, is never going to be a satisfactory experience for your uses.

    You could spend $5k or $6k on a laptop or you could get a laptop for much less than half that that would perform substantially better.

    All fair points. Unfortunately it has to be portable because I need to work from different locations and can't lug a desktop around.

    Keep laptop for lugging.

    Buy a desktop for when lugging not needed?

    As stated, laptops offer poor value in comparison to desktops for tasks such as rendering. If it must be a laptop: 16GB RAM (upgrade if you can), and look into using external graphics cards.

    I have a laptop for work; it's a powerful machine, but I work on my own machine when at home - it leaves the laptop in the dust. I prefer the setup - far more comfortable with better monitors, keyboard (mechanical) and a mouse I like. (Although work bought me the same keyboard as I bought for myself at home.)

  • ebergerly Posts: 3,255
    edited March 2019

    Since Pascal, Nvidia's GPU are almost the same as their desktop counterpart.

    No, they're not. The mobile versions run at substantially lower clock speeds and, as a result, lower power dissipation. Google is your friend :)

    For example, a laptop GTX 1070 has, as I posted previously, a power consumption of 80-115 watts, the desktop version has a power consumption spec of 150 watts.

    Post edited by ebergerly on
  • ebergerly Posts: 3,255
    nicstt said:

    I have a laptop for work; it's a powerful machine, but I work on my own machine when at home - it leaves the laptop in the dust. 

    Not sure exactly what you mean by "leaves the laptop in the dust", but I did a quick Google search and found that, at least in Blender, a benchmark scene renders in 135 seconds on a GTX 960M (mobile version) and in 102 seconds on the desktop version. That means the desktop cuts the render time by about 25%. Significant, but a bit less than the typical 35% improvement (or thereabouts) you get from moving up a model of NVIDIA GPU (like from a 1060 to a 1070) in Iray. 
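For what it's worth, the arithmetic behind that roughly-25% figure, using the two Blender times quoted above:

```python
# Blender benchmark times quoted above: GTX 960M (mobile) vs desktop card.
mobile_s, desktop_s = 135, 102
saved = (mobile_s - desktop_s) / mobile_s  # fraction of render time cut
print(f"The desktop card cuts render time by {saved:.1%}")  # → 24.4%
```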

     

  • ebergerly Posts: 3,255
    edited March 2019

    Interesting...

    At this link I found a head-to-head comparison of the GTX 1070 desktop and laptop versions, and they determined that the average of all their benchmark test results showed that the desktop version was only 12% better than the laptop version. And only 14% better across the gaming tests. 

    https://www.notebookcheck.net/GeForce-GTX-1070-Laptop-vs-GeForce-GTX-1070-Desktop_7364_7323.247598.0.html

    So this popular belief that laptops just can't hack it compared to desktops just doesn't match the facts. And the fact that they can perform this well in spite of dissipating significantly less power shows how the technology is improving, like I said previously. 

    Post edited by ebergerly on
  • Paradigm Posts: 423
    edited March 2019

    I have an older version of this with the 32GB RAM option and it works great:

    https://www.newegg.com/Product/Product.aspx?Item=N82E16834233237

    Post edited by Paradigm on
  • nicstt Posts: 11,715
    edited March 2019

    Laptops can match all but the best desktops; that performance comes at a significant cost; this is particularly true when thinking about upgradeability.

    No one here, as far as I can tell, stated this 'popular belief'. I did see a lot of talk about compromises. One has to compromise with a desktop too; a laptop's key strength is that it IS portable, and it may take up less space - unless one starts adding additional monitors, docking stations and desktop keyboards. If I tried that with my laptop, I'd have less space (where it's most important), as the laptop would need to be one of the three monitors I use at home or at work, and would make using a normal keyboard more problematic.

    I like both laptops and desktops; I sure as hell am glad I don't have to carry my desktop around.

    Edit: I am also really pleased I don't have to buy a laptop with the specs of my desktop; this is presuming it is possible.

    Post edited by nicstt on
  • kenshaw011267 Posts: 3,805

    There are lots and lots of benchmarks showing that laptops with true desktop grade HW throttle and throttle hard. Even many of the lappies with high end laptop HW throttle. And when you're talking the gaming laptop end of the spectrum the attempts to keep that HW cool usually results in absurdly loud fan noise to the extent that you pretty much have to use headphones to hear anything over the laptop itself.

    Even in the best-case scenario, to get a decent performance laptop you'll pay 2 or 3 times what an equivalent desktop would cost, and that desktop will have upgrade capabilities that most laptops lack; even the most upgradeable lappies just allow RAM and drive upgrades/expansion. The CPU and GPU you buy with the laptop are the ones you're stuck with.

    I have a lappie provided by my job which is perfectly adequate for the work I need to do on it, a 15" 2018 LG Gram. IIRC the retail price was $1800, and it doesn't have a discrete GPU.

  • Thanks again everyone. Glad to see the Thunderbolt option is recommended. However it works out, my new machine will leave my old one in the dust somewhat, even if desktops are still a bit further ahead. Many thanks all.

  • Takeo.Kensei Posts: 1,303
    edited March 2019
    ebergerly said:

    Interesting...

    At this link I found a head-to-head comparison of the GTX 1070 desktop and laptop versions, and they determined that the average of all their benchmark test results showed that the desktop version was only 12% better than the laptop version. And only 14% better across the gaming tests. 

    https://www.notebookcheck.net/GeForce-GTX-1070-Laptop-vs-GeForce-GTX-1070-Desktop_7364_7323.247598.0.html

    So this popular belief that laptops just can't hack it compared to desktops just doesn't match the facts. And the fact that they can perform this well in spite of dissipating significantly less power shows how the technology is improving, like I said previously. 

    I don't see anything of any value for judging a notebook for 3D modelling and rendering in this comparison. Perhaps SPECviewperf 12 and Cinebench R15 OpenGL, but I wouldn't try to draw a conclusion when the tests weren't even done with the same driver. And since benchmarks are certainly run without pre-heating the laptop, I would not take the results at face value.

    I'll remind you that the context was "low cost" gaming notebooks. And you should also have a closer look at the benchmarks. Look at the stress tests (just before the emissions paragraph) of the HP Omen and see how the GPU clock drops significantly in the FurMark test https://www.notebookcheck.net/HP-Omen-17t-i7-8750H-GTX-1070-Laptop-Review.323450.0.html#toc-emissions

    That's likely what you'll get when rendering. You can also see that with The Witcher the GPU clock doesn't drop much. That aspect is not represented in your head-to-head comparison. So I don't think gaming benchmarks are in any way appropriate for drawing conclusions about 3D rendering. I'd expect at least a 25% speed penalty against a desktop GPU.

    What I stated still stands: without the proper cooling system you may find on expensive machines, there's a good chance that a notebook GPU will suffer a big penalty. The situation can get even worse, in terms of performance, with Max-Q GPUs, which is what the OP bought.

    Now, you can ignore throttling like most reviews do and assume that benchmark results represent what you'll get, but throttling is a reality.

     

    Thanks again everyone. Glad to see the Thunderbolt option is recommended. However it works out, my new machine will leave my old one in the dust somewhat, even if desktops are still a bit further ahead. Many thanks all.

    Happy for you. I didn't bother to stress that the Dell GPU is a Max-Q version because I thought it would outperform your former notebook by a large margin anyway. Your new toy should open new horizons of creativity and productivity.

    Post edited by Takeo.Kensei on
  • Thanks everyone. New laptop arrived. Quick benchmark test on a standard Carrara Global Illumination scene.

    Old laptop - 2 minutes 26 seconds

    New laptop - just 57 seconds

    I can live with that.
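For anyone comparing, the speed-up from those two times works out to roughly 2.6x:

```python
# Carrara Global Illumination benchmark times from the post above.
old_s = 2 * 60 + 26   # 2 min 26 s = 146 s
new_s = 57
print(f"The new laptop is {old_s / new_s:.1f}x faster")  # → 2.6x
```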

  • nicstt Posts: 11,715

    Glad you're happy. The improvement is certainly reasonable.

  • Takeo.Kensei Posts: 1,303

    That's good. I think most of us (me included) didn't realize that you were mostly doing CPU rendering.

    Otherwise we could have looked for a beefier one, but I guess it's too late, and the speed gain must already make you happy.

  • On Carrara, possibly, but that's not all I'm doing. Other renderers use the GPU.

    An interesting point for those following this... on my old quad-core i5 I would get 4 cores visibly rendering at a time. On this new hex-core i7, I get 10 cores visibly rendering at a time. So maybe it is calling on the GPU after all.

  • Takeo.Kensei Posts: 1,303
    Strange... you should have 12 threads rendering. Is Carrara limited to 10?
  • kenshaw011267 Posts: 3,805

    On Carrara, possibly, but that's not all I'm doing. Other renderers use the GPU.

    An interesting point for those following this... on my old quad-core i5 I would get 4 cores visibly rendering at a time. On this new hex-core i7, I get 10 cores visibly rendering at a time. So maybe it is calling on the GPU after all.

    That i5 was a 4-core, 4-thread CPU, so it could render on at most 4 threads at a time. The i7 is 6 cores / 12 threads, so it should be possible to render on 12 threads. Why it's only using 10 may be a limitation of the renderer.
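That core/thread arithmetic can be sketched as follows (the "2 threads per core" is Hyper-Threading on that i7 — the render buckets map to CPU threads, not to the GPU):

```python
def renderer_threads(cores, threads_per_core):
    """Threads a CPU renderer can use: physical cores x SMT threads per core."""
    return cores * threads_per_core

print(renderer_threads(4, 1))  # old quad-core i5, no Hyper-Threading → 4
print(renderer_threads(6, 2))  # new hex-core i7 with Hyper-Threading → 12
```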
