OT: Ryzen is coming out soon!


Comments

  • Takeo.KenseiTakeo.Kensei Posts: 1,303

    @kyoto kid: OpenCL runs on anything if you have the drivers - CPUs and video cards from any vendor. The only catch is that nVidia only ships drivers for an older version of OpenCL, but it could still work.

    Luxrender works on nVidia GPUs without problems. And the old OpenCL version situation seems about to change: https://streamcomputing.eu/blog/2017-02-22/nvidia-enables-opencl-2-0-beta-support/

    However, I'm not sure Lux uses any OpenCL 2.0 features.

    About Ryzen, I'm not sure dual-channel memory is a limiting factor for rendering. The 24-lane PCIe limit, however, is really disappointing if you want to use more than 2 GPUs.

  • kyoto kidkyoto kid Posts: 40,593

    ...however Iray does not recognise OpenCL, only CUDA, so it's a moot point.

  • kyoto kidkyoto kid Posts: 40,593

    ...still, "lottery hardware", considering that to run anything from Daz you need the Windows Server OS, since Linux is not supported by Daz.

  • JCThomas said:

    I bought a Ryzen R7 1800X, but haven't put a build together yet. And I'm sorry to say that I'm not quite sure I'm going to keep it.

    All of the current reviews and benchmarks show it to be an awesome chip, as are the other two models. The lack of 1080p gaming performance couldn't matter less to me, and frankly I think it's silly to be disappointed, the way most reviewers have been, in that regard. The 1800X competes with the 6900K, but what I feel most people are overlooking is that the 1700X and 1700 also compete with the 6900K, performance-wise. Reviews are revealing that all three chips are basically the same, with stricter binning for the more expensive chips. The only Ryzen benchmark that gives me pause for thought is the SATA measurements from TweakTown. But I'd still be willing to overlook that. Anyone rendering in 3Delight should seriously consider a Ryzen chip...it should scale per core as well as Cinebench, and Ryzen killed that benchmark.

    For me, the real problem (and the reason I'm contemplating abandoning an AM4 build) is the lack of quad-channel memory and only 20 PCIe lanes. The CPU performance is outstanding, but the chipset is really lacking in other areas that make a true workstation board. Of course, I bought the chip knowing that already. But it's almost as if Ryzen doesn't know what it's trying to be.

    Interesting. With only 20 PCIe lanes (and thus a max of two GPUs @ x8), it sounds like Ryzen is really limited in terms of being a viable GPU powerhouse.

     

    -P

  • Takeo.KenseiTakeo.Kensei Posts: 1,303
    JCThomas said:

    I bought a Ryzen R7 1800X, but haven't put a build together yet. And I'm sorry to say that I'm not quite sure I'm going to keep it.

    All of the current reviews and benchmarks show it to be an awesome chip, as are the other two models. The lack of 1080p gaming performance couldn't matter less to me, and frankly I think it's silly to be disappointed, the way most reviewers have been, in that regard. The 1800X competes with the 6900K, but what I feel most people are overlooking is that the 1700X and 1700 also compete with the 6900K, performance-wise. Reviews are revealing that all three chips are basically the same, with stricter binning for the more expensive chips. The only Ryzen benchmark that gives me pause for thought is the SATA measurements from TweakTown. But I'd still be willing to overlook that. Anyone rendering in 3Delight should seriously consider a Ryzen chip...it should scale per core as well as Cinebench, and Ryzen killed that benchmark.

    For me, the real problem (and the reason I'm contemplating abandoning an AM4 build) is the lack of quad-channel memory and only 20 PCIe lanes. The CPU performance is outstanding, but the chipset is really lacking in other areas that make a true workstation board. Of course, I bought the chip knowing that already. But it's almost as if Ryzen doesn't know what it's trying to be.

    Interesting. With only 20 PCIe lanes (and thus a max of two GPUs @ x8), it sounds like Ryzen is really limited in terms of being a viable GPU powerhouse.

     

    -P

    It's 24, so that means a max of 3 GPUs at x8 - without an NVMe SSD or any other device that would use PCIe lanes.

    To be fair, that's OK for most users, because I don't think there are that many people who can go tri-GPU.

    A SATA SSD is still quick enough.

  • JCThomasJCThomas Posts: 254
    JCThomas said:

    I bought a Ryzen R7 1800X, but haven't put a build together yet. And I'm sorry to say that I'm not quite sure I'm going to keep it.

    All of the current reviews and benchmarks show it to be an awesome chip, as are the other two models. The lack of 1080p gaming performance couldn't matter less to me, and frankly I think it's silly to be disappointed, the way most reviewers have been, in that regard. The 1800X competes with the 6900K, but what I feel most people are overlooking is that the 1700X and 1700 also compete with the 6900K, performance-wise. Reviews are revealing that all three chips are basically the same, with stricter binning for the more expensive chips. The only Ryzen benchmark that gives me pause for thought is the SATA measurements from TweakTown. But I'd still be willing to overlook that. Anyone rendering in 3Delight should seriously consider a Ryzen chip...it should scale per core as well as Cinebench, and Ryzen killed that benchmark.

    For me, the real problem (and the reason I'm contemplating abandoning an AM4 build) is the lack of quad-channel memory and only 20 PCIe lanes. The CPU performance is outstanding, but the chipset is really lacking in other areas that make a true workstation board. Of course, I bought the chip knowing that already. But it's almost as if Ryzen doesn't know what it's trying to be.

    Interesting. With only 20 PCIe lanes (and thus a max of two GPUs @ x8), it sounds like Ryzen is really limited in terms of being a viable GPU powerhouse.

     

    -P

    It's 24, so that means a max of 3 GPUs at x8 - without an NVMe SSD or any other device that would use PCIe lanes.

    To be fair, that's OK for most users, because I don't think there are that many people who can go tri-GPU.

    A SATA SSD is still quick enough.

    You're right, there are 24 lanes. However, looking into it further, it seems there aren't any motherboards that will run that 3rd PCIe slot at x8 speeds. And the way the slots are spaced out, you wouldn't be able to fit a dual-slot GPU in that bottom slot anyway, unless your case has at least 8 expansion slots. The Ryzen CPU itself delivers 16 PCIe lanes for graphics, which can operate at x16 or x8/x8. Then most motherboards have the PCIe x4 M.2 SSD slot, leaving 4 lanes for the bottom full-length slot. But it seems that even if you leave the M.2 slot empty, that full-length slot still only operates at x4 speeds. That's the case with the most expensive X370 motherboard to date, the MSI X370 Titanium: http://www.eteknix.com/msi-x370-xpower-gaming-titanium-ryzen-motherboard-review/

    And Gamers Nexus explains it a bit here: 

    I agree, it's enough for most. It's just not enough for a GPU rendering workstation.
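The lane budget being described can be sketched as a quick model (the numbers are the ones discussed above and in the linked review; the per-slot behavior is an assumption about a typical X370 board, not a chipset spec):

```python
# Rough model of the AM4/X370 PCIe budget discussed above.
CPU_GRAPHICS_LANES = 16   # from the CPU: x16, or x8/x8 across two slots
M2_LANES = 4              # dedicated PCIe x4 M.2 slot
BOTTOM_SLOT_LANES = 4     # bottom full-length slot, capped at x4

def gpu_link_widths(n_gpus):
    """Approximate per-GPU link widths: two CPU-fed slots at x16 or
    x8/x8; a third GPU can only take the x4 bottom slot."""
    if n_gpus == 1:
        return [16]
    if n_gpus == 2:
        return [8, 8]
    if n_gpus == 3:
        return [8, 8, 4]
    raise ValueError("no slots left for more than 3 GPUs")

print(CPU_GRAPHICS_LANES + M2_LANES + BOTTOM_SLOT_LANES)  # 24 lanes total
print(gpu_link_widths(3))  # [8, 8, 4]
```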

  • nicsttnicstt Posts: 11,714

    Personally, I'm looking for a card to run the monitors, and two for rendering.

  • JCThomasJCThomas Posts: 254
    nicstt said:

    Personally, I'm looking for a card to run the monitors, and two for rendering.

    If you were willing to use this to run your monitors...https://www.newegg.com/Product/Product.aspx?Item=N82E16814127836

    you could probably put it in the bottom slot on an X370 motherboard, then have the stronger GPUs installed in PCIe slots 2 and 5. It would be running at x4 speeds, but that should be plenty to drive the monitors. It's probably not an ideal setup, but if you were dying to build a Ryzen system, this would be one way to achieve what you mention.

  • nicsttnicstt Posts: 11,714

    I currently use a 970 and a 980ti; 2 GB for 3 monitors is probably a stretch (2560x1440).

    I'm looking to add a 1080 Ti to my next build; I'm hoping Ryzen will facilitate that. I don't need an M.2 PCIe SSD - normal SSDs are fine at the moment. I'm wondering, though, what AMD will do for dual-CPU setups; they may implement (it almost seems they have to) more lanes for those motherboards; it is also a good way of differentiating them. I hope they offer value and performance in both.

  • Takeo.KenseiTakeo.Kensei Posts: 1,303
    JCThomas said:
    JCThomas said:

    I bought a Ryzen R7 1800X, but haven't put a build together yet. And I'm sorry to say that I'm not quite sure I'm going to keep it.

    All of the current reviews and benchmarks show it to be an awesome chip, as are the other two models. The lack of 1080p gaming performance couldn't matter less to me, and frankly I think it's silly to be disappointed, the way most reviewers have been, in that regard. The 1800X competes with the 6900K, but what I feel most people are overlooking is that the 1700X and 1700 also compete with the 6900K, performance-wise. Reviews are revealing that all three chips are basically the same, with stricter binning for the more expensive chips. The only Ryzen benchmark that gives me pause for thought is the SATA measurements from TweakTown. But I'd still be willing to overlook that. Anyone rendering in 3Delight should seriously consider a Ryzen chip...it should scale per core as well as Cinebench, and Ryzen killed that benchmark.

    For me, the real problem (and the reason I'm contemplating abandoning an AM4 build) is the lack of quad-channel memory and only 20 PCIe lanes. The CPU performance is outstanding, but the chipset is really lacking in other areas that make a true workstation board. Of course, I bought the chip knowing that already. But it's almost as if Ryzen doesn't know what it's trying to be.

    Interesting. With only 20 PCIe lanes (and thus a max of two GPUs @ x8), it sounds like Ryzen is really limited in terms of being a viable GPU powerhouse.

     

    -P

    It's 24, so that means a max of 3 GPUs at x8 - without an NVMe SSD or any other device that would use PCIe lanes.

    To be fair, that's OK for most users, because I don't think there are that many people who can go tri-GPU.

    A SATA SSD is still quick enough.

    You're right, there are 24 lanes. However, looking into it further, it seems there aren't any motherboards that will run that 3rd PCIe slot at x8 speeds. And the way the slots are spaced out, you wouldn't be able to fit a dual-slot GPU in that bottom slot anyway, unless your case has at least 8 expansion slots. The Ryzen CPU itself delivers 16 PCIe lanes for graphics, which can operate at x16 or x8/x8. Then most motherboards have the PCIe x4 M.2 SSD slot, leaving 4 lanes for the bottom full-length slot. But it seems that even if you leave the M.2 slot empty, that full-length slot still only operates at x4 speeds. That's the case with the most expensive X370 motherboard to date, the MSI X370 Titanium: http://www.eteknix.com/msi-x370-xpower-gaming-titanium-ryzen-motherboard-review/

    And Gamers Nexus explains it a bit here: 

    I agree, it's enough for most. It's just not enough for a GPU rendering workstation.

    That's why an X99 platform is still the best choice for a CPU/GPU rendering station.

    The third slot should only be used for driving the monitor.

     

    nicstt said:

    Personally, I'm looking for a card to run the monitors, and two for rendering.

     

    The best solution is to put a single-slot GPU in the third slot. Many "entry level" Quadros are single-slot.

    https://www.amazon.com/PNY-Video-Graphics-Cards-VCQK620-PB/dp/B00MO4RDBY

    https://www.amazon.com/PNY-NVIDIA-Quadro-K1200-VCQK1200DP-PB/dp/B00UPHAT2C

    https://www.amazon.com/PNY-NVIDIA-Quadro-Graphics-VCQK2200-PB/dp/B00MPXRYAE

    https://www.amazon.com/PNY-QUADRO-M2000-GDDR5-NVIDIA/dp/B01ELL2FTC

    https://www.amazon.com/PNY-Video-Graphics-Cards-VCQM4000-PB/dp/B014J7QXS2

    Elsa also made a single-slot GTX 1050 Ti, but I can't find it outside of Japan for the moment: http://www.mobipicker.com/worlds-thinnest-gtx-1050-ti-uses-single-slot-cooling-system-sale/

    Best value for money may be this one https://www.amazon.com/XFX-RADEON-SINGLE-Graphic-RX-460P4TFG5/dp/B01MU015Q1

    You may get a speed penalty running at x4, but that shouldn't be much of a problem.

  • Unless Pascal has changed that, I believe it is not possible/wise to mix Quadro and GTX cards due to driver conflicts.

  • Takeo.KenseiTakeo.Kensei Posts: 1,303

    Unless Pascal has changed that, I believe it is not possible/wise to mix Quadro and GTX cards due to driver conflicts.

    Just an inf mod to get the Quadro to use the GeForce driver?

  • JCThomasJCThomas Posts: 254

    Unless Pascal has changed that, I believe it is not possible/wise to mix Quadro and GTX cards due to driver conflicts.

    Yeah...if you've somehow saved all of the drivers for both Quadro and GTX cards, you might be able to find a combination that is somewhat stable after installing, uninstalling, and reinstalling the drivers dozens and dozens of times. Totally not worth the headache. Sadly, Ryzen doesn't seem able to compete with X99 when it comes to GPU workstations. Two 1080 Tis on an X370 board would be awesome for sure, but 3 or 4 on an X99 board would be better.

    Just an inf mod to get the Quadro to use the GeForce driver?

    Can you elaborate on this perhaps? Or point us to a tutorial?

  • nonesuch00nonesuch00 Posts: 17,944
    edited March 2017

    I searched on Amazon for pre-built Ryzen systems, and all I could find were overpriced $900 desktops with the most minimal new video card they could install; maybe Ryzen products will be priced more fairly later, but they aren't there yet. There are Intel i5 Gen 7 pre-builts out there with 4 cores and 8 threads from Acer that can be had for less than half the price of those Cybertron Ryzen desktops. So if AMD's strategy was to offer twice the cores for the same price as Intel-powered products, then at twice the cores for twice the price Cybertron has blown it. People who will pay that will quietly build their own PC and avoid the throwaway entry-level video card. $550 - $600 would have been competitive prices, because no matter how cheap you go you still have to add an entry-level video card - something builds with modern Intel CPUs don't have to contend with, and a huge selling point for Intel-powered PCs over AMD.

    Post edited by nonesuch00 on
  • I amused myself last night by customising a system at Scan - 1800X, one 1080 Ti, 64GB RAM, plus fairly standard other stuff. It came to £2,700. I'm not sure I'd dare breathe around a system costing that much.

  • DustRiderDustRider Posts: 2,692
    edited March 2017

    You need to compare Ryzen with i7 6900K systems (8 cores - 16 threads). The i7 6900K processor alone will run about $1,000, and then you have to build the system to match. CPU performance of an i5 won't even come close to a Ryzen 1700.

    Edit: Well, duh! The newest i5s do match up pretty well in single-threaded mode, but the 1700 still clearly wins over the i5 and the i7 6900K/6950K in multithreaded performance (http://wccftech.com/amd-ryzen-7-1700x-6950x-7700k-cpu-benchmarks/). If the Windows 10 issues with Ryzen get resolved, the performance difference will probably increase. Only time will tell. Thanks for pointing out how well the i5s are performing!

    Post edited by DustRider on
  • nonesuch00nonesuch00 Posts: 17,944
    DustRider said:

    You need to compare Ryzen with i7 6900K systems (8 cores - 16 threads). The i7 6900K processor alone will run about $1,000, and then you have to build the system to match. CPU performance of an i5 won't even come close to a Ryzen 1700.

    Edit: Well, duh! The newest i5s do match up pretty well in single-threaded mode, but the 1700 still clearly wins over the i5 and the i7 6900K/6950K in multithreaded performance (http://wccftech.com/amd-ryzen-7-1700x-6950x-7700k-cpu-benchmarks/). If the Windows 10 issues with Ryzen get resolved, the performance difference will probably increase. Only time will tell. Thanks for pointing out how well the i5s are performing!

    Yeah, I know what you are saying, but that is not exactly wowing anyone who would be saving only 100 or 200 dollars versus a much more road-tested and stable Intel 8-core CPU. Should I be paying nearly the same price for the new Ryzen as for the equivalent Intel, when the equivalent Intel has the track record already and already works with Windows 10, and when the case, keyboard, and mouse in the desktop can be had for less than $50 by the desktop manufacturers? That leaves the motherboard and RAM, so how are they building systems nearly as expensive as Intel 8-core desktops? They aren't. They are making much bigger mark-ups on the product than they would with similarly configured Intel CPUs.

    DAZ doesn't win against TurboSquid by offering the badly broken products at excessively high prices that TurboSquid does; they offer superior models, at consumer prices, that almost always work.

    AMD needs to partner with Acer or Asus or some other commodity electronics manufacturer and get desktops with the Ryzen 1700 down into the entry-level Intel desktop price range, if they sincerely think it's a market they can compete in. Happy and impressed home consumers do drive buying decisions for business and government, after all. Of course, they'll need to let folks buy the desktop configured without a video card to compete on price.

  • kyoto kidkyoto kid Posts: 40,593

    ...NVMe SSDs are usually best suited for handling very large data sets, like climate or geologic data - much larger than what we work with.

    I amused myself last night by customising a system at Scan - 1800X, one 1080 Ti, 64GB RAM, plus fairly standard other stuff. It came to £2,700. I'm not sure I'd dare breathe around a system costing that much.

    ...converting to USD ($3,348), I can build my dual 8-core Xeon 128 GB rendering monster, along with upgrading my current work system to 24 GB with a standard 1080, for about the same cost.

  • Robert FreiseRobert Freise Posts: 4,279

    I built one using a 1700X and an Nvidia 1070, 64 GB RAM, a Rosewill case, an EVGA 1000W power supply, a Samsung NVMe SSD, an ASRock motherboard, plus an extra 4TB Seagate HD, for less than $2,000.

    I've tested it against my dual-Xeon system and it actually beats it: in CPU-only by 5 minutes; GPU-only it's 2 min 40 sec faster; with CPU and 1 GPU it's 24 sec faster; but with 2 GPUs it's actually 14 sec slower, which I suspect is due to the PCIe slot difference, as the Xeon slots run at x16 as opposed to the x8 that the AM4 drops to with 2 GPUs.

  • kyoto kid said:

    ...NVMe SSDs are usually best suited for handling very large data sets, like climate or geologic data - much larger than what we work with.

    I amused myself last night by customising a system at Scan - 1800X, one 1080 Ti, 64GB RAM, plus fairly standard other stuff. It came to £2,700. I'm not sure I'd dare breathe around a system costing that much.

    ...converting to USD ($3,348), I can build my dual 8-core Xeon 128 GB rendering monster, along with upgrading my current work system to 24 GB with a standard 1080, for about the same cost.

    You need to knock a sixth off for VAT, since our prices include the sales tax up front. We also tend to pay more than the exchange rate would indicate for a lot of IT stuff, though I don't know if that's the case here.
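As a quick sanity check, the conversion described can be sketched out like this (the exchange rate is an assumption for illustration, chosen to match the $3,348 figure quoted above):

```python
# Remove UK VAT (20% of the net price, i.e. one sixth of the
# VAT-inclusive price), then convert GBP to USD.
price_gbp_inc_vat = 2700.0
price_gbp_ex_vat = price_gbp_inc_vat * 5 / 6   # knock a sixth off
usd_per_gbp = 1.24  # assumed early-2017 rate (2700 * 1.24 ≈ 3348)
price_usd = price_gbp_ex_vat * usd_per_gbp
print(round(price_usd))  # 2790
```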

  • nicsttnicstt Posts: 11,714
    kyoto kid said:

    ...NVMe SSDs are usually best suited for handling very large data sets, like climate or geologic data - much larger than what we work with.

    I amused myself last night by customising a system at Scan - 1800X, one 1080 Ti, 64GB RAM, plus fairly standard other stuff. It came to £2,700. I'm not sure I'd dare breathe around a system costing that much.

    ...converting to USD ($3,348), I can build my dual 8-core Xeon 128 GB rendering monster, along with upgrading my current work system to 24 GB with a standard 1080, for about the same cost.

    You need to knock a sixth off for VAT, since our prices include the sales tax up front. We also tend to pay more than the exchange rate would indicate for a lot of IT stuff, though I don't know if that's the case here.

    Indeed. It's been pretty common that we pay the same figure in pounds as the dollar equivalent. Likely we'll be paying more now.

    ... And I've been doing similar myself at Scan. :)

  • kyoto kidkyoto kid Posts: 40,593
    edited March 2017
    kyoto kid said:

    ...NVMe SSDs are usually best suited for handling very large data sets, like climate or geologic data - much larger than what we work with.

    I amused myself last night by customising a system at Scan - 1800X, one 1080 Ti, 64GB RAM, plus fairly standard other stuff. It came to £2,700. I'm not sure I'd dare breathe around a system costing that much.

    ...converting to USD ($3,348), I can build my dual 8-core Xeon 128 GB rendering monster, along with upgrading my current work system to 24 GB with a standard 1080, for about the same cost.

    You need to knock a sixth off for VAT, since our prices include the sales tax up front. We also tend to pay more than the exchange rate would indicate for a lot of IT stuff, though I don't know if that's the case here.

    ...that would still make the cost almost $2,800 US. So I couldn't get the 1080 for the current system, but I could still upgrade its memory and build the dual-Xeon render monster.

    Post edited by kyoto kid on
  • JCThomasJCThomas Posts: 254

    I don't think anyone is arguing that Ryzen isn't the best bang for the buck in multi-threaded performance. Heck, that doesn't even need to be qualified with "bang for buck"; it's just better than Intel's offerings, including the 6900K. Nor do I think anyone is suggesting (on these forums, I mean) that the single-core performance of Ryzen is bad...it's just objectively not quite as good as Intel's higher-clocked CPUs, which makes sense.

    The problem is that Ryzen and the X370 chipset don't make a great backbone for a GPU render station that something like Iray or Octane could take advantage of. Ryzen takes on the high-end desktop segment with its CPU, but as a system it tackles the mid-range gaming market.

    If Asus could release an X370 WS motherboard with at least one PLX chip, that would be amazing.

    Ryzen will be outstanding for 3Delight, but for Iray it's wasting dollars that could be spent on a GPU upgrade.

    But it could be a cheap way to make a render farm in Carrara...a bunch of R7 1700s on some cheap B350 motherboards might be able to take on super-high-end Xeons.

  • kyoto kidkyoto kid Posts: 40,593
    edited March 2017

    ...yeah, but I'm just getting burned out by the "GPU arms race", as it apparently has become. That is why I am looking at a dual-Xeon, high-memory build. My scenes would require the resources of a Quadro P5000, the price of which would almost cover my dual Sandy Bridge Xeon render system build, which would be useful for Iray, Carrara, RenderMan/3DL, and Vue Infinite. GPU rendering is still relatively new. Its one downside is that you need a lot of video memory and a high core count to get the best performance, which is more expensive than standard physical memory sticks.

    Post edited by kyoto kid on
  • Takeo.KenseiTakeo.Kensei Posts: 1,303
    edited March 2017

    I searched on Amazon for pre-built Ryzen systems, and all I could find were overpriced $900 desktops with the most minimal new video card they could install; maybe Ryzen products will be priced more fairly later, but they aren't there yet. There are Intel i5 Gen 7 pre-builts out there with 4 cores and 8 threads from Acer that can be had for less than half the price of those Cybertron Ryzen desktops. So if AMD's strategy was to offer twice the cores for the same price as Intel-powered products, then at twice the cores for twice the price Cybertron has blown it. People who will pay that will quietly build their own PC and avoid the throwaway entry-level video card. $550 - $600 would have been competitive prices, because no matter how cheap you go you still have to add an entry-level video card - something builds with modern Intel CPUs don't have to contend with, and a huge selling point for Intel-powered PCs over AMD.

    You're not doing a fair comparison. Wait till the Ryzen 1400X is out to compare with an i5.

    And if you really want to compare, I doubt any i5 can run Iray as well as the current Ryzen chips in CPU mode.

     

    JCThomas said:
     

    Just an inf mod to get the Quadro to use the GeForce driver?

    Can you elaborate on this perhaps? Or point us to a tutorial?

    A long time ago, in a galaxy far, far away, it was possible to softmod a GeForce so that you could make it use Quadro drivers and get the performance of a Quadro in professional applications. That stopped with the GeForce 8800 GT, as Nvidia blocked the practice because it was harming its professional GPU market. However, I don't think they blocked the other direction. As Quadro cards often have a similar desktop counterpart, there is a possibility to install a GeForce driver instead of a Quadro driver. Every driver package contains an inf file that tells the installer which driver to install for which hardware. The idea is to modify that inf file to allow the installation of the driver. Basically, you take your GPU's ID and put it in the inf file in place of the desktop counterpart of your Quadro.

    There is a guide here: http://forum.notebookreview.com/threads/quadro-driver-tweak-how-to-make-your-quadro-a-geforce-driver-side.253667/

    As that is only a software modification, the risks are minimal. However, I don't know whether Nvidia prevents the practice or not.

    Doing so, you'll only have to install the GeForce driver once.

    There is also another way to do things, though I don't think I've seen anybody do it: modify the installation path of the Quadro drivers to prevent conflicts. That way you would keep the benefit of the Quadro's specific advantages, if you have some use for them.
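For illustration, the kind of inf edit being described looks roughly like this (the section name and device IDs below are placeholders, not values from a real driver package):

```ini
; Hypothetical excerpt from an NVIDIA driver inf (placeholder IDs).
; A GeForce entry normally looks something like:
;   %NVIDIA_DEV.XXXX% = SectionName, PCI\VEN_10DE&DEV_XXXX
; To install the GeForce driver on a Quadro, duplicate that line and
; swap in the Quadro's device ID (visible in Windows Device Manager):
%NVIDIA_DEV.YYYY% = SectionName, PCI\VEN_10DE&DEV_YYYY
```

Modified inf files break the package's digital signature, so Windows may refuse the install unless driver signature enforcement is relaxed.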

    Post edited by Takeo.Kensei on
  • Kendall SearsKendall Sears Posts: 2,995

    I searched on Amazon for pre-built Ryzen systems, and all I could find were overpriced $900 desktops with the most minimal new video card they could install; maybe Ryzen products will be priced more fairly later, but they aren't there yet. There are Intel i5 Gen 7 pre-builts out there with 4 cores and 8 threads from Acer that can be had for less than half the price of those Cybertron Ryzen desktops. So if AMD's strategy was to offer twice the cores for the same price as Intel-powered products, then at twice the cores for twice the price Cybertron has blown it. People who will pay that will quietly build their own PC and avoid the throwaway entry-level video card. $550 - $600 would have been competitive prices, because no matter how cheap you go you still have to add an entry-level video card - something builds with modern Intel CPUs don't have to contend with, and a huge selling point for Intel-powered PCs over AMD.

    You're not doing a fair comparison. Wait till the Ryzen 1400X is out to compare with an i5.

    And if you really want to compare, I doubt any i5 can run Iray as well as the current Ryzen chips in CPU mode.

     

    JCThomas said:
     

    Just an inf mod to get the Quadro to use the GeForce driver?

    Can you elaborate on this perhaps? Or point us to a tutorial?

    A long time ago, in a galaxy far, far away, it was possible to softmod a GeForce so that you could make it use Quadro drivers and get the performance of a Quadro in professional applications. That stopped with the GeForce 8800 GT, as Nvidia blocked the practice because it was harming its professional GPU market. However, I don't think they blocked the other direction. As Quadro cards often have a similar desktop counterpart, there is a possibility to install a GeForce driver instead of a Quadro driver. Every driver package contains an inf file that tells the installer which driver to install for which hardware. The idea is to modify that inf file to allow the installation of the driver. Basically, you take your GPU's ID and put it in the inf file in place of the desktop counterpart of your Quadro.

    There is a guide here: http://forum.notebookreview.com/threads/quadro-driver-tweak-how-to-make-your-quadro-a-geforce-driver-side.253667/

    As that is only a software modification, the risks are minimal. However, I don't know whether Nvidia prevents the practice or not.

    Doing so, you'll only have to install the GeForce driver once.

    There is also another way to do things, though I don't think I've seen anybody do it: modify the installation path of the Quadro drivers to prevent conflicts. That way you would keep the benefit of the Quadro's specific advantages, if you have some use for them.

    Quadros will run with the GeForce drivers, HOWEVER they will stall horribly.  There are dedicated firmware enhancements within the Quadros that speed up OpenGL and other operations.  Use of the GeForce drivers will trigger the Quadro to try to look up the firmware operations, fail, and then execute the driver code in software (this is especially a problem with software that recognizes the Quadro and tries to use optimized code).  Yes, it is designed that way on purpose.  It is this type of issue that keeps the GeForce series from being "certified" by professional-level software (SolidWorks, Autodesk software, etc.).

    Kendall

  • nonesuch00nonesuch00 Posts: 17,944
    edited March 2017

    The problem is, it wasn't me that did the comparing. It was AMD, and they do indeed sell their Ryzen CPUs for half the price of the comparable 8-core Intel CPU, and yet the Ryzen desktops' price advantage magically almost completely disappears by the time all the other desktop components are added in.

    As for these desktop builders, it's their business, and if they want to take advantage of a niche market of compulsive HW upgraders they are welcome to; I'll just wait to buy until they are priced more like entry-level Intel desktops from Acer.

    Post edited by nonesuch00 on
  • SoneSone Posts: 84

    I put the ready-to-render scene in Slaying the Dragon and switched it to 3Delight. $362 Ryzen 1700, all default settings.

    First run, including optimizing files:
    2017-03-23 20:10:31.947 Total Rendering Time: 3 minutes 29.60 seconds
    Second run:
    2017-03-23 20:17:03.546 Total Rendering Time: 2 minutes 31.62 seconds
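A quick bit of arithmetic on those two runs (using just the times reported above, nothing assumed):

```python
# Compare the two 3Delight render times reported above.
first_run_s = 3 * 60 + 29.60    # 3 min 29.60 s
second_run_s = 2 * 60 + 31.62   # 2 min 31.62 s
speedup = (first_run_s - second_run_s) / first_run_s
print(f"second run was {speedup:.0%} faster")  # second run was 28% faster
```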

     

    slayingthedragon.jpg
    1280 x 691 - 341K
  • nonesuch00nonesuch00 Posts: 17,944
    Sone said:

    I put the ready-to-render scene in Slaying the Dragon and switched it to 3Delight. $362 Ryzen 1700, all default settings.

    First run, including optimizing files:
    2017-03-23 20:10:31.947 Total Rendering Time: 3 minutes 29.60 seconds
    Second run:
    2017-03-23 20:17:03.546 Total Rendering Time: 2 minutes 31.62 seconds

     

    Interesting. Why don't you post your results for the Iray test scene in the other thread in the Commons? Here are my results when I run that scene:

    HP 8450P Elitebook, 3rd Generation i5 with Intel HD Graphics 3000 (2 cores - 4 threads), 16 GB RAM

    Scene rendered as is from the download, no changes.

    2016-12-30 17:56:17.811 Iray INFO - module:category(IRAY:RENDER):   1.0   IRAY   rend info : Received update to 01946 iterations after 7201.491s.
    2016-12-30 17:56:17.819 Iray INFO - module:category(IRAY:RENDER):   1.0   IRAY   rend info : Maximum render time exceeded.

    2016-12-30 17:56:18.407 Finished Rendering
    2016-12-30 17:56:18.507 Total Rendering Time: 2 hours 4.2 seconds

    The rendering dialogue claimed to be about 92% converged when the 2-hour time limit ran out.

  • kyoto kidkyoto kid Posts: 40,593

    The problem is, it wasn't me that did the comparing. It was AMD, and they do indeed sell their Ryzen CPUs for half the price of the comparable 8-core Intel CPU, and yet the Ryzen desktops' price advantage magically almost completely disappears by the time all the other desktop components are added in.

    As for these desktop builders, it's their business, and if they want to take advantage of a niche market of compulsive HW upgraders they are welcome to; I'll just wait to buy until they are priced more like entry-level Intel desktops from Acer.

    ...or do what I did: learn how to build a system. I built my current one for less than two-thirds the price it would have cost as a pre-built or custom build. It took me 18 months in my spare time to learn how, but that turned out to be time wisely spent, as it is knowledge and experience that doesn't just "go away" after you've used it once. As I mentioned, I am now working on the plans for a dual 8-core Xeon high-memory render beast which will cost about the same as a single Quadro P5000 GPU.
