Will AMD GPUs Support Arrive? (New GPU Leak)
LenioTG
Posts: 2,118
I know that Iray doesn't support AMD cards, and that there are many ways to render without an Nvidia GPU (though if I wanted to render CPU-only, I wouldn't need a GPU at all), and I like using Daz Studio (also because I already have a nice library, and I don't want to learn other software).
My question is: will support for AMD GPUs eventually come?
I'm asking because of the leak regarding the new AMD GPUs: the price/performance is too good to ignore!!!
I'd rather buy an RX 3080 (7nm Navi, 8 GB GDDR6, 150 W, as fast as a GTX 1080) at $250 than a GTX 1080, which in my country currently sells for €632 ($720, or 3 times the price for the same performance...but I know the RX 3080 will also cost more here, and that the 1080's price will drop after this release).
So, right now, considering that it's also much easier to get a FreeSync monitor, Daz Studio is the only thing keeping me from switching to AMD GPUs.
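For anyone who wants to check my math, here's the back-of-envelope comparison (the $250 price and the "as fast as a GTX 1080" claim come from the leak, and the exchange rate is approximate, so treat all of it as assumptions):

    # Back-of-envelope price/performance using the leaked numbers (Python).
    rumored_rx3080_usd = 250       # leaked price, unconfirmed
    gtx1080_local_eur = 632        # what the card sells for in my country
    eur_to_usd = 1.14              # approximate exchange rate

    gtx1080_local_usd = gtx1080_local_eur * eur_to_usd
    ratio = gtx1080_local_usd / rumored_rx3080_usd
    print(f"GTX 1080 locally: ~${gtx1080_local_usd:.0f}")                 # ~$720
    print(f"Price ratio: {ratio:.1f}x for roughly the same performance")  # ~2.9x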

Comments
Iray is by Nvidia, and they have locked it to CUDA. There is no OpenCL version of Iray, and I don't think there ever will be.
AMD has put out a competitor to Iray called ProRender, but I also highly doubt that is coming to DAZ Studio.
The other option is to build a multi-CPU system and render CPU-only,
but I doubt it would be cheaper than getting an Nvidia GPU.
Anyhow, it will be interesting to see if anyone can reach the speed of Iray on an Nvidia GPU while rendering in CPU-only mode.
The advantage of such a system is that it can use all available RAM for rendering.
If AMD were to release cheap 64-core 6 GHz processors, then the speed of rendering on the CPU could be similar
to that of a basic Nvidia GPU.
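Rough numbers to put that in perspective. Peak FLOPS are only a crude proxy for path-tracing speed, and the 64-core 6 GHz chip is hypothetical, so this is just a sanity check of the claim:

    # Crude peak-FP32 estimate: cores * clock (GHz) * FLOPs per cycle (Python).
    # Theoretical ceilings only; real renderers get nowhere near peak on CPUs.
    cpu_cores, cpu_ghz, cpu_flops_per_cycle = 64, 6.0, 32   # hypothetical chip, AVX2 FMA
    gpu_cores, gpu_ghz = 1280, 1.7                          # roughly GTX 1060-class

    cpu_gflops = cpu_cores * cpu_ghz * cpu_flops_per_cycle  # ~12300 GFLOPS
    gpu_gflops = gpu_cores * gpu_ghz * 2                    # 1 FMA = 2 FLOPs; ~4350 GFLOPS
    print(f"CPU ~{cpu_gflops:.0f} GFLOPS vs GPU ~{gpu_gflops:.0f} GFLOPS")

So on paper such a chip could indeed keep up with a mid-range card, though GPUs are far better suited to ray tracing in practice.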
That's very sad to hear, let's hope ProRender is coming to Daz Studio!
That is not an option: the whole point is to save money! xD
I've seen that the new CPUs from AMD could have 12 cores and 24 threads (Ryzen 7 3700, for $299).
At the prices Nvidia is charging for 2080 Tis, AMD's Threadripper platform has to start being considered as a possible alternative. I haven't dug through the benchmark thread for someone with a 2990WX who ran the benchmark CPU-only, but I would expect it to underperform a 2080 Ti by a big margin. However, we can expect a Zen 2 Threadripper launch next year, and who knows what the top-end chip's clock speed and thread count will be. If AMD holds the line on pricing and keeps that chip at $2k USD, it might be an option for people unwilling to invest in Nvidia. Note the total platform cost likely wouldn't be lower; it would just change who gets your money.
Do not think for even one minute that a used dual-socket Xeon rig is ever likely to get you performance approaching a 2080 Ti. First, those high-core-count Xeons have very low clock speeds. Next, they run on DDR3 RAM, which is much slower than DDR4 or GDDR6. Combine those with the other details of those chips (anemic cache, low chipset bandwidth, etc.) and the platform makes sense as a budget option, but don't convince yourself it will ever achieve results beyond its capabilities.
I think so, but they still have warehouses full of old GPUs, and their production process is still at 12nm...and since the 2070, 2080 and 2080 Ti are already out, it wouldn't make sense to create a 2060 as fast as a 2070 at a much lower cost...we'll see! Intel certainly had to adjust to the Ryzen CPUs 2 years ago!
To me rendering is a hobby; I do something else for a living, which doesn't require a powerful PC...I'm still on a 1060 3GB, so I can't even think about getting a 2080 Ti or a dual-socket mobo!
It would be cool, though, if Daz Studio had this "ProRender", and we could use any GPU we wanted!
My Vega 64 is waiting for it desperately.
...three options for AMD GPU rendering.
1. LuxCoreRender. [The latest version of Reality (4.3) does not support the new LuxCore engine, only the old LuxRender 1.6, which was abandoned by its developers. There are indications Paolo is working on updating the plugin, but the only word I've seen is "later this year", which is about where we are now, and I'm not sure if that includes the Daz3D version.]
2. Octane 4.0, 2019 release. [As I understand it, Otoy's developers are looking at supporting AMD GPUs through the Vulkan API. Otoy offers both free and paid ($20/month) subscription tracks (the paid track includes the Daz plugin).]
3. The aforementioned AMD ProRender. [Not supported by Daz3D, so you are on your own programming a plugin yourself or having to export to other 3D software like Blender or Unreal Studio.]
But Nvidia could not possibly just sit by and let AMD undercut them by that much. As an aside, while Nvidia is charging a lot for new cards, we do not know what their margins are. The 3rd-party vendors have very low margins, but we do not know what Nvidia's markup is. I personally believe they have a hefty margin. Some people have alleged that Nvidia's Turing prices are set as they are to help clear out Pascal, an opinion I also share. And yes, they have lots of Pascal left, as brand-new 1070s are still coming to market this week. It is pretty much unheard of for GPUs from the LAST generation to still be launching 3 months after a new product launch.
Except for 1080 Tis, which were very quickly snapped up by savvy buyers (including me :) ). I am so very happy I snagged one when I did. I'd like to get another one if eBay prices drop again.
May is a long way off still. It's almost certain that the 2060 will be out by then; it never takes more than 6 months for x60 models to launch. The excess stock should be gone by then as well. I would bet that Nvidia is watching what AMD is doing to prepare their price structure, and possibly even final specs. Many have believed that Nvidia would launch the 2060 as a GTX 2060, with no RT or Tensor cores, because of how stripped down it would be. But maybe they will keep those cores intact instead to ensure every advantage. Tensor cores in particular really can impact game performance, allowing the card to punch well above its normal weight.
Another thing to note is that your computer doesn't have to be a powerhouse for Iray. As long as your GPU is fast and has enough VRAM, that is seriously all you need. Everything else is a bonus that may make Daz itself run better, but only the GPU matters once the render button has been hit.
Your only options are to either buy a high-end Nvidia card or use a different render engine. You could go crazy and build a CPU render machine, but the cost of doing so would be even higher, and it would still be slower than a very good GPU. But all of these things are relative: what you consider fast rendering may be different from others, and how large your scenes are may be very different as well. I think a 1070 Ti is one of the better things going; with 8GB of VRAM, it renders right on par with a regular 1080.
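One practical tip on the VRAM point: you can watch GPU memory from the command line while a scene loads. nvidia-smi ships with the Nvidia driver and these query flags are standard; a minimal sketch in Python:

    # Check free VRAM before hitting render; if the scene doesn't fit,
    # Iray drops the GPU and falls back to CPU-only mode.
    import subprocess

    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=name,memory.used,memory.total",
         "--format=csv,noheader"],
        text=True)
    for line in out.strip().splitlines():
        name, used, total = [s.strip() for s in line.split(",")]
        print(f"{name}: {used} used of {total}")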
AMD has the potential to completely upend the entire computer industry with its Ryzen technology. The ability to make any CPU or GPU nearly infinitely scalable through chiplets and the Infinity Fabric interconnect means they could potentially undercut both Intel's and Nvidia's prices and simply pile on processing power to make up for clock speed deficits until they catch up there. All at a lower cost to the consumer.
What this means for people like us, stuck with a proprietary technology, is that Nvidia is likely to soon feel the same pressure Intel has been feeling for the last two years. That means either lower costs or new innovations to make their products actually worth buying. Daz will also likely start feeling pressure to support ProRender as more Radeon cards enter the market.
A company with a frequently used render engine tried to bring CUDA support to AMD cards.
There was an almost finished build.
Unfortunately, several AMD employees working on the project left AMD in the last quarter of 2017 and the first quarter of 2018.
- - -
Please use Google for more background information on this.
Keywords: "amd cuda employee left"
When a few key people leave this may delay software and technology development in rather significant ways...
- - -
With Apple's recent decisions, OpenCL & OpenGL support is considered "dead" by some as well.
Because of that, focusing on Metal or Vulkan is considered by some to be the only option beyond CUDA.
Check Google for more information on this as well...
- - -
My subjective impression is that Nvidia is actively trying to advance the field of GPU rendering.
Because of that, I find the right thing for me to do is to support Nvidia by purchasing their hardware.
- - -
Thank you for your post, very helpful!
Sadly, as I was saying, rendering is a hobby for me! I'm willing to pay for the Platinum Club etc., but $20 a month for the option to use Daz products in Octane is too much for me!
As for the other options, I'm not able to write code, and as I said, I'd prefer to continue using Daz Studio (also because if I had to buy my library again, it would cover the price difference with Nvidia GPUs...)
Yes, I think this too!
Sadly I only have a 3GB VRAM GPU, but in the future I'm planning to switch to 2160p, so I'll need a better GPU, and if the price/performance ratio is really that different, I don't think I'll choose Nvidia again!
This is what I was thinking when I started this thread!
Okay, I get it: when Daz included Iray, no one had an AMD GPU, and with the mining craze that trend continued, but now it's going to change, so I hope they'll support ProRender!
I didn't know about the employees! Does that mean CUDA is coming to AMD cards too, in the foreseeable future?
Well, they could use Vulkan instead of OpenCL, and they're the only other company making GPUs (for now, at least...Intel said they're going to join this market).
Your opinion is very respectable, supporting companies that innovate! Although, I imagine the number of Nvidia GPUs used for rendering is very small, and maybe they're doing this just as a side project!
Anyway, we'll see if Nvidia reacts to this AMD launch with a serious price drop. But, in any case, it would be cool to have the option to use other GPUs too!
AMD is not and was not developing GPUs with CUDA. Nvidia won't license the patent to them. I have no idea what news article linvanchene found, but I found nothing of the sort. CUDA gives Nvidia a huge leg up in compute performance, which is what those Quadro cards that sell for really eye-popping prices are for. That's where Nvidia makes its money these days.
It is Otoy who is doing the work of trying to make their render engine usable on all GPUs.
Source from March 2016:
https://www.extremetech.com/computing/224599-gpu-computing-breakthrough-cloud-rendering-company-claims-to-run-cuda-on-non-nvidia-gpus
The project referred to in the article hit a dead end when the AMD employees assigned as contact persons for the project left.
Some went on to work for Nvidia, others for Intel...
This is the latest information about AMD support for Octane I was able to find:
"We’ve actually gotten a large chunk of Octane 2019 working on the framework we showed in March (VK/Metal backend), and Octane 2019 releases will ship with more and more pieces we can now offload to AMD/Intel/iOS GPUs, but until we have ported every line of CUDA in Octane, you may still need at least one NV GPU for all features to work. We’ll post regular progress updates as we get further into these releases."
Sources:
https://render.otoy.com/forum/viewtopic.php?f=30&t=69884&p=352494&hilit=amd#p352479
- - -
My subjective impression:
AMD support may or may not work out for some render engines.
Nvidia, AMD and Intel would need to agree to mutually support each other and use common standards to change the situation for the future.
Currently it is 3rd parties like Otoy who must do most of the work of trying to support different GPUs and operating systems.
But even if it does work, it may not provide the same speed increases and memory pooling that Nvidia RTX cards could offer, based on the information available now.
- - -
It might help if Nvidia and DAZ3D would finally share some information about their plans for Iray, so users don't have to search for alternative solutions all over the place...
Daz 3D has supported Nvidia Iray since 2015.
It cannot be expected that existing customers will now switch from Nvidia to AMD after they have spent serious amounts of money on Nvidia cards.
New users should be properly informed on the Daz 3D front page to buy an Nvidia card and not an AMD one.
- - -
As far as I am aware, ProRender works on Nvidia cards as well as AMD; the main obstacle would be material support. There's no need to wait for Daz to support ProRender: the tools to make a new render plug-in for DS are available to all. I don't know if the same is true for ProRender itself, but I thought it was.
Excuse me Richard Haseltine, I haven't understood your post: is it already possible to do this? What are the tools to make a plugin for AMD GPUs?
Material support is the crucial point.
Users have now spent serious money on products built with the Material Definition Language for Iray by Nvidia.
Customers may not be willing to throw all that in the garbage bin when someone decides that ProRender suddenly needs to be supported.
- - -
Otoy invested a huge amount of money in the development of the OctaneRender for DAZ Studio plugin.
And DAZ3D and the published artists ended up not supporting Octane nevertheless.
The users who had invested in Octane since 2013 were left abandoned when Iray was introduced in 2015 as the new default render engine, without prior warning.
- - -
Maybe one or two products could end up in the store
https://www.daz3d.com/octane-hair-shaders-for-carrara
https://www.daz3d.com/tropical-bundle-for-daz-studio-plugin-octanerender
The majority of published artists will very likely say that supporting another material/shader language is too much to ask.
Anyone who creates a ProRender plugin would risk a similar fate.
All those users asking for ProRender support now are very likely the type who want one-click solutions and expect materials to look the same in ProRender.
MDL materials looking identical in ProRender seems unlikely until ProRender supports the Material Definition Language by Nvidia.
- - -
I put this thought into a separate post because the rest turned out tldr anyway:
- - -
Content creators should not need to support many different material or shader languages.
Instead, render engine producers should support as many material and shader languages (like MDL or OSL) as possible.
MDL has been open source since August 2018:
https://blogs.nvidia.com/blog/2018/08/13/open-source-mdl-sdk/
Maybe users interested in ProRender could ask on AMD forums to add support for the Material Definition Language by Nvidia?
- - -
The thing is, render engines handle materials very differently. Having Octane or Reality use MDL just might not be possible, or a lot of MDL's features would do nothing, which would frustrate end users.
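To make that concrete, here is a toy sketch of a one-way material conversion; every name in it is made up (it's not a real MDL or ProRender API), but it shows why features get dropped on the floor:

    # Toy illustration only: converting one material language to another.
    # A target engine keeps the parameters it understands and silently
    # loses the rest, so the render looks different.
    iray_style_material = {
        "base_color": (0.8, 0.6, 0.5),
        "glossy_roughness": 0.3,
        "top_coat_weight": 0.4,       # no equivalent in our toy target
        "translucency_weight": 0.2,   # no equivalent in our toy target
    }
    TARGET_SUPPORTED = {"base_color", "glossy_roughness"}

    converted = {k: v for k, v in iray_style_material.items() if k in TARGET_SUPPORTED}
    dropped = sorted(set(iray_style_material) - TARGET_SUPPORTED)
    print("converted:", converted)
    print("dropped (the look changes):", dropped)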
Thanks Otoy for the plugin, but $20/month is a lot!
So if AMD supported MDL, it would be simpler to add ProRender to Daz Studio?
And wouldn't it be possible to have both engines in the software?
...Kamenko, you mention your GPU has 3 GB; is it perhaps a Radeon HD 7970? I have two of those.
3 GB is an extremely small amount of VRAM for GPU rendering; the bare minimum suggested is 4 GB, with 6 GB or more being optimal.
Now this is where that $20 a month for Octane would be a good choice, as you don't need to spend money on an uber-high-VRAM card: Octane uses what is known as out-of-core memory, meaning that once the GPU's VRAM is exceeded it uses the physical memory on your motherboard without dropping the entire process to the CPU like Iray does (so it is still faster than pure CPU rendering).
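A minimal sketch of the out-of-core idea, purely illustrative (this is not Octane's actual implementation): keep hot textures inside a fixed VRAM budget and page the rest in from system RAM on demand:

    # Illustrative out-of-core texture cache: fixed "VRAM" budget with
    # least-recently-used eviction; everything else lives in system RAM.
    from collections import OrderedDict

    VRAM_BUDGET_MB = 3000                                       # e.g. a 3 GB card
    system_ram = {"tex_a": 1500, "tex_b": 1200, "tex_c": 900}   # texture -> size (MB)
    vram = OrderedDict()                                        # acts as the LRU cache

    def fetch(tex):
        if tex in vram:
            vram.move_to_end(tex)     # mark as recently used
            return
        while sum(vram.values()) + system_ram[tex] > VRAM_BUDGET_MB:
            evicted, _ = vram.popitem(last=False)   # evict least recently used
            print(f"evict {evicted} back to system RAM")
        vram[tex] = system_ram[tex]
        print(f"page in {tex} ({system_ram[tex]} MB)")

    for t in ["tex_a", "tex_b", "tex_c", "tex_a"]:
        fetch(t)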
You can use free Octane rendering in free Unity - no need to pay anything.
It automatically converts materials from Daz Studio if you export Daz assets as .FBX.
...there ya go.
This is the number one reason why it would be a good thing to support AMD: with the RX Vega 56 I can configure 24 GB of VRAM, and ProRender can actually use it. 64 GB is the full software configuration, if the PC has enough RAM. Volta and Turing also support this kind of technology, but not with an x86/64 host CPU. :(
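For reference, the 24 GB figure is just Vega's HBCC arithmetic: the on-card HBM plus a user-configurable slice of system RAM (exact limits depend on your RAM and driver; the numbers below are illustrative):

    # HBCC memory segment arithmetic (illustrative numbers).
    hbm_gb = 8          # RX Vega 56 on-card memory
    ram_slice_gb = 16   # slice of system RAM assigned to HBCC
    print(f"HBCC memory segment: {hbm_gb + ram_slice_gb} GB")   # 24 GB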
3Delight is still included with Daz, and has been with it for far longer than Iray has existed. When Daz introduced Iray, it was a big shift, and every single 3DL material and light preset was now unusable with Iray. And today many products release without any 3DL material presets at all. It is not a trivial task to convert Iray materials to 3DL, which is one reason why fewer PAs include them. So big render engine changes in Daz Studio are not without precedent in spite of what people had invested in. While it can be argued that Iray and GPU rendering offered a variety of improvements over 3DL, the day could come when another render engine provides dramatic benefits over Iray. Octane seems to be doing a lot of things, and if they follow through on their promises then Octane could offer many benefits over Iray.
I've been speculating for a long time that gaming engines could offer huge benefits as that tech matures. Now gaming engines can even support real-time ray tracing. This is a really big deal, and one day soon we may see it become the norm. Epic is doing a lot of wild things now. They have their own vibrant asset store, a hugely popular and powerful game engine, and now they even have their own game store. Epic is making big moves, and I predict they will become more and more of a player. It goes way past games; as I am sure you are aware, there's the famous Star Wars ray-traced demo done in real time. Studios are using gaming engines for animation, and sometimes full animated scenes are being produced with a gaming engine. Epic can be a major player because they have lots of money to do this and make it happen. They made a billion off just one game last year, and they are taking some of that and investing it in new tech. And ray tracing in gaming does not have to rely on Nvidia GPUs. Yes, Nvidia is the only one with it right now, but ray tracing is part of Microsoft DirectX 12 and is not a CUDA application. Turing has hardware optimized for it, but AMD can make their own ray tracing cards if they wish. So there is another reason to root for gaming engines. (Except for the handful of people who refuse to use Windows 10.)
Speaking of Octane and AMD, I don't know what they are doing to try to make CUDA work on AMD cards, but they are doing it on their own, without AMD's sponsorship. And it has taken them a very long time. I remember those articles about CUDA possibly coming to AMD; that was over 2 years ago. They have been promising this for over 2 years.
Do not think gaming engines will replace render engines. They do all sorts of tricks to make their rendering work for them. Many of those shortcuts are fine for a single frame seen for a fraction of a second but simply don't hold up as a still image. Note that the famous ray-tracing demo was set up so that almost all the reflections were off the metal armor. Where were the reflections of her from the surrounding objects?
I didn't say it was coming tomorrow. But just one year ago that demo was absolutely unthinkable. Now that demo can be run on a single Turing GPU. I get the same pushback every time I mention the Star Wars demo, that this or that was not done in real time. That misses the point. I am thinking about the future, not today, and Unreal is evolving very quickly.
Moreover, that video was done in real time. If you want to slow down the frame rate and render single images, the engine can be set up to do that. You can ask Unreal to cast and reflect every object in the scene today if you so choose. Obviously it will not render at whatever the frame rate was in that demo, but it will certainly render faster than Iray will. And you can still record that animation, too; maybe not in real time, but you can bet it will be faster than Iray.
Keep in mind that Iray is not 100% accurate to real life either; it cannot handle something as simple as a flame. How do we render flames in Iray? We have to FAKE them, so don't tell me that gaming engines need to resort to tricks and shortcuts when every Iray user takes shortcuts all the time. Meanwhile, a gaming engine can create much more vibrant fire effects than Iray ever hopes to achieve. Another thing Hollywood loves is special effects, magic particles and the like. Again, Iray cannot handle these with reasonable competence while gaming engines can, so it is logical that Hollywood studios are far more interested in Unreal than Iray. After all, who owns Star Wars? That Disney is looking at Unreal is a big deal in and of itself; Unreal is not making a Disney game (EA has the license), this is a joint venture between Epic and Disney. Why is Disney interested in Unreal at all when they have Pixar's RenderMan? Unreal is being rapidly developed and invested in, and to be blunt, Unreal has been making more strides than Iray has. A few Disney projects have used Unreal for figuring out the "accurate character tolerances" for their scenes, storyboarding in real time, setting up camera shots, and more. They keep getting more familiar with these tools, and as the tools keep getting better, one day a director is going to ask the execs, "Hey, how about we make this feature entirely in Unreal?" It's coming, mark my words.
Both Unreal and Unity have been used in a variety of animated works already. Yes they are more cartoonish right now, but give it time. Do you think the studios using Unity and Unreal right now will suddenly stop using them? Both of these engines will keep getting better, and more and more studios will adopt them.
And one very important thing to consider: which engines got Turing ray tracing support first? Actually, when will Iray get ray tracing?
At the end of the day, the only thing that matters is the end result, and if a gaming engine becomes close enough, then a gaming engine is close enough. In the next couple of years you will begin to see some more realistic animated shorts that are animated entirely with Unreal Engine.
Iray had ray tracing the day it was released. If you mean when will it have support for RTX: keep in mind Iray is not a major profit center for Nvidia. They have a render engine basically to check that box. If they really cared about it, they would have been supporting it and the other GPU-accelerated render engines prior to launch. But they don't, so they didn't. It is also important to keep in mind, if you haven't been following the gaming press, that the actual launch of RTX-enabled games has been a major bust.
"Except for the handful of people who refuse to use Windows 10"
Hmmm.
While the gap may continue to slowly close, as of 2018 the number of PCs running Windows 7 remains greater than the number running Windows 10.
https://netmarketshare.com/operating-system-market-share.aspx?options=%7B%22filter%22%3A%7B%22%24and%22%3A%5B%7B%22deviceType%22%3A%7B%22%24in%22%3A%5B%22Desktop%2Flaptop%22%5D%7D%7D%5D%7D%2C%22dateLabel%22%3A%22Trend%22%2C%22attributes%22%3A%22share%22%2C%22group%22%3A%22platformVersion%22%2C%22sort%22%3A%7B%22share%22%3A-1%7D%2C%22id%22%3A%22platformsDesktopVersions%22%2C%22dateInterval%22%3A%22Monthly%22%2C%22dateStart%22%3A%222017-12%22%2C%22dateEnd%22%3A%222018-11%22%2C%22segments%22%3A%22-1000%22%7D
(Don't worry, it's a safe link)
Given that Windows 10 was released in July 2015 and was offered as a free upgrade to Windows 7 users, coupled with the fact that extended support for Windows 7 ends in about one year, the market share the latter holds is quite remarkable and significant. Of course any estimate of OS use is based on surrogate data and therefore prone to assumptions, but regardless of the absolute accuracy of the numbers, which could go either way, I would hardly call that a "handful" of Windows 7 users.
...not giving up my W7 Pro for an OS that is offered as "a service".