DAZ via AMD


Comments

  • cheefstick Posts: 29

    What have I started lol.

    Thanks for the replies. I think I'm just going to shelve Daz and move on.

  • oddbob Posts: 439

    cheefstick said:

    What have I started lol.

    Thanks for the replies. I think I'm just going to shelve Daz and move on.

    Think of all the fun you'd miss out on if you went to Blender. You'd have to spend your time making stuff or rendering.

  • NylonGirl Posts: 2,209

    Gordig said:

    NylonGirl said:

    What we really need is an unsanctioned driver that makes the AMD card appear to be an nVidia graphics card.

    To what end? Your computer thinking that an AMD card is actually an NVidia card won't magically give your AMD card the proprietary technologies that make Iray acceleration possible.

    I don't think the things that accelerate IRAY are dependent on hardware. I think, for using IRAY, the most important things about the graphics card are the ability to quickly do all of the necessary mathematical calculations and to have enough video RAM to hold all of the data to be processed. Any graphics card should be able to do that. I think the biggest obstacle is that the software is programmed to refuse to work with some hardware. So we need the software to think the nVidia hardware is there.

    When I say "unsanctioned driver", I mean an actual driver that is a layer between the applications and the original graphics card driver. Not just something that spoofs the name of the graphics card. But something that takes the instructions IRAY calls for and sends them to the graphics hardware that is present.
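For readers curious what such a translation layer would even look like, here is a deliberately tiny sketch of the idea, not of any real driver. Every name in it (FrontApi, OtherVendorBackend, make_shim) is invented for illustration; a real CUDA/Iray surface is orders of magnitude larger, and, as replies below note, licensing is at least as big an obstacle as the engineering.

```cpp
#include <cassert>
#include <cstddef>
#include <functional>
#include <vector>

// Hypothetical "front" API that the renderer expects, expressed as a
// table of callables, plus a shim that forwards each call to whatever
// backend hardware is actually present.
struct FrontApi {
    std::function<int(std::size_t)> alloc;  // returns a buffer handle
    std::function<void(int, float)> fill;   // fill a buffer with a value
    std::function<float(int)> read_back;    // read the first element back
};

// Stand-in for the vendor-neutral backend that is really installed.
class OtherVendorBackend {
public:
    int alloc(std::size_t n) {
        buffers_.push_back(std::vector<float>(n, 0.f));
        return (int)buffers_.size() - 1;
    }
    void fill(int h, float v) { for (auto& x : buffers_[h]) x = v; }
    float read_back(int h) { return buffers_[h][0]; }
private:
    std::vector<std::vector<float>> buffers_;
};

// The "unsanctioned driver": binds the front API to the real backend,
// so the caller never knows which hardware serviced the request.
FrontApi make_shim(OtherVendorBackend& b) {
    return FrontApi{
        [&](std::size_t n) { return b.alloc(n); },
        [&](int h, float v) { b.fill(h, v); },
        [&](int h) { return b.read_back(h); },
    };
}
```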

  • Torquinox Posts: 4,260

    cheefstick said:

    What have I started lol.

    Thanks for the replies. I think I'm just going to shelve Daz and move on.

    FWIW, you didn't start it. The issues raised aren't new. It's unusual to see them exposed like this. And, of course, we have the benefit of @TheMysteryIsThePoint bringing high-level software development expertise to the party. I like what I see going on here. It's been a very good exchange. Thanks for bringing it up! It would be even better if good things happened in DS as a result. For now, keep in mind, DS is still a good way to export Daz content to whatever software you move to next.

  • Torquinox Posts: 4,260

    oddbob said:

    cheefstick said:

    What have I started lol.

    Thanks for the replies. I think I'm just going to shelve Daz and move on.

    Think of all the fun you'd miss out on if you went to Blender. You'd have to spend your time making stuff or rendering.

    +10!  laugh

  • Torquinox Posts: 4,260

    NylonGirl said:

    I don't think the things that accelerate IRAY are dependent on hardware. I think, for using IRAY, the most important things about the graphics card are the ability to quickly do all of the necessary mathematical calculations and to have enough video RAM to hold all of the data to be processed. Any graphics card should be able to do that. I think the biggest obstacle is that the software is programmed to refuse to work with some hardware. So we need the software to think the nVidia hardware is there.

    When I say "unsanctioned driver", I mean an actual driver that is a layer between the applications and the original graphics card driver. Not just something that spoofs the name of the graphics card. But something that takes the instructions IRAY calls for and sends them to the graphics hardware that is present.

    If it could be done, it would open new possibilities. I can't say yay or nay to the actual technological requirements for running Iray. Even if you're correct (and you might be!), I suspect there are license limitations on the use of Iray that prevent the existence of a software Rosetta Stone like the one you propose. At least, AFAIK, no one has tried to do it.

  • WendyLuvsCatz Posts: 40,085
    edited March 30

    It's called a wrapper, from what I can gather, and it is in effect what the CPU is doing if you don't have an Nvidia card; they just don't let you use other GPUs to do it.

    So yeah, licensing.

    It gets worse: the Nvidia cards themselves could do more but are limited according to model. I watched videos of people soldering extra memory into lower-end cards; someone in one video upgraded a 2080Ti to 24GB VRAM, doubling its memory. They were obviously very skilled, and the labour involved outweighed the cost of a more powerful card; it's a hobby for them.

    Those chips etc. don't actually cost much to produce; it's the research and technology behind them that you are paying for, setting up the production and so on.

    They could make many more, but they limit different models to create scarcity.

    Another good example is DVDs: the discs cost cents, but what's on them - the software - is the product.

    Post edited by WendyLuvsCatz on
  • Richard Haseltine Posts: 108,072

    Torquinox said:

    NylonGirl said:

    I don't think the things that accelerate IRAY are dependent on hardware. I think, for using IRAY, the most important things about the graphics card are the ability to quickly do all of the necessary mathematical calculations and to have enough video RAM to hold all of the data to be processed. Any graphics card should be able to do that. I think the biggest obstacle is that the software is programmed to refuse to work with some hardware. So we need the software to think the nVidia hardware is there.

    When I say "unsanctioned driver", I mean an actual driver that is a layer between the applications and the original graphics card driver. Not just something that spoofs the name of the graphics card. But something that takes the instructions IRAY calls for and sends them to the graphics hardware that is present.

    If it could be done, it would open new possibilities. I can't say yay or nay to the actual technological requirements for running Iray. Even if you're correct (and you might be!), I suspect there are license limitations on the use of Iray that prevent the existence of a software rosetta stone like the one you propose. At least, AFAIK, no one has tried to do it.

    I strongly suspect that it is not that simple, even if the basic idea were practical - if it were, I would think we wouldn't get these compatibility issues with nVidia's own GPUs every time a new series came out.

  • Richard Haseltine Posts: 108,072

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard, you are presenting as fact something that is pure speculation. What you are saying could not be further from the actual facts of software development. Here are some facts:

    There is nowhere to lay this but at DAZ's feet. And it is 100% something that DAZ could have and should have anticipated, because vendors updating their libraries to require a later ABI is absolutely commonplace. Library vendors have an interest in using the latest compiler, and sometimes the latest compiler requires a later ABI version in order to support, say, an important optimization. This is 100% normal, it happens all the time, continuously, and is to be expected, even. What is not normal is that an application developer, who knows this (or at least should know this), would make development decisions that disallow them from upgrading along with the vendor, creating a tremendous point of risk. If the vendor does not follow a major.minor.subminor versioning scheme (where versions with the same major version number are guaranteed to be binary compatible, i.e. what a major version number is supposed to tell its users), then they are not promising, and it should not be inferred, that there is any binary compatibility between those versions. NVidia did not do this, i.e. 2024.x is rather meaningless, and DAZ could/should have known how vulnerable they were.
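The versioning contract being described can be stated in a few lines of code. This is a generic sketch of the semantic-versioning rule, not anything from the Iray or DS SDKs; note that the rule is only meaningful when the vendor actually promises it, which is exactly what a date-based scheme like 2024.x does not do.

```cpp
#include <cassert>
#include <sstream>
#include <string>

// Under major.minor.patch semantic versioning, two releases promise
// binary (ABI) compatibility if and only if their major numbers match.
struct Version { int major = 0, minor = 0, patch = 0; };

Version parse(const std::string& s) {
    Version v;
    char dot;  // consumes the '.' separators
    std::istringstream in(s);
    in >> v.major >> dot >> v.minor >> dot >> v.patch;
    return v;
}

// True when a plugin built against one release can safely load the other.
bool abi_compatible(const std::string& built_against, const std::string& shipped) {
    return parse(built_against).major == parse(shipped).major;
}
```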

    But even multi-million dollar projects make this same mistake from time to time; I've seen Fortune 100 companies do it twice in my career, i.e. they allow the project to become dependent on a proprietary library (in both instances, it was a toolset called RogueWave) that is beyond their control, and then the vendor, for whatever reason, doesn't update their library. And so the entire project can't use the newer compiler that one important library requires (say, iRay) because another important library requires the old one, and you can't have it both ways in the same application.

    If DAZ is saying that it'll have to wait until DS5, they're saying that they've resolved that chicken-and-egg-like conundrum. Maybe Qt5/6 now provides the functionality that the offending library previously provided, but we can't know.

    There are ways around the problem, like segregating the iRay-dependent code in another application compiled with the new compiler and ABI, and having it communicate with the rest of the app, compiled with the old compiler and ABI, via some inter-process communication method like anonymous pipes (if Windows has those), but that's quite a bit of work if that was not the original architecture.
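As a rough illustration of that workaround (hypothetical types, with a std::stringstream standing in for the actual pipe): because compiled C++ objects cannot safely cross an ABI boundary, the two processes would exchange flattened, serialized messages rather than objects.

```cpp
#include <cassert>
#include <cstdint>
#include <sstream>

// Sketch: the Iray-dependent code lives in a helper process built with
// the new compiler/ABI, and the main app talks to it over a byte pipe.
// Requests are flattened to a plain serialized form on the way through.
struct RenderRequest { std::uint32_t width = 0, height = 0, samples = 0; };

// Main app side: write the request into the pipe as plain text fields.
void write_request(std::ostream& pipe, const RenderRequest& r) {
    pipe << r.width << ' ' << r.height << ' ' << r.samples << '\n';
}

// Helper process side: rebuild the request from the byte stream.
RenderRequest read_request(std::istream& pipe) {
    RenderRequest r;
    pipe >> r.width >> r.height >> r.samples;
    return r;
}
```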

    I am not entirely following - you seem to take exception to my aside saying that nVidia may have had to change the Iray plug-in due to circumstances, then seem to say the same yourself. My aside was just an "it may be just one of those things, not a deliberate choice" to try to avoid any finger pointing, and I don't see anything in your reply that actually conflicts with that. The comment on versioning (2024.0 to 2024.1) was purely me, and may or may not be beside the point.

    If I took exception it was because you failed to explicitly recognize that a library vendor upgrading their compiler, and that occasionally means their ABI, is not some Black Swan event, but rather the constant heartbeat of progress in Computer Science. It is to be expected and must be accounted for.

    And thinking that "2024.0 to 2024.1" may be "beside the point" is indicative. It is not beside the point; it is the point. It is NVidia reserving the right to break the ABI at any time. A guarantee of stability would have taken the form of iRay version x.y.z where, as long as the major version is x, the ABI is guaranteed not to have changed.

    It is beside the point because it was me inserting a passing thought, not any kind of official Daz line. It is, however, somewhat relevant that nVidia did, manifestly, intend to support the 50x0 series in a version of Iray that would work with DS. This is an abrupt change, not something announced in advance (as nVidia usually does, for example when announcing that a line of GPUs is deprecated and will not be supported in future versions of Iray).

    Changing the ABI inconveniences users, and Daz has tried to avoid doing it.

    No. A thousand times, no. Changing the ABI makes literally everything better. It's how software evolves to take advantage of all the advances in the evolution of Computer Science as a field. The apologetics that I objected to are that DAZ isn't trying to avoid doing it for the benefit of their users; it is that DAZ has created an ecosystem with policies that make it impossible to do. This is not speculation. Exhibit #1: an SDK that exposes static header files to link directly against, guaranteeing that plugins break. As a plugin developer myself, this is awesome in certain ways (you just #include the headers, link the provided libraries and you're off to the races), but it requires the plugin developer to recompile his code against the new SDK whenever certain changes are made that alter the order of the symbols in those headers. That is why the SDK lags behind DAZ Studio development, and plugin devs wonder why certain things are in the scripting but not the SDK: the scripting language, being an interpreted language, has a dynamic interface, and so moving things around doesn't break anything. DAZ doesn't want to force plugin developers to have to recompile their code, as some of them aren't able to do so. The solution on Windows would have been a component object model like Microsoft's COM, but that would have raised the bar of technical expertise necessary to write a plugin. I admit that it's kind of nice to just compile, link, and go.
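The fragility being described is easy to demonstrate in miniature. The structs below are invented stand-ins, not real SDK types: a plugin compiled against the V1 header bakes in byte offsets that silently become wrong when the host inserts a member.

```cpp
#include <cassert>
#include <cstddef>

// A plugin's compiled code hard-wires member offsets from the header it
// saw at build time. If the host later inserts a member (V2 below),
// every offset after the insertion point moves, and an old plugin
// binary now reads the wrong bytes - without any compile-time warning.
struct NodeV1 {
    int id;
    float opacity;
};

struct NodeV2 {
    int id;
    int flags;      // newly inserted member shifts everything after it
    float opacity;
};
```

This is also why a dynamic, name-based interface (like DAZ Script's) survives such changes while a statically linked plugin does not.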

    I will assume you joined too late to recall the days I mentioned earlier, when the SDK did get updated and people did have to reinstall plug-ins? Daz has learned the hard way that real end-users want stability, and so it pursues a development strategy that delivers it. Remember, too, that plug-in developers are mostly single-person teams (or at least, single-coder).

    Richard, I'm going to say this as directly as I can, OK? Because instead of understanding the facts of software development, you keep defending objectively bad technical decisions. So, here we go: If DAZ wanted to give end users the stability they crave, the last thing they should have done was to publish header files in the SDK. Why? Because it guarantees that certain changes to the API will break the plugins that are compiled against them. If you peruse the SDK forum, you'll have noticed that there is no C++ API for some of the latest goodness in DAZ Studio. This is the reason why.

    The point I raised that you seem to be ignoring is that this problem has already been solved; DAZ just didn't utilize the solution. The technique is called the Component Object Model, and Microsoft has had their flavor of it for decades. They call it COM. With it, client apps can query the component for the interfaces they want to use, and build them dynamically at run time, much like DAZ Script does. Many other facilities, like DCE RPC, ONC RPC, CORBA, and JavaBeans, do this. DAZ Studio doesn't, and here we are.
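A toy version of that COM-style pattern, with invented interface and class names (real COM uses IUnknown::QueryInterface with GUIDs rather than strings):

```cpp
#include <cassert>
#include <string>

// The client never links against the component's concrete layout; it
// asks at run time for an interface by name and receives an abstract
// pointer, so the component's internals can change freely without
// recompiling clients.
struct IRenderer {
    virtual int render() = 0;
    virtual ~IRenderer() = default;
};

class Component {
public:
    // Returns the requested interface, or nullptr if unsupported.
    void* query_interface(const std::string& iid) {
        if (iid == "IRenderer") return static_cast<IRenderer*>(&impl_);
        return nullptr;
    }
private:
    struct Impl : IRenderer {
        int render() override { return 42; }  // arbitrary demo result
    } impl_;
};
```

The key property: the client compiles against the abstract IRenderer only, never against Component's layout.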

    When I search for Conponent Object Model Qt Framework, one of the first results, after the typos are corrected, is https://doc.qt.io/qt-6/object.html - which is precisely the method offered in numerous threads in the SDK forum for accessing features that have yet to make it into the SDK.
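Qt's actual mechanism there is the meta-object system (e.g. QMetaObject::invokeMethod resolving a method by name at run time). Since a Qt build is out of scope here, this toy registry with invented names models only the lookup idea:

```cpp
#include <cassert>
#include <functional>
#include <map>
#include <string>

// Resolving a method by name at run time, rather than by compile-time
// layout, lets a caller use features the static headers never exposed.
class MetaObjectish {
public:
    void register_method(const std::string& name, std::function<int(int)> f) {
        methods_[name] = std::move(f);
    }
    // Returns false for an unknown method instead of failing to link.
    bool invoke(const std::string& name, int arg, int* out) const {
        auto it = methods_.find(name);
        if (it == methods_.end()) return false;
        *out = it->second(arg);
        return true;
    }
private:
    std::map<std::string, std::function<int(int)>> methods_;
};
```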

    We don't, and won't, know what the initial design considerations a couple of decades ago, or a good decade ago for the move to DS4, were. It is quite possible that a different choice early on might have offered benefits, but I am sure it would also have had drawbacks - apropos of this discussion, creating binary plug-ins seems to be a minority sport for every application I have had experience with - but switching frameworks or architectures would hardly be a speedy process, and would probably freeze development of what we have while it was proceeding.

    That doesn't mean they cannot make changes to update the ABI, and indeed the next major version will (as did DS 3 and DS 4) but, as noted in my previous reply, such changes are a pain for end users and also for the small or single-person teams that develop most third-party add-ons (Daz always updates its own plug-ins, even for minor versions, so clearly they will have no great trouble). Over the Daz Studio lifetime this policy has been friendly towards users and add-on developers, rather than the reverse. Now we have a significant breakage (though I have to say that as yet the wait for 50x0 support seems to have been shorter than the wait for some previous generations); however, based on the limited evidence available to us it does look as if the timing was something of a surprise (again, nVidia did release a version of Iray that would have supported the 50x0 architecture in 4.x.x.x, so this situation developed in a very short period).

    Changing the ABI is not something that end users should ever even be aware of. If the manufacturer of the car you drive decides to change the fuel injectors, should you know or care? If it is a pain for end users, it is the result of poor technical choices, one of which I've described above.

    And do third-party components survive such updates? Do simpler products than cars, which allow end-user maintenance, use custom parts that change with each model? Some do, but I suspect they are not well-regarded as a result (think ink cartridges - and think how the public views the ink-cartridge manufacturers). Daz Studio needs to support, and maintain compatibility with in as far as it can, a very wide variety of content of various types - cars just need to be sticker-friendly.

    You are (perhaps) inadvertently further proving my point. Having "to support, and maintain compatibility with in as far as it can, a very wide variety of content of various types" is pretty much how every book on Component Architecture begins, verbatim, by stating the problem that Component Architecture is meant to solve. If that was DAZ's aim, then they chose the absolute worst way to try to implement it that I can think of, the technique that would make it absolutely impossible to achieve: making public the header files that describe C++ code's static interfaces.

  • background Posts: 589
    edited March 30

    I would think, as time goes on, users will divide into two (or more) camps: the "I've got a 50xx series card and I absolutely need a version of Studio that lets me render with IRAY on it, even at the expense of old plugins not working, and I want to buy new content for it" camp, and the second camp of "I've got a lot of old plugins that are critical to my workflow and I absolutely need a version of Studio that lets me render with IRAY on my pre-50xx series GPU, and I want to buy new content for it". I don't see a way to make both these groups of users happy without Studio splitting into two parallel lines of development.

    If DAZ offered two versions of Studio (pre-50xx and 50xx-plus), then each user could decide which one best meets their needs.

    I'm sure there would also be a camp of "I've got a 50xx series card and my old plugins are critical to my workflow", but it doesn't seem likely that these users can be accommodated.

    Post edited by background on
  • Torquinox Posts: 4,260
    edited March 30

    background said:

    I would think, as time goes on, users will divide into two (or more) camps: the "I've got a 50xx series card and I absolutely need a version of Studio that lets me render with IRAY on it, even at the expense of old plugins not working, and I want to buy new content for it" camp, and the second camp of "I've got a lot of old plugins that are critical to my workflow and I absolutely need a version of Studio that lets me render with IRAY on my pre-50xx series GPU, and I want to buy new content for it". I don't see a way to make both these groups of users happy without Studio splitting into two parallel lines of development.

    With the obvious caveat that the pre-50xx people will eventually find themselves stuck for a lack of hardware. Quantities of cards are finite, lifespan of those cards is also finite. Technology is a treadmill. Eventually, we all fall off. At least we can see this one coming. It makes sense to start planning for that. The Poser crowd already went through the same thing.

    Post edited by Torquinox on
  • background Posts: 589
    edited March 30

    Torquinox said:

    background said:

    I would think, as time goes on, users will divide into two (or more) camps: the "I've got a 50xx series card and I absolutely need a version of Studio that lets me render with IRAY on it, even at the expense of old plugins not working, and I want to buy new content for it" camp, and the second camp of "I've got a lot of old plugins that are critical to my workflow and I absolutely need a version of Studio that lets me render with IRAY on my pre-50xx series GPU, and I want to buy new content for it". I don't see a way to make both these groups of users happy without Studio splitting into two parallel lines of development.

    With the obvious caveat that the pre-50xx people will eventually find themselves stuck for a lack of hardware. Quantities of cards are finite, lifespan of those cards is also finite. Technology is a treadmill. Eventually, we all fall off. At least we can see this one coming. It makes sense to start planning for that. The Poser crowd already went through the same thing.

    Over time the benefits of legacy systems are overtaken by the features of new systems; how this balances out depends on the individual user. You will always get people who value legacy equipment highly and may be prepared to pay a premium for it. For example, if you want a legacy WW2 Tiger tank in running order, then you need deep pockets (so far as I know the only one is at the Bovington tank museum).

    Post edited by background on
  • Torquinox Posts: 4,260

    background said:

    Over time the benefits of legacy systems are overtaken by the features of new systems; how this balances out depends on the individual user. You will always get people who value legacy equipment highly and may be prepared to pay a premium for it. For example, if you want a legacy WW2 Tiger tank in running order, then you need deep pockets (so far as I know the only one is at the Bovington tank museum).

    That's not exactly an apples-to-apples comparison. I don't think the average Daz hobbyist is playing in that sort of playground. And this is not a game any of us asked to play. It's just the way of things as they turned out.

  • ArtRoulade Posts: 44

    I have just seen that support for QT 5.15 will expire on 25th May 2025. Can we derive anything from this for DS5? Perhaps a release is closer than we think.

  • TorquinoxTorquinox Posts: 4,260
    edited March 30

    Fixed! cool

    Post edited by Torquinox on
  • wsterdanwsterdan Posts: 3,061

    ArtRoulade said:

    I have just seen that support for QT 5.15 will expire on 25th May 2025. Can we derive anything from this for DS5? Perhaps a release is closer than we think.

    Can we derive anything from this for the next major release of DAZ Studio? I could be wrong but I *think* DAZ Studio is currently using Qt 4.8.7, which hasn't been supported since 2015. I *wish* we'd been using Qt 5.x... 

  • Richard HaseltineRichard Haseltine Posts: 108,072
    edited March 31

    ArtRoulade said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard, you are presenting as fact something that is pure speculation. What you are saying could not be further from the actual facts of software development. Here are some facts:

    There is nowhere to lay this but at DAZ's feet. And it is 100% something that DAZ could have and should have anticipated, because vendors updating their libraries to require a later ABI is absolutely commonplace. Library vendors have an interest in using the latest compiler, and sometimes the latest compiler requires a later ABI version in order to support, say, an important optimization. This is 100% normal, it happens all the time, continuously, and is to be expected, even. What is not normal is that an application developer, who knows this (or at least should know this), would make development decisions that disallow them from upgrading along with the vendor, creating a tremendous point of risk. If the vendor does not follow a major.minor.subminor versioning scheme (where versions with the same major version number are guaranteed to be binary compatible, i.e. what a major version number is supposed to tell its users), then they are not promising, and it should not be inferred, that there is any binary compatibility between those versions. NVidia did not do this, i.e. 2024.x is rather meaningless, and DAZ could/should have known how vulnerable they were.

    But even multi-million dollar projects make this same mistake from time to time; I've seen fortune 100 companies do it twice in my career, i.e. they allow the project to become dependent on a proprietary library (in both instances, it was a toolset called RogueWave) that is beyond their control, and then the vendor, for whatever reason, doesn't update their library. And so the entire project can't use the newer compiler that one important library requires (say, iRay) because another important library requires the old one, and you can't have it both ways in the same application.

    If DAZ is saying that it'll have to wait until DS5, they're saying that they've resolved that chicken-and-egg-like conundrum. Maybe Qt5/6 now provides the functionality that the offending library previously provided, but we can't know.

    There are ways around the problem, like segregating the Iray-dependent code in another application compiled with the new compiler and ABI, and having it communicate with the rest of the app (compiled with the old compiler and ABI) via some inter-process communication method like anonymous pipes (which Windows does have), but that's quite a bit of work if that was not the original architecture.

    I am not entirely following - you seem to take exception to my aside saying that nVidia may have had to change the Iray plug-in due to circumstances, then seem to say the same yourself. My aside was just an "it may be just one of those things, not a deliberate choice" to try to avoid any finger pointing, and I don't see anything in your reply that actually conflicts with that. The comment on versioning (2024.0 to 2024.1) was purely me, and may or may not be beside the point.

    If I took exception it was because you failed to explicitly recognize that a library vendor upgrading their compiler, and that occasionally means their ABI, is not some Black Swan event, but rather the constant heartbeat of progress in Computer Science. It is to be expected and must be accounted for.

    And thinking that "2024.0 to 2024.1" may be "beside the point" is indicative. It is not beside the point; it is the point. It is NVidia reserving the right to break the ABI at any time. A guarantee of stability would have taken the form of Iray version x.y.z where, as long as the major version is x, the ABI is guaranteed not to have changed.

    It is beside the point because it was me inserting a passing thought, not any kind of official Daz line. It is, however, somewhat relevant that nVidia did, manifestly, intend to support the 50x0 series in a version of Iray that would work with DS. This is an abrupt change, not something announced in advance (as nVidia usually do, for example when announcing that a line of GPUs is deprecated and will not be supported in future versions of Iray).

    Changing the ABI inconveniences users, and Daz has tried to avoid doing it.

    No. A thousand times, no. Changing the ABI makes literally everything better. It's how software evolves to take advantage of all the advances in the evolution of Computer Science as a field. The apologetics that I objected to is that DAZ isn't trying to avoid doing it for the benefit of their users; it is that DAZ has created an ecosystem with policies that make it impossible to do it. This is not speculation. Exhibit #1: an SDK that exposes static header files to link directly against, guaranteeing that plugins break. As a plugin developer myself, this is awesome in certain ways (you just #include the headers, link the provided libraries and you're off to the races), but it requires the plugin developer to recompile his code against the new SDK whenever certain changes are made that alter the order of the symbols in them. That is why the SDK lags behind DAZ Studio development and plugin devs wonder why certain things are in the scripting but not the SDK: the scripting language, being an interpreted language, has a dynamic interface and so moving things around doesn't break anything. DAZ doesn't want to force plugin developers to have to recompile their code as some of them aren't able to do so. The solution on Windows would have been a Component Object Model like Microsoft's COM, but that would have raised the bar of technical expertise necessary to write a plugin. I admit that it's kind of nice to just compile, link, and go.

    I will assume you joined too late to recall the days I mentioned earlier, when the SDK did get updated and people did have to reinstall plug-ins? Daz has learned the hard way that real end-users want stability and so they pursue a development strategy that delivers it. Remember, too, that plug-in developers are mostly single-person teams (or at least, single-coder).

    Richard, I'm going to say this as directly as I can, OK? Because instead of understanding the facts of software development, you keep defending objectively bad technical decisions. So, here we go: If DAZ wanted to give end users the stability they crave, the last thing they should have done was to publish header files in the SDK. Why? Because it guarantees that certain changes to the API will break the plugins that are compiled against them. If you peruse the SDK forum, you'll have noticed that there is no C++ API for some of the latest goodness in DAZ Studio. This is the reason why.

    The point I raised that you seem to be ignoring is that this problem has already been solved; DAZ just didn't utilize the solution. The technique is called Component Object Model, and Microsoft has had their flavor of it for decades. They call it COM. With it, client apps can query the component for the interfaces it wants to use, and build them dynamically at run time, much like DAZ Script does. Many other facilities, like DCE RPC, ONC RPC, CORBA, and Java Beans do this. DAZ Studio doesn't, and here we are.

    When I search for Component Object Model Qt Framework one of the first results, after correcting typos, is https://doc.qt.io/qt-6/object.html - which is precisely the method offered in numerous threads in the SDK forum for accessing features that have yet to make the SDK.

    We don't, and won't, know what the initial design considerations a couple of decades ago, or a good decade ago for the move to DS4, were. It is quite possible that a different choice early on might have offered benefits, but I am sure it would also have had drawbacks - apropos this discussion, creating binary plug-ins seems to be a minority sport for every application I have had experience with - but switching frameworks or architectures would hardly be a speedy process, and would probably freeze development of what we have while it was proceeding.

    That doesn't mean they cannot make changes to update the ABI, and indeed the next major version will (as did DS 3 and DS 4), but, as noted in my previous reply, such changes are a pain for end users and also for the small or single-person teams that develop most third-party add-ons (Daz always updates its plug-ins, even for minor versions, so clearly they will have no great trouble). Over the Daz Studio lifetime this policy has been friendly towards users and add-on developers, rather than the reverse. Now we have a significant breakage (though I have to say that as yet the wait for 50x0 support seems to have been shorter than the wait for some previous generations); however, based on the limited evidence available to us it does look as if the timing was something of a surprise (again, nVidia did release a version of Iray that would have supported the 50x0 architecture in 4.x.x.x, so this situation developed in a very short period).

    Changing the ABI is not something that end users should ever even be aware of. If the manufacturer of the car you drive decides to change the fuel injectors, should you know or care? If it is a pain for end users, it is the result of poor technical choices, one of which I've described above.

    And do third-party components survive such updates? Do simpler products than cars, which allow end-user maintenance, use custom parts that change with each model? Some do, but I suspect they are not well-regarded as a result (think ink cartridges - and think how the public views the ink-cartridge manufacturers). Daz Studio needs to support, and maintain compatibility with as far as it can, a very wide variety of content of various types - cars just need to be sticker-friendly.

    You are (perhaps) inadvertently further proving my point. Having "to support, and maintain compatibility with as far as it can, a very wide variety of content of various types" is pretty much how every book on Component Architecture begins, verbatim, by stating the problem that Component Architecture is meant to solve. If that was DAZ's aim, then they chose the absolute worst way to try to implement it that I can think of, the technique that would make that absolutely impossible to achieve: making public the header files that describe C++ code's static interfaces.

    I have just seen that support for QT 5.15 will expire on 25th May 2025. Can we derive anything from this for DS5? Perhaps a release is closer than we think.

    We know, as I recall, that the next major version of DS will use Qt 6 - DS 4.x.x.x uses Qt 4 (4.8.7, as I recall - run print( App.aboutQt() ); in the ScriptIDE or from a file to get a pop-up). So no, I don't think we can really infer anything from that. I am endeavouring not to have any expectations about release dates or forms (though I will be surprised if there isn't a Public Beta period).

    Post edited by Richard Haseltine on
  • kyoto kidkyoto kid Posts: 41,857

    ...one of those here pushing "legacy" hardware beyond its limits. Still working on a system with a 6-core Bloomfield Xeon, 24 GB of DDR2 memory, and a Maxwell Titan X GPU. Yeah, an "ancient" machine by today's standards. The newest component in it is the PSU, which I installed 3 (or 4) years ago after the old one died. I have an EVGA 12 GB 3060 which I picked up 4 years ago (with stimulus money) which is still in the box, as the BIOS of my old X58 motherboard is too ancient to recognise it.

    This is not so much by choice as it is dictated by economics. I have a configuration (AM5-based) set up for upgrading to Win 11 standards, but the cost would pretty much take up most of my monthly pension, so something like another "stimulus cheque" is needed.

    In the meantime the old hardware keeps faithfully chugging along. Yeah, there are some things I don't bother with, like animation, or use sparingly, like dForce (as it sometimes crashes the display driver, requiring a full reboot to reset), but I get along for the time being.

  • Richard HaseltineRichard Haseltine Posts: 108,072
    edited March 31

    background said:

    I would think, as time goes on, users will divide into two (or more) camps: the "I've got a 50xx series card and I absolutely need a version of Studio that lets me render with IRAY on it, even at the expense of old plugins not working, and I want to buy new content for it" and the second camp of "I've got a lot of old plugins that are critical to my workflow and I absolutely need a version of Studio that lets me render with IRAY on my pre-50xx series GPU, and I want to buy new content for it". I don't see a way to make both these groups of users happy without Studio splitting into two parallel lines of development.

    If DAZ offered two versions of Studio (pre-50xx, and 50xx-plus), then each user could make a decision as to which one best meets their needs.

    I'm sure there would also be a camp with "I've got a 50xx series card and my old plugins are critical to my workflow", but it doesn't seem likely that these users can be accommodated.

    DS 4 will continue to be available, and we can expect the next major version to read its (.duf/.dsf) files, so people should be able to use the older version where they need it and transfer to the newer. The reverse may also be possible, though of course some new features may not be supported and there is no knowing how gracefully the older version will cope with that [for example, see change log entries on URIs http://docs.daz3d.com/doku.php/public/software/dazstudio/4/change_log_4_20_0_17#4_20_0_2 http://docs.daz3d.com/doku.php/public/software/dazstudio/4/change_log_4_20_0_17#4_16_1_22 ].

    Edited to clarify a couple of points

    Post edited by Richard Haseltine on
  • f.rdni666f.rdni666 Posts: 0

    Did you already try using Wine on Linux and installing nvidia-libs?

    I want to try it, but unfortunately I don't have AMD cards.

    I can run Daz on Linux using Wine and those nvidia-libs, but with NVIDIA cards.
