ZDG random Daz Studio discoveries and questions.


Comments

  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    As for the AltShaders, I'm working on them, and considering some other angles as well.

    I was initially thinking of packing up a G3F set and a V7 set; however, I may just put those together into the same zip. It has to do with a clause in the ShareCG TOS regarding uploads of "similar items", if I understand that jargon properly. I am allowed to upload a bar stool that I make; however, if I make a different style of bar stool, that second one may be considered "a similar item" and violate the TOS. My worry is that, because I have already uploaded a WIP Alt shader for G3F, a second upload with G3F and V7 presets may put me at odds with ShareCG. I'm also very tempted to finish my Yulia Alt shader set and upload that with my Paloma presets, however I'm sure that would be pushing my luck with that "similar item" clause. I could just toss it all into a single pack, however that would be a lot of excess stuff for people who don't have all the figures in the extended pack. So it's something to consider: how best to proceed.

    By the way, I'm not sure just how much blue LY intended Fallon to have in her skin tone. I know some PAs like to toss in light setups for their figures; I guess I should look to see if LY has such a set for Fallon.

    There is another shader that was sent to me that I've yet to look at; it gives Paloma a slightly darker skin tone. I want to see how that shader looks on other figures with other maps, tho I have mixed feelings about sharing that shader preset.

    LyFallon_WIP02_001.png
    1200 x 640 - 592K
    Post edited by ZarconDeeGrissom on
  • Fisty Posts: 3,416

    There was also, possibly, a misunderstanding where some figures supposedly had Sub-D features and were not getting the 'HD' logo thing. So that left some doubt as to what was considered HD or not. (I can't locate the thread now; it was somewhere back in G2F days.) I was looking because there was, at the time, a nifty new "HD" logo on some figures, and I wanted to know what that was about.

    Ah crud..  forgot to add those to my latest promos..  thanks for inadvertently reminding me.

  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    Glad to have inadvertently helped, lol.   I took off yesterday to head to 'HD' to get a plug-in watt meter thing, just to triple-check some numbers, and my guesstimates were not all that far off on some stuff.

    There are absolutely asterisks and "mitigating" factors for that one chart I made up really quickly. I had other stuff going on, and was not going to shut down everything just to do Iray testing for a few days; I have work to do.  The watts listed are the TDP, not the measured watts (that I've yet to do). The GT730 is driving my monitors, so it has a handicap before even getting started. And that more impressive GT730+GT740 time still left my desktop unusable for doing other stuff while I wait for Iray to finish, which is the reason for looking into a second card for Iray. At least for the sub-one-hundred-watt cards that most can afford, running Iray on the card driving your monitors renders your computer useless for any kind of multitasking at all. I can't work like that, and as good as the Odroid is, there is a lot of stuff I have yet to figure out how to do on the thing, lol.

    So, the debate I've been pondering is this: is it worth spending forty more dollars (USD) for a card that uses almost three times the watts, for only a ten-minute improvement (about 30% faster than the GT730) in Iray render times that are still excruciating for setting up scenes and stuff? Especially given it only has 4GB of GDDR5 to work with for Iray scenes.

    Yea. About three times the watts of the GT730 for only 30% more performance. That's just pathetic.
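    As a rough sanity check, that complaint can be put in performance-per-watt terms. The 1.3x speed figure is from my own informal runs above; the TDP values are assumptions (roughly 25 W for the GK208 GT730 and 64 W for the GT740, going by the published specs rather than measured wall draw):

```python
# Back-of-the-envelope perf-per-watt from the figures in this post.
# TDPs are assumptions (GK208 GT730 ~25 W, GT740 ~64 W), not measured
# wall-socket draw, and the 1.3x speed figure is from informal runs.
def perf_per_watt(relative_speed, tdp_watts):
    return relative_speed / tdp_watts

gt730 = perf_per_watt(1.0, 25)   # baseline card
gt740 = perf_per_watt(1.3, 64)   # ~30% faster, ~2.6x the TDP

print(f"GT730: {gt730:.4f} relative speed per watt")
print(f"GT740: {gt740:.4f} relative speed per watt")
print(f"GT740 is {gt740 / gt730:.2f}x the efficiency of the GT730")
```

    By that measure the GT740 lands at roughly half the efficiency of the GT730, which is the gist of the complaint above.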

    I think I'm with Luke over at LinusTechTips on this "cheap" card. It lacks the horsepower for gaming or Iray, and it blows your watt budget out of the water for things like CPU crunch boxes. The GT740 would cut the UPS uptime of my WL2K box in half, if it did not instantly trip the inverter. I'm just not sold on the GT740; get the GT730, or save your money if you need more horsepower.

    I only got the card because nobody had posted any Iray results for the more affordable cards, and I know I'm not the only one looking. That, and it is something to get Iray off my display card for testing stuff. Next month, I'll get a bigger PSU and possibly a 120-watt 4GB GTX 960 to try out. After I get the time to run the test a few more times to be sure, I'll post the GT740 results over in SickleYield's Iray Benchmark thread.

    http://www.daz3d.com/forums/discussion/53771/iray-starter-scene-post-your-benchmarks/p10

    I can't guarantee that will happen today; it will happen, tho. I need to get this waste of power out of my computer; it's making too much heat for the summer temps here.  I have confirmed at this point that, with Daz Studio open and not doing anything, the GT740 kicks into full-throttle clocks and has increased the power draw of my workstation by about 40 watts. That's 40 watts just sitting there doing nothing, not even driving any monitors; it has got to go.

    If I could somehow duct-tape four GT730 cards together, it would get me a far better sub-one-hundred-watt contraption than the GT740.  There is one other card out there in the sub-one-hundred-watt range with Iray-capable CUDA cores; however, the GTX 950 fails to meet the minimum 4GB of VRAM (or more) requirement. So I guess that is about it, then. If there is just no way you can afford a 200+ USD card, or just no way to get more than a hundred watts to a new card, Iray just is not for you. There is nothing else on the market that I can find at this time.

    20160702_IrayTest03_001crop1.png
    2860 x 1200 - 734K
    20160702_IrayTest02_001crop1.png
    2860 x 1200 - 498K
    IMG_7719hs1crop1.jpg
    800 x 160 - 107K
    IMG_7719hs1crop2w1024.jpg
    1024 x 660 - 523K
    GT730vsGT740_PrelimNumbers2_001.png
    560 x 280 - 6K
    Post edited by ZarconDeeGrissom on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    What a piece of work. I was looking at some other stuff while trying out the GT740, and I do have two good things to say about it, tho it still belongs on the Wall of Shame as far as I'm concerned.

    The second card doing Iray at least lets me use my computer for other stuff while it's rendering. So that's one good thing, tho I'm sure any other card would be just as good for that role of getting Iray off the CPU and display card.

    The GPU is only getting up to about 50°C with the room temp around 75°F (about 23.9°C), so the build quality of the cooler appears to be sufficient. I have doubts about the openings at the front of the card, where the fan exhaust will be fighting the intake fans on the front of the computer, tho I guess that is neither here nor there. It's kind of a shame, because EVGA did an incredible job on the card: the exhaust vent on the back slot is nice for rear-exhaust computers, the component layout looks really good, and the PCIe auxiliary power connector is very much appreciated for use on "more affordable" motherboards. The chip under it all, well, the GK107 is just not worth it. It's a waste of money, a waste of watts, and a waste of the silicon it's made from (in my honest opinion). nVidia could have done a hell of a lot better with that sixty watts, and should have. You're better off getting two or three GT730 (GK208) cards for Iray than a single GT740; I am not impressed.

    I really hope the GTX 1060 has 8GB on it, or that the GTX 1070 can be underclocked to under 100 watts.

    20160704_StrappyTest_001_Render 1.jpg
    1200 x 1200 - 418K
    IMG_7735crop1w1800.jpg
    1800 x 1080 - 955K
    Post edited by ZarconDeeGrissom on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    Alright, I'm just trying to think of a not-so-cynical way to put what I typed above. The GT740 and Iray have taken three days of my life I cannot get back, when I had other stuff I needed to do besides waiting on my computer to do renders. Yea, it's bad.

    This render took around 24 minutes and 9.95 seconds. Let's not squabble over the render settings; it's just a baseline for comparing render times and nothing else.

    Add a simple lace skirt, and the time shoots up to around 2 hours and 15.99 seconds (maximum render time) at 1650 iterations. Absolutely nothing was changed other than adding the skirt. That's honestly not much better at all than the 4GB GT730 (GK208), which uses a third of the power and is fanless (2 hours 15.4 seconds, maximum render time, at 1017 iterations). I honestly want to know what usage nVidia had in mind when they conjured up the GT740. It is completely gutless when it comes to performance, and it wastes too much power to be useful in any application that does not need a lot of GPU horsepower.
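    Since both lace-skirt runs were stopped by the two-hour maximum-render-time cap at nearly the same wall-clock, the comparable number is iterations per second; a quick sketch using the times and iteration counts quoted above:

```python
# Convert the capped runs above into iteration throughput.
# Times and iteration counts are the ones quoted in this post.
def iters_per_sec(hours, seconds, iterations):
    return iterations / (hours * 3600 + seconds)

gt740 = iters_per_sec(2, 15.99, 1650)  # GT740, lace-skirt scene
gt730 = iters_per_sec(2, 15.4, 1017)   # GT730 (GK208), same scene

print(f"GT740: {gt740:.3f} it/s")
print(f"GT730: {gt730:.3f} it/s")
print(f"GT740 / GT730 throughput: {gt740 / gt730:.2f}x")
```

    By this particular pair of runs the GT740 pushed about 1.6x the iterations in the same capped time; the wall-clock numbers alone hide that, since both runs were ended by the timer rather than by convergence. Whether the extra iterations were visibly worth it is a separate question.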

    Now let me explain why I'm not comfortable exceeding that 100-watt barrier. When I modded the case that is my workstation, I looked at all the ATX specs I could get my hands on. There are a few chapters that cover minimum airflow over different zones for cooling the innards of the computer. The old ATX standard assumes a single 80mm intake fan at the front of the computer for cooling, and as you can see above, that is not good at all for high-watt components. It doesn't matter how good the heat sink is; if the case can't move the air fast enough, the components simply will not get the fresh air they need to stay cool.

    At the time (back in 2004/2005), there was nothing I could get for PCIe, so I only had the "AGP Pro" and SSI HEB specs to work from. That, and I do recording and mixing with this workstation, so noise is a major concern for me. While the case can technically move enough air to do the job for watt-hog cards, it would be no better than server racks that require hearing protection to be around; so a compromise does need to be made.

    With a single 120mm fan cooling the CPU area and a second 120mm fan cooling the cards, the noise is not that bad, so long as the cards do not exceed 55 watts per slot.  It comes down to how much air you can cram through the 0.8-inch-wide (20.32 mm) PCI card cross-section without producing a jet-engine sound.  Putting a 120-watt card in my computer would make it impossible for me to work with audio, because of the airflow needed to cool it under load; it's that simple.
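    The airflow side of that can be sketched from first principles: the heat an airflow can remove is mass flow times the specific heat of air times the temperature rise, which collapses to the common rule of thumb CFM ≈ 3.16 × watts / ΔT(°F) for sea-level air. The 20 °F exhaust rise below is an assumed design target, not a figure from any spec:

```python
# Rule-of-thumb airflow needed to remove a given heat load:
#   CFM ~= 3.16 * watts / delta_T_F   (sea-level air)
# Derived from P = mass_flow * cp * dT; the 20 F rise is an assumption.
def cfm_required(watts, delta_t_f=20.0):
    return 3.16 * watts / delta_t_f

for watts in (55, 120):
    print(f"{watts:3d} W card: ~{cfm_required(watts):.1f} CFM at a 20 F rise")
```

    Roughly 9 CFM versus 19 CFM; forcing the larger figure through a 0.8-inch slot is where the jet-engine sound comes from.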

    20160704_PalomaIrayTest_01001_Render 5.jpg
    1200 x 1800 - 695K
    20160704_PalomaIrayTest_01001laceSkirt_Render 6.jpg
    1200 x 1800 - 898K
    ATX_TDPzones_001.png
    600 x 600 - 66K
    ATX12v_AgpPro_TDP_Spec_001.png
    540 x 480 - 55K
    ATX12v_TDPzones_007lbl2.png
    600 x 600 - 68K
    20160704_PalomaIrayTest_01001gt730_LaceSkirt1_Render 1.jpg
    1200 x 1800 - 906K
    Post edited by ZarconDeeGrissom on
  • Mustakettu85 Posts: 2,933

    I wonder what the difference is between the "full" GT740 and my GT740M. The mobile version like mine is probably even less powerful. I'm playing with Iray now, trying to replicate my 3Delight skin mats, and I switched to CPU. 

    But it runs my pre-2010 games really well.

  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    A better sampling of glass from the same wafer as the other GT740, one that can run at a lower core voltage. Lower core voltage at the same clock gives you lower watts. Possibly other gimmicks too, tho those would depend on the vendor rather than the glass (things like lower clocks, low-voltage RAM chips, etc.).  As for games, my 43-watt NX8600GT (G84 chip from 2007/2008) had around twice the memory bandwidth of the GT730, tho everything else was the other way around (the GT730 has more cores, a faster clock, etc. than the NX8600GT midrange gaming card). So I'm not surprised that the GT740 and GT730 do well with older game titles like UT04, DeltaForce, Starcraft, WoW, etc.

    So from the looks of it, the GT740 is the same thing as the GT730, possibly from an older generation than the GT730. I had read somewhere that the GT740 was a "rebadge of the GT650" or something, tho I've yet to figure out how to confirm or disprove that (it doesn't matter). In a way, it is kind of ironic that cards labeled as not up to modern games today have twice or more the 3D capability of the mid-level gaming cards of only a few years ago.

    I was not saying that the GT740 is incapable; it is 30% faster than the GT730. I was only saying that it is rather inefficient, if not gluttonous, compared to newer chips. The only major performance difference between the GT730 and the GT740 is around 100MHz faster CUDA cores on the GT740. I'll pull up a metrics chart, tho I'm sure that is the bulk of it.  Three times the power consumption for only a 100MHz faster clock just doesn't cut it for me; that's a blatant waste of power. Not even the P4EE was that bad on performance gain per watt over its siblings, lol.
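    For what it's worth, dynamic power in CMOS scales roughly as P ∝ C·V²·f, which is why a clock bump alone is cheap but a clock bump that needs extra core voltage is not. The numbers below are illustrative assumptions, not measured GT730/GT740 figures:

```python
# Dynamic power scales roughly as P ~ C * V^2 * f.  Relative power for
# a frequency ratio and voltage ratio (capacitance C cancels out).
# The 900 -> 1000 MHz and +10% voltage numbers are illustrative only.
def relative_power(freq_ratio, volt_ratio=1.0):
    return freq_ratio * volt_ratio ** 2

print(f"+100 MHz at the same voltage: {relative_power(1000 / 900):.2f}x")
print(f"+100 MHz with 10% more volts: {relative_power(1000 / 900, 1.10):.2f}x")
```

    Either way, the V²·f math tops out well short of a 3x power difference, so most of the GT740's extra draw has to come from the chip itself rather than the clock bump.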

    Okeanos Explorer is launching the rovers... LIVE (from around the Marianas Trench).

    (EDIT, no longer live.)

    Front-row seats, lol. (EDIT: dive over, they're coming up now.)

    Chat later, y'all.

    Factoid: fewer people have been to the bottom of the Marianas Trench than have gone to the Moon.

    8600GT_Vs_Gt730_005topSelects_102_crop2.png
    1330 x 730 - 464K
    20160707_MorningDave2_001crop1.png
    800 x 480 - 321K
    20160707_MorningDave4_001crop1.png
    850 x 480 - 294K
    20160707_MorningDave5_001crop1.png
    850 x 480 - 331K
    20160707a1_MorningDave08_001crop1.png
    850 x 480 - 339K
    20160707a1_MorningDave09_001crop1.png
    850 x 480 - 353K
    20160707a1_MorningDave11_001crop1.png
    850 x 480 - 351K
    20160707a1_MorningDave12_001crop1.png
    850 x 480 - 395K
    Post edited by ZarconDeeGrissom on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    At about 5923 meters, at the Hadal wall. Looks good.

    You are here, lol.

    20160707b1_Cam1_01_001crop1.png
    850 x 480 - 301K
    20160707b1_Cam1_02_001crop1.png
    850 x 480 - 217K
    20160707b1_Cam2_01_001crop1.png
    850 x 480 - 85K
    20160707b1_Cam2_02_001crop1.png
    850 x 480 - 130K
    20160707b1_Cam1_03_001crop1.png
    850 x 480 - 426K
    20160707b1_Cam1feed1_01_001crop1.png
    850 x 480 - 451K
    20160707b1_Cam1_104_001crop1.png
    860 x 480 - 593K
    20160707b1_Cam1_107_001crop1.png
    860 x 480 - 781K
    20160707b1_Cam1_115_001crop1.png
    860 x 481 - 535K
    20160707b1_Cam1_118_001crop1.png
    860 x 480 - 459K
    20160707b1_Cam1_121_001crop1.png
    860 x 480 - 125K
    20160707b1_Cam1_122_001crop1.png
    860 x 480 - 707K
    20160707b1_Cam1_123_001crop1.png
    860 x 480 - 622K
    20160707b1_Cam1_127_001crop1.png
    860 x 480 - 94K
    20160707b1_Cam1_105_001crop2.png
    860 x 480 - 666K
    20160707b1_Cam1_150_001crop1.png
    860 x 480 - 84K
    20160707b1_Cam1_140_001crop1.png
    860 x 480 - 623K
    20160707b1_Cam1_151_001crop1.png
    860 x 480 - 147K
    Post edited by ZarconDeeGrissom on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    Well, I was a little disappointed when it was mentioned that the max depth of the rovers was 6,000 meters; I would have liked to see the subduction-zone boundary a bit further down. It was still very good, with lots of really good video from the teams.

    20160707b1_Cam1_200_001crop1.png
    860 x 480 - 138K
    20160707b1_Cam1_201_001crop1.png
    860 x 480 - 254K
    20160707b1_Cam1_204_001crop1.png
    860 x 480 - 428K
    20160707b1_Cam1_184_001crop1.png
    860 x 480 - 401K
    20160707b1_Cam1_196_001crop1.png
    860 x 480 - 430K
    20160707b1_Cam1_205_001crop1.png
    860 x 480 - 545K
    20160707b1_Cam1_208_001crop1.png
    860 x 480 - 772K
    20160707b1_Cam3_220_001crop1.png
    854 x 480 - 529K
    20160707b1_Cam3_224_001crop1.png
    854 x 480 - 482K
    Post edited by ZarconDeeGrissom on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    I was up rather late, so I missed the dive down to the site today. I have no idea how deep they are or where they are today; I never got the memo, lol. I'm still working on my first cup of coffee and on waking up.

    http://oceanexplorer.noaa.gov/okeanos/explorations/ex1605/welcome.html

    Perhaps that is why so many are so eager to repeat the atrocities of the past; like the decaying relics of a former age, the memories of bygone generations fade into the mists of time. (Random thought as I was looking at the above image.)

    Well, that drill they had made me jump; old USN habits, lol.

    Yes, they do drills at any time when they're at sea, lol.

    (EDIT) OK, around 370 meters down, and looking at some B-29 sites (possibly a single one) around Saipan from WW2, thanks.

    I say possibly, as some stuff was missing/buried, and I'm not sure if it is or is not all from a single B-29 (I don't know). It appeared "the four corners" were around the dive location: there was a fairly intact main wing (missing one of four engines), part of the front fuselage tunnel (I'm not sure how much of the fuselage or nose cone was buried in the sediment), and a large section of the tail. It looked like bits of where the rear gunner would sit in the tail of the plane were in pieces between the site of the tail and the site of the tunnel, along with many other very small pieces that may or may not have been part of the plane. Overall a good dive, from the looks of the vid replay.

    20160708a1_Cam1_012_001crop1.png
    860 x 480 - 312K
    20160708a1_Cam1_017_001crop1.png
    860 x 480 - 706K
    20160708a1_Cam2_071_001crop1.png
    860 x 480 - 189K
    20160708a1_Cam1_035_001crop1.png
    860 x 480 - 387K
    Boeingb29superfortresss_LG_W700lbl2002.png
    700 x 455 - 332K
    Post edited by ZarconDeeGrissom on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    It would be nice if someone fixed the link for "Orders" in the store.

    It has been broken for the past few weeks, and just reloads the page that was open.  I have no idea where I left off on my inventory, because I can't pull up a list of my orders from the past two months.

    OldOrderLinks_Broken_001lbl1.png
    540 x 92 - 3K
    Post edited by ZarconDeeGrissom on
  • Kerya Posts: 10,943
    edited July 2016

    You can click on view all ...

    OrdersDAZ.jpg
    1363 x 103 - 16K
    Post edited by Kerya on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    Thanks. It took a bit to remember how to get back to that "Recent orders" screen, lol. What a PITA. I got some stuff worked out, tho there are a few items I still need to track down from months back. I ended up doing most of it from the "Product Library", and that still took a lot of time.

    As for that other expedition, I found out about it on the second-to-last day, and I really missed out on a lot of good stuff. It's all good tho. I thought they were going to start at 5920 meters down at the Hadal wall and work their way down from there; I didn't know that was just about their max depth, lol. Apparently their HD sonar can't even see down to the bottom of the trench, so my initial thought was not what the mission was actually going after, lol. Nonetheless, they got some really stunning video from around there, like the above screen-cap from their fourth dive. I would like to see a more complete vid of that dive; perhaps some day that may be possible. I know, I know, I must have patience; they haven't even made it back to port yet with all the video that I totally missed out on, lol.

    Painted Carbonate Canyon_0621_Dive4.jpg
    1920 x 1090 - 562K
    Post edited by ZarconDeeGrissom on
  • Mustakettu85 Posts: 2,933

    Thanks for these cool pics, Zarcon =) Science is going deeper and farther at the same time =) http://www.space.com/33375-juno-jupiter-probe-turns-on-science-instruments.html

    Don't know about StarCraft or WoW, but I currently run The Witcher (the first one) with a lot of extra eye candy, and I get pretty lovely FPS (40+). 

    Funny how online profilers tell me that for The Witcher 3 the weakest link of my system isn't the GPU but the CPU. Either way, it takes me literally years for a single playthrough of a decent RPG, so it's all okay. Despite having been a fan of the Witcher books for half my life, I only got into the games last year.

    Oh, and Ziggurat is a very recent release, and it works perfectly on "high" quality settings. But a non-game app, Substance Painter, takes a load of time to bake 4K maps (it uses the GPU). 

    Looks like GT740M embodies the "gamer GPU" trope =)


  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    Yea, it is an "end user" card, not a Tesla, so the drivers are optimized for DirectX rather than for CUDA crunching. I have doubts about OpenGL as well; that used to be a Pro-card driver thing for some odd reason.  BTW, if you have PhysX enabled on the card, it will eat up RAM and some resources even if you're not using it. If you do not have games that use it, or if you're like me and only want an Iray crunch card, it's best to disable it... well, sort of disable it.

    I wish I had never installed that thing, as there does not appear to be a way to make it just go away. All you can do is set it to CPU-only to free up the GPU for more important stuff, lol.

    Yea, as for affordable cards that meet the Iray minimum requirements, there isn't much at all in the affordable market. It is either a watt-hog, or it sucks *** at Iray performance, or both.  I'm also not impressed with the new cards; it would take some significant under-volting to get them into the 55-watt-per-slot TDP budget of my systems (I'm not sure it is even possible, to be honest). The power consumption of all the good-performing cards has been complete and utter insanity for the past nine years, and all the others are lobotomized and useless.

    Juno, well, I was aware of the craft from the Amateur-DSN community; however, I did not find out about the "live" stuff till after it was all over, lol.

    I missed the orbit-insertion live vid by around an hour, lol.  STEREO and SOHO have been eating up most of my time, and then data from New Horizons and Messenger. There just are not enough hours in a day, lol.

    http://www.daz3d.com/forums/discussion/58833/ot-countdown-to-pluto-7-days/p5

    Rather dramatic opening, tho sounds about right for Jupiter.

    http://www.youtube.com/watch?v=glM9NjChOds

    Watching it now.

    Alright, I'll be back and working on mats and Alt shaders after I evict this GT740 from my computer. The thing is burning through about forty watts just sitting there doing nothing. The fan noise is quite annoying, to put it lightly; the stock Odroid XU4 cooler is better than this thing, to be honest (I can at least work with audio with the Odroid going). What a piece of work; I expected better than that, nVidia.

    (EDIT 12Jul2016)

    I've given it some thought now and dug deep into the sub-one-hundred-watt lineup of Iray-capable cards with hope and enthusiasm, and at this point I have decided what I'm going to do.

    The only cards that are anywhere near performance-per-watt balanced and Iray capable are only suitable for those looking to duplicate the power densities of a Z-pinch reactor.

    Given the inability to upgrade parts of a graphics card with inadequate memory, outright replacement (at an entire government's GDP) is the only option. I'd rather run 3Delight on a Power8 through a VM than waste my money on such unattractive sub-one-hundred-watt POS cards. Is it honestly that impossible to produce an Iray-capable GPU with 8GB of VRAM that runs on less than a hundred watts? The past nine years are a clear indication of just how 'pathetic' things are.

    HPperWatt_GT7xx_003crop1.png
    700 x 380 - 48K
    PhysX_DisableSortOf_001lbl1001.png
    540 x 400 - 28K
    iw1dtu__juno 10jul16 b__ForRefOnly.jpg
    778 x 1010 - 499K
    zmachine.jpg
    960 x 305 - 91K
    Sandia_Z_Core_001.jpg
    1005 x 1500 - 876K
    Post edited by ZarconDeeGrissom on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    OK, Iray distraction over with, somewhat disappointingly, on to the Alt shader.

    Saving a preliminary is not all that difficult, the eyes and makeup will be far more tedious.

    Eyes require swapping out the maps and selecting the specific zones to save for the "Material Preset". Thankfully, the zoning on G3F is a tad easier to work with than G2F when it comes to switching maps for some stuff, and there are fewer checkboxes to click through in the mat-save selection window.

    That checkbox for the eye socket may have been a mistake I'll need to fix on G3F's Alt shaders, tho I'm not sure yet. Some zones look better if they use the backing colors of the skin, and others are better with no backing (like teeth and irises). I'll need to look at that really quickly here. It's important for things like using V7's eyes on other figures that have drastically different skin settings (like FWSA Samira HD).

    OK, before I load and re-save the G3F ones, I'll need to save all them cool icons and thumbs I made.  FYI, the icon is 91x91 pixels, and the 'tip' is 256x256 pixels.  I'm still thinking about which corner is best for the '3DL' text overlay on the icons.

    OK, that dealt with, I now have another oddity.

    These are the exact same eye Alt shader settings I've used the past few months on many figures with no problems at all. Yet for some reason, V7's eye 02 and 03 maps have that odd glowing ring around the irises. It is not there on maps 01 or 04, or on any of the G3F maps. Hmmm.

    Getting there. TBC...

    20160711a1_V7eyes_02001crop1.png
    1024 x 700 - 37K
    20160711a1_V7eyes_001crop1.png
    600 x 800 - 115K
    20160711a1_V7eyes_03001.png
    580 x 800 - 56K
    20160711a1_G3Feyes_01001.png
    680 x 259 - 83K
    20160711a1_V7eyes_04002.png
    461 x 716 - 13K
    20160711a1_V7eyes_05001crop1.png
    700 x 512 - 410K
    20160711a1_V7eyes_06001crop1.png
    512 x 400 - 156K
    Post edited by ZarconDeeGrissom on
  • Fisty Posts: 3,416

    Set the UV to Victoria 7 on the eyes for V7's mats and see if that helps; it might be >just< that much different from the default UVs.

  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    Good point.

    It was worth a try, tho I see no change at all between the Base and V7 UVs on the eyes. Other areas are far more obvious if the UV is wrong, lol.

    (edit) Yea, those two maps (the eye 02 and 03 maps) are just FUBAR, lol.

    20160711a1_V7eyes_07001crop1.png
    1024 x 800 - 383K
    20160711a1_V7eyes_08001crop1.png
    628 x 227 - 21K
    Post edited by ZarconDeeGrissom on
  • Fisty Posts: 3,416

    Darn.

  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    I did have the skin preset set to 'Base' rather than V7, so good call there. I'm still not sure whether it is worth it to re-save the eye color presets with the V7 UV or not; I don't think there is any difference in the UV of those zones at all, from the looks of it.

    I did notice some makeup mats in the V7 folder, so I'm working on those at the moment.

    I tried the makeup without the lips, and there was an ugly lipstick ring around the mouth, so that idea was quickly canned. It makes life a little easier for me to not have to make so many presets and icons for them.

    I'm probably going to need to rename that folder now.

    errr, perhaps for both G3F and V7.

    20160711a1_V7eyes_09001crop1.png
    800 x 540 - 53K
    20160711a1_V7eyes_10001crop1.png
    680 x 630 - 236K
    20160711a1_V7eyes_11001crop1.png
    500 x 540 - 192K
    20160711a1_G3F_11001crop1.png
    500 x 540 - 165K
    Post edited by ZarconDeeGrissom on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    Morning; fell asleep at the keyboard as I was contemplating some mat icons. G3F does not have maps for "there", tho V7 does, so I should just go ahead and include that for V7 anyway. And I was looking at a different project.

    It is kind of pointless for me to do 4K vids, as my monitor is only a 1920x1200 display. In the past, I was working at 1600x1200 most of the time on the older G220fb displays (two of the three of them died, and I got the B243pwl to replace one of them). I'm contemplating whether I should stick to the old 1600x1200 vid format of my older stuff or not.

    Alt shaders, yes. I've decided to do up some G3M and M7 alts to make a basic starter pack. That's today's project, as it is too hot to fire up the old audio gear racks at the moment.

    StellarDrift03008_prelim1ws1.png
    1924 x 1176 - 207K
    Post edited by ZarconDeeGrissom on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    Well, the weather was a tad warm the past few days, so not much got done here at the "Allen Pine tree moss and fern ranch", lol.

    I understand how some think that water should not be wasted on such luxuries as grass lawns; here at the "Allen Pine tree moss and fern ranch", we don't waste any water at all. The moss in the front yard is very nice and wonderfully green, and the fern field out back does best in the shade without any sprinkler systems at all. In fact, we haven't even used the lawn mower for anything other than a yard decoration next to the recycle bin out by the street, lol.

    I did manage to get a basic Alt shader set up on G3M; however, I've yet to save out any eye presets for him.

    IMG_7700hs.jpg
    1536 x 1152 - 2M
    IMG_7705hs.jpg
    1536 x 1152 - 2M
    Post edited by ZarconDeeGrissom on
  • ZarconDeeGrissom Posts: 5,412
    edited July 2016

    Just looking at something rather simple here, after seeing a few vids that had me thinking of it: what is the GPU load from Daz Studio's interface in OpenGL mode (AKA not Iray)?

    So here, I was loading some stuff into a scene on the left of the graph, and was moving around some to select a few items and apply various mat settings. The heavier blip on the right of the graphs is where I was spinning around G3F in the scene. Then I decided to set some of the numbers to 'max' for another look around.

    The massive blip in the center was me spinning around G3F in the scene, and the rest of the time I was trying out different mat settings. So apparently there isn't much load at all on the GPU unless you move around the scene in the viewport. And the 4GB GT730 (GK208) apparently has plenty of room for a few more vids going at the same time while I'm working in Studio.  I am curious whether it can be brought up to the clock of the GT740, to see if that is all it takes to get 30% better Iray times, lol. I'm just doing some digging around first to see if it has the headroom for a 100MHz higher clock without crossing the "fusion ignition" threshold, lol.
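    For anyone wanting the same numbers without screen-grabbing graphs, the driver's own nvidia-smi tool can log load, VRAM use, clocks, and power once per second from a terminal (assuming an nVidia card with a reasonably current driver; low-end cards may report power draw as "[Not Supported]"):

```shell
# Poll GPU utilization, VRAM use, SM clock, and power draw every second.
# nvidia-smi ships with the NVIDIA driver; press Ctrl+C to stop logging.
nvidia-smi --query-gpu=utilization.gpu,memory.used,clocks.sm,power.draw \
           --format=csv,noheader -l 1
```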

    BTW, it's a balmy 79.5°F in here right now, so rapid-render work is not at the top of my to-do list.

    20160714_OpenGL_SpinInScene_Load_001.png
    392 x 482 - 7K
    20160714_OpenGL_SpinInScene_Load_02001.png
    392 x 482 - 8K
    Post edited by ZarconDeeGrissom on
  • Mustakettu85 Posts: 2,933

    Those ferns are awesome =) 

    You probably do have NVIDIA Inspector, but just in case you don't... http://www.guru3d.com/files-details/nvidia-inspector-download.html

    The easiest way to overclock and back. Not that I ever used it for this purpose (laptop and all), but the second app that comes with this package, the Profile Inspector, is what I use to enable AO, better AA, and texture filtering in my older games. It works per-game, kinda like the official NVIDIA Control Panel, but exposes way more settings.

     

  • ZarconDeeGrissomZarconDeeGrissom Posts: 5,412
    edited July 2016

    Thanks for the link, Kettu. There was a tool Jay was using on the GTX 1080/1070 cards (Afterburner?) that I wasn't even sure would work on GTX 700 series cards.

    As soon as the temps in here stabilize to my comfort, I'll look into that more. It has been sweltering the past few days, so nothing at all got done on the Gymnastics set of the Alt shaders. I ended up spending most of my time dealing with a breaker that kept popping with the AC going.

    I figured out what the culprit was after having leftovers this morning. Ptolema (a 1650 watt microwave), the P4EE-powered Bunn-O-Matic VPR (coffee pot), and the AC unit were a tad much for the 20 amp circuit over that way. I can only use one of them at a time, and one of them must be unplugged if the AC is going. I say again, I simply do not have over a hundred watts to spare for an Iray crunch card on any outlet, lol.
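    For anyone curious about the math: a 20 amp breaker on a 120 volt circuit caps out at 2400 watts, and the usual electrical-code guidance is to keep continuous loads under 80% of that (about 1920 W). Here's a quick budget sketch. Only the microwave's 1650 W is stated above; the coffee pot and AC figures are placeholder guesses for illustration.

    ```python
    # Rough load-budget check for a 20 amp, 120 volt branch circuit.
    # Only the microwave's wattage is from the post; the other two
    # figures are assumed placeholders.

    BREAKER_AMPS = 20
    VOLTS = 120
    budget_w = BREAKER_AMPS * VOLTS   # 2400 W absolute ceiling
    continuous_w = budget_w * 0.8     # ~1920 W, the usual 80% rule

    loads_w = {
        "Ptolema (microwave)": 1650,  # stated in the post
        "Bunn VPR (coffee pot)": 1100,  # assumed
        "AC unit": 900,               # assumed
    }

    total = sum(loads_w.values())
    print(f"Total draw: {total} W vs {budget_w} W breaker limit")
    if total > budget_w:
        print(f"Over the breaker by {total - budget_w} W -- pop!")
    ```

    Even with generous guesses for the other two appliances, all three together blow well past the breaker, which matches the one-at-a-time rule above.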

    P.S. I do miss Pandora, exploding water within 45 seconds, lol. Pandora was an 1800 watt Panasonic microwave (the magnetron was 1800 watts; she pulled more from the wall).

    GPUtools_001.png
    700 x 560 - 295K
    IMG_7761hs.jpg
    1536 x 1152 - 1016K
    Post edited by ZarconDeeGrissom on
  • FistyFisty Posts: 3,416

    Now that's my kinda breakfast right there!

  • Mustakettu85Mustakettu85 Posts: 2,933

    Here it's 28C, which is 82.4F according to Google. But it's raining. And there's a thunderstorm. I sometimes wonder if Moscow is a tropical city at heart, only pretending to be situated in northeastern Europe LOL

  • ZarconDeeGrissomZarconDeeGrissom Posts: 5,412
    edited July 2016

    Hello, I was not intending to ignore everyone; it's just been nonstop here. I did get some numbers, and then something terrible happened.

    I have been relegated to a 900p LCD that can't even display a consistent gray at any angle. Lightning strikes do take their toll on power supplies (yes, CRTs have a power supply of sorts in them), though the final straw was another idiot taking out a telephone pole.  I was in the middle of looking at idle power versus driver for the two cards, and then it happened. It's dead; the LED is green, and not even the menu shows up on the screen of the G220fb. I must have the worst luck, though in contrast, I did enjoy incredibly consistent contrast for eleven years with the G220fb, with phenomenal vertical real estate.

    That was literally the last screen-cap I took before the power went sporadic. I pulled out the GT730 to test just the GT740, and before the benchmark ended, the power failed spectacularly. I never knew such colors could come from the LED light bulbs in the kitchen. When the power came back, the G220fb gave up the ghost on the BIOS boot screen.

    It's a sweltering 88F (31.1C) outside; I just put the G220fb out there to get it out from under my feet. At least the AC is keeping it around 80F (26.6C) in here today. I'm only barely beginning to get a mental feel for Celsius in human ranges, instead of places like the inside of Seth Lloyd's computer. As for weather, I've seen the thunderstorms on that ISS video feed, and there are a few vids around from lower altitudes, lol.

    Just don't ask about where that one came from, as I don't want to offend anyone over there.

    All I'll say is, I agree with all the posts that say that the 007 movie crews should get 'style' lessons from the man.

    Speaking of thunderstorms, got one overhead now, chat later y'all.

    IMG_7785hs.jpg
    1536 x 1152 - 835K
    UnigineHeaven_GT740a730_clocksOff_01001.png
    3520 x 1200 - 2M
    Moscow_Summer02_001.jpg
    960 x 540 - 242K
    Moscow_Summer04_001.jpg
    960 x 540 - 281K
    Post edited by ZarconDeeGrissom on
  • Mustakettu85Mustakettu85 Posts: 2,933

    Sorry this bad luck happened to you and your CRT. I wish I had air conditioning at home... I keep the windows open (as much as grandmother will tolerate; she's the sort who's afraid of her own shadow), but the light in the evening sometimes draws huge moths in. I don't kill them, but catching them to let them out is a chore. They're scary =(

    The centigrade scale is easy =) Ice forms at 0C, water boils at 100C. I just read this to try to understand where the Fahrenheit numbers come from... it seems arbitrary even after reading.

    https://en.wikipedia.org/wiki/Daniel_Gabriel_Fahrenheit#Fahrenheit_scale
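    The conversion itself is just a linear map: F = C × 9/5 + 32, and back the other way. A quick sketch that double-checks the numbers quoted in this thread:

    ```python
    # Celsius <-> Fahrenheit is a simple linear conversion.

    def c_to_f(c: float) -> float:
        return c * 9.0 / 5.0 + 32.0

    def f_to_c(f: float) -> float:
        return (f - 32.0) * 5.0 / 9.0

    print(round(c_to_f(28), 1))   # 82.4 -- the Moscow reading above
    print(c_to_f(0))              # 32.0 -- ice forms
    print(c_to_f(100))            # 212.0 -- water boils
    print(round(f_to_c(88), 1))   # 31.1 -- the outdoor temp quoted above
    ```

    The scales cross at -40, which is about the only Fahrenheit landmark that's easy to remember.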

     

  • FistyFisty Posts: 3,416

    Just keep in mind that right around 20C is comfortable.
