Show Us Your Iray Renders. Part IV
Comments
That is too much, now I will not sleep tonight anymore. :p I would get a heart attack before I could put my arms up after seeing this visitor.
What a fun image. The lighting is perfect for the scene and getting great reflections. Love it.
Thank you, Cath.
And yes, when it comes to books, people really do judge the book by the cover. If you want them to open it up, the cover has to say, "I'm the kind of book you're looking for." A hard lesson for Indie Authors to learn. But I'm getting there. lol
...yeah, I have resorted to photo backdrops (which I usually put on a plane primitive), as most HDRIs don't produce clean enough shadows (or sometimes no cast shadows at all), and I have to do a lot of searching to find ones that have a decent sun elevation when I'm using scenery sets, since skydomes don't work with the Iray Sun.
...this is really nice. Very "Bladerunner" like.
... wow, incredible effect. Had to pull this up on the workstation, as the notebook is getting old and it looked almost black. I could see this on a shelf in a bookstore.
...again, had to view this on a system with better calibration. Very scary. Nicely done.
Thank you! Sounds like I may have targeted the right market with this.
Sorry about your notebook. I hope you get to upgrade soon. :)
...well, I use the notebook for all my online activity. Poor thing's almost 9 years old but still hanging in there. I don't take it anywhere with me anymore, as the converter brick has developed a bad connection between it and the computer. The battery maybe holds a charge for about 5 minutes, so I need to keep it constantly plugged in. It is so "legacy" that it's hard to find replacement components for it anymore.
The only time I go online with the workstation is to use the DIM, DL and install system updates, DL freebies or purchases from other sites, upload finished images, and, like tonight, view an image to see it in more detail.
...this is really nice. Very "Bladerunner" like.
Thank you, Kyoto Kid.
Having a break from filming...
Last render on pure CPU only; I get my 4GB GTX 760 today. :)
That is cool, Mark. Today I am expecting my 2 x Titan X 12 GB super-clocked cards and an additional 16 GB of RAM... very exciting!!! I'm giving my GTX 760 to a friend who is in need of better power.
I don't know about you, but if I could just stop buying products for DAZ Studio, I could afford a new, more powerful, computer. Maybe an Nvidia card to go with it. But then, without DS, what would I need it for? lol
...with 6144 cores, you should be able to work in Iray view mode without much lag at all. Excellent when using HDRIs.
Hit all 6 numbers drawn in the Megabucks Lotto last night. However, they weren't all on the same line (three were on one, and three were on the other), so I only won $8 instead of $5.5 million.
In the words of Maxwell Smart, "I missed it by That Much".
..since losing my steady job two years ago, haven't been purchasing much content at all.
I was fortunate to build my 64-bit workstation while I still had an income (just before being laid off), but it is now pretty much outdated. I cannot even upgrade the memory to 24 GB anymore, as the kit I need is no longer available (DDR3 1333 MHz). My MB only takes 1333, 1866 (overclocked) and 2000 (overclocked). All I can find now are 1600 MHz 24 GB kits, which are not compatible with a PT6 X58 MB.
Unless I can get a beefier GPU, I'm stuck with CPU renders going into swap mode a good portion of the time because they exceed the available memory, which really bogs things down.
A quick Google search turned up:
http://www.amazon.co.uk/gp/product/B00864RAZK?ie=UTF8&tag=pric097-21&linkCode=xm2&camp=1634&creativeASIN=B00864RAZK
http://www.transintl.com/24gb-kit-3x8gb-ddr3-1333mhz-ecc-pc10600-dimm.html
...board won't support 8 GB DIMMs. I need 6 x 4 GB.
If it did, then the maximum memory would be rated at 48 GB and not 24.
Also, I live in the States, and the second one is for Mac; I have a PC.
Thank you. For those who are curious about the technical aspect, this is the lighting I used (there's a rough sketch of the rig after the list)...
1. Bulbs in the lamps (2) set to emit.
2. The lamp shades (2) set to emit. They are set much lower than the bulbs, with the original lampshade texture applied. This gave a better effect than trying to get the bulb light to pass through the shades.
3. Spotlights for the zombie (3). One is directly above him to provide shadows. One is inside the room, just behind the couch, to shine up at him. The last is close to his face to give better mood lighting.
4. Spotlight for figure (1). Just off camera, with a wide arc, but low intensity, to temper some of the shadows behind the screaming figure.
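If it helps to see the whole rig laid out in one place, here is a rough sketch of it as plain Python data. The layout mirrors the list above, but the intensity labels are just placeholders to show the relationships (shades much dimmer than the bulbs, fill spot wide but low), not the actual values from the render:

    # Rough sketch of the lighting rig described above.
    # The "emission"/"intensity" labels are placeholders, not the real settings.
    light_rig = {
        "emissives": [
            {"object": "lamp bulb 1", "emission": "high"},
            {"object": "lamp bulb 2", "emission": "high"},
            # The shades emit too, but much dimmer, with the original shade
            # texture kept; this worked better than trying to push the bulb
            # light through the shade geometry.
            {"object": "lamp shade 1", "emission": "low"},
            {"object": "lamp shade 2", "emission": "low"},
        ],
        "spotlights": [
            {"target": "zombie", "position": "directly above", "purpose": "cast shadows"},
            {"target": "zombie", "position": "behind the couch, aimed up", "purpose": "underlight"},
            {"target": "zombie", "position": "close to the face", "purpose": "mood light"},
            {"target": "figure", "position": "just off camera", "spread": "wide",
             "intensity": "low", "purpose": "soften the shadows behind the screaming figure"},
        ],
    }

    # Quick checklist printout of the rig.
    for e in light_rig["emissives"]:
        print("emit:", e["object"], "(" + e["emission"] + ")")
    for s in light_rig["spotlights"]:
        print("spot:", s["target"], "-", s["position"], "(" + s["purpose"] + ")")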
Thanks. I have that problem as well, as my images tend towards the dark end. I usually flip between two monitors to get an idea of the color range and brightness.
Very nice.
I'd like to hear about your experiences with that card vs. mine. It seems to me that the 4 GB fills up ridiculously fast sometimes... what brand are you getting? I think mine is a Gigabyte card.
Update to my webcomic, in glorious Irayvision: http://thefarshoals.webcomic.ws/comics/26
I don't get it anymore...
All the hype about supposedly faster renders if you have an NVIDIA graphics card... and I keep seeing renders done here with GTX cards that took hours to render!..
So that means that even in the unlikely event that I were to invest in an expensive high-end GTX card expecting to get faster renders... I will be disappointed!...
I think I need a good explanation about these GTX cards... are the renders really faster than with the CPU... or is this just a myth?... Because from what I can see so far... all the renders that were done with these cards have taken hours to complete... not minutes!... :roll:
Unbiased renderers (like Iray, Octane) always take longer to compute a scene because there are no cheats like a biased renderer has (3Delight). The same unbiased render done on a CPU will take longer than one on a GPU with many CUDA cores. Think of it this way... the many CUDA cores are similar to having a render farm on a card. The renders you mention taking hours are ones with more complex computations required to get the desired lighting results... usually more complex lighting. HDRI-lit scenes render much more quickly. And there are more speed tricks users have been discovering.
Only if you bought a very old card and got totally shafted on the price. You seem to have missed the large number of posts about massively faster renders with newer cards. My renders went from 10 minutes to 10 seconds for small simple scenes. Others went from 8 hours to 15 minutes.
I will point out that there are a number of simple portrait shots or otherwise uncluttered scenes that take me no more than 10, 15 minutes in Iray. It reeeeeally depends.
And if you ever experimented with a lot of Uberenvironment and Meshlights in 3Delight, you were seeing rather long render times, too.
The key is the GB of video RAM. If your card has 4+ GB of video RAM, chances are that the scene will fit entirely on the video card. If not, no GPU render, no matter how expensive the card is.
That said, if I have a sufficiently small scene and render it on my 2 GB GTX card, there's a really notable time difference between the CPU render and the GPU render. Alas, with 2 GB and the card serving as the monitor card, too, the moment I add a Genesis into my small scene, the 2 GB will not hold the data, and it's back to CPU.
So the key point is more VRAM, which, unfortunately for us mere plebeians, only the very high-end cards offer.
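As a back-of-the-envelope way to see why VRAM is the bottleneck, here is a rough Python estimate. The byte counts, triangle cost and driver headroom are assumptions for illustration only; Iray's real footprint depends on texture compression, instancing and renderer overhead:

    # Back-of-the-envelope VRAM estimate for a scene.
    # All sizes below are rough assumptions for illustration; the real
    # footprint depends on texture compression, instancing and overhead.

    def texture_bytes(width, height, channels=4, bytes_per_channel=1):
        """Uncompressed size of one texture map in bytes."""
        return width * height * channels * bytes_per_channel

    def scene_vram_estimate(num_triangles, texture_sizes):
        """Very rough scene footprint: geometry plus texture maps."""
        geometry = num_triangles * 100                  # ~100 bytes per triangle (assumed)
        maps = sum(texture_bytes(w, h) for (w, h) in texture_sizes)
        return geometry + maps

    # Example: a single figure carrying twenty 4096x4096 maps (diffuse, bump, etc.).
    estimate = scene_vram_estimate(
        num_triangles=130_000,
        texture_sizes=[(4096, 4096)] * 20,
    )

    card_vram = 2 * 1024**3         # a 2 GB card
    headroom = int(0.75 * 1024**3)  # assume the display/driver eats some of it
    fits = estimate < (card_vram - headroom)

    print("Estimated scene size: %.0f MB" % (estimate / 1024**2))
    print("Fits on the card" if fits else "Too big -> Iray falls back to CPU")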
You mean, like this?
http://www.amazon.com/Corsair-4x4GB-Desktop-Memory-CMX16GX3M4A1333C9/dp/B0054KPK9C
Basically, if those renders actually are using the card (i.e. the scene fits in the card's memory), then without the GPU acceleration they would be taking 20 to 40 or so hours to complete... about the same as a non-accelerated Luxrender render of the same scene (Reality/Luxus export) or the equivalently lit (full GI) 3Delight render.
And yes, a 3 to 5x speed up is typical, with up to 10x or more for some scenes.
Not just some scenes.
GPU render speeds are hardware dependent and they definitely scale, presuming your scene doesn't render so fast that the hardware you throw at it makes no difference. Please be aware that the numbers presented below are estimates, and depend on many factors, like the actual motherboard in use, bandwidth available, RAM, speed of your CPU, etc. YMMV.
If you have a GT 740 with 4 GB of RAM and your scene fits on the card, then your render speed is likely to be 2-3x your CPU-only speed, depending on your CPU (note this is a low-end card, costing about $100). Now the same scene with 2x GT 740 cards will be 4-6x as fast as CPU only. With 4x GT 740 cards (presuming your motherboard and power supply can handle four cards), you are likely looking at 8-12x the CPU-only speed.
The more hardware you throw at it, the faster it gets. Swap out the GT 740(s) for GTX 980(s), 980 Ti(s) or Titan X(s) and watch things really start to fly. (Note each of those cards has more power than the four GT 740 cards combined.)
Want to get really silly, or scary (depending on your point of view)? Throw one or more NVIDIA VCAs at a render. (Power roughly equal to 8x Titan X per VCA.)
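To put those multipliers in one place, here is a quick Python sketch that turns a CPU-only render time into rough estimates for the setups mentioned above. The speed-up ranges are the ballpark figures quoted in this thread, not benchmark results:

    # Rough speed-up arithmetic using the ballpark multipliers quoted above.
    # These are thread estimates, not benchmarks; actual results depend on the
    # scene, the CPU, the motherboard, bandwidth, drivers, etc.

    SPEEDUPS = {
        "1x GT 740":          (2, 3),    # low-end card, ~2-3x CPU-only
        "2x GT 740":          (4, 6),
        "4x GT 740":          (8, 12),
        "typical single GPU": (3, 5),    # the "3 to 5x" range mentioned earlier
    }

    def gpu_time_range(cpu_hours, speedup_range):
        """Convert a CPU-only render time into a best/worst GPU estimate."""
        low, high = speedup_range
        return cpu_hours / high, cpu_hours / low    # bigger speed-up -> shorter time

    cpu_only_hours = 30.0   # middle of the 20-40 hour CPU-only range quoted above

    for setup, rng in SPEEDUPS.items():
        best, worst = gpu_time_range(cpu_only_hours, rng)
        print("%-20s %.1f - %.1f hours" % (setup + ":", best, worst))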