Bryce Lightning for dummies.

WhiteFox Posts: 92

I have looked everywhere I can think of to find out how to use Bryce Lightning, but all roads seem to lead to a cliff.
It seems as though there is no 'complete' tute on how to use Lightning that I can find; most leave me hanging. Do I have to click render on the host, or on the slave? Will the render be saved when it is finished? And there are so many other questions I haven't found answers to. So, having searched for several hours daily and followed link after link since I purchased Bryce (about 6-7 months ago), I have decided to ask for help.
Does anyone have any ideas that aren't overly technical (as I don't have a lot of 'tech' smarts) for how I can learn to use Lightning?
Thanks in advance

Comments

  • Horo Posts: 10,069

    @charliemcd2010 - I made a Lightning video but it's on the Mentoring DVD. To put it in a nutshell:

    Start Lightning on all machines you wish to participate in the render.

    If the main machine (the host) should also participate in the render, start Lightning there as well (this applies to Bryce 7.1; earlier versions used it automatically).

    Set the render priority as you wish; the host determines it for all clients. If you set priority to High, all clients will use all processor cores (up to 8).

    If you render a still image, enable Tile Optimization; for animations, each frame is already a tile.

    Hit the render button on the host.

    Once the render is finished, the rendered image is saved on the host computer.

    Important: each client gets the whole scene. Make sure each one has enough memory to hold it.

    Lightning is quite stable. You can pull out the network cable on a client and put it back later. The lost tile will be rendered by another client and there won't be a hole in the image.

    If you have more questions, put them forward. When I find the time, I'll write up a tutorial.

  • WhiteFox Posts: 92

    I can't thank you enough! That was exactly what I was looking for! I'll post my render in the DAZ gallery as soon as it finishes. Outer space here I come! Thank you again :cheese:

  • Horo Posts: 10,069

    You're more than welcome. I'm glad you could make it work.

  • Horo Posts: 10,069

    @charliemcd2010 - Telling you to hit the render button was wrong. You have to set up an animation and run it from there; otherwise the scene is just rendered on your host machine without using Lightning.

    File > Render Animation
    Range to Render: Entire Duration
    Output Module: Format BMP Sequence
    File Location: set file name for the rendered image
    Enable Render on Network

    Configure
    Set network address
    Run a Search
    Select clients
    Update
    Use Tile Optimization
    Accept
    Accept

    Then the render starts and the progress is shown.
    You can click on Settings ... to see how each client is doing: green means it is idle but ready, orange that it is rendering, red disconnected, black unknown.

    When all is done, the rendered image is shown in the default graphics program.

  • Horo Posts: 10,069

    OK, I finally prepared a document on how to install, set up and run Bryce Lightning: http://www.horo.ch/docs/mine/pdf/Lightning.pdf. There are also some troubleshooting tips and notes on how much memory Bryce and the Lightning clients claim.

  • mindsong Posts: 1,693

    bravo sir, thanks for this brilliant and incredibly complete reference!

    cheers,
    mindsong

  • Ron Herrmann Posts: 2

    Horo said:
    OK, I finally prepared a document on how to install, set up and run Bryce Lightning: http://www.horo.ch/docs/mine/pdf/Lightning.pdf. There are also some troubleshooting tips and notes on how much memory Bryce and the Lightning clients claim.

    I could swear that I once found a thread that cites the system requirements/limitations for Lightning when run under Mac OS.
    But I can't find it anywhere, so perhaps it was on another forum (God knows I've searched many.)
    You seem to know the most about the subject, and have been quite generous in your sharing of information.
    Can you perhaps tell me if (under Mac OS) ALL machines (host & clients) must be running v.10.5.8?
    I didn't find anything on this in your recent PDF document post.
    Lightning isn't "seeing" other Macs on my LAN, and I'm not sure whether to trouble-shoot as a network issue, or a Bryce Lightning issue.
    Thank you in advance for any light(ning) you can shed on this (pun intended).

    Ron Herrmann

  • WhiteFox Posts: 92

    I don't have a Mac so I really am not sure if this info will help. I know that Lightning can run on the same OS versions I've run Bryce 7.1 on (i.e. Win 7/8/Vista/XP), so I would assume the same goes for Mac. Here's a link to the requirements (https://helpdaz.zendesk.com/entries/22701634-Software-System-Requirements-). I hope this helps.

    -Charlie-

  • WhiteFox Posts: 92
    edited March 2017

    Also you'll need to make sure all machines are 'visible' on the network, and I believe network sharing needs to be turned on.

    -Charlie-

  • Horo Posts: 10,069

    I had talked to folks on the Mac a few years back and they said the Mac uses the IP and/or MAC (media access control) address. I doubt Lightning works on a Mac OS newer than 10.6, for the simple reason that Mac Bryce still uses code for the Motorola processors, not Intel as now, and Rosetta (to translate) is not included anymore.

  • mindsong Posts: 1,693
    edited January 2017

    I hate to necro threads, but this seems relevant to those who might still use the Bryce Lightning 7 tool, or experiment with it in the future.

    I've found that I cannot use the network rendering function for animations/image sequences unless I specify "Entire Duration" in the range to render.

    Any 'Working Range' settings that I've tried (either by setting a real working range on the Bryce timeline, or manually entering times/frames) result in what appears to be a legit render session that produces:

      - grey BMPs of the expected size

      - these renders take the expected time for both simple and/or complex scenes

      - the output filenames always start at =jobname=_0000.bmp and overwrite existing frames, even if the session is configured with a working range and the indicated output 'set' filename is correct (e.g. =jobname=_0021.bmp will be shown as the starting target filename, but the resulting grey frames will actually start at =jobname=_0000.bmp, overwriting possibly legit frames with grey frames. urg).

      - if the 'Report Render Time' option is set (in the document settings), stats show all zeros for the job stats (hits/rays/etc.), and the finished network render job shows 'done' and 0% complete.

    Again, all of the render nodes behave correctly (time, speed, size, outputs) with 'Entire Duration', but generate grey BMPs or AVI movies for any frame subset (e.g. frames 10 to 20).

    If there's a known trick or workaround for this issue, this would be a great place to mention it (and perhaps update Horo's great Bryce Lightning document); otherwise, I hope this saves someone some time/grief!

    Horo, I understand there was another constraint on lightning renders (doesn't honor some TA settings?). Can you (or anyone else) remind me (us) what other bug(s) you know of?

    cheers,

    --ms

  • mindsong Posts: 1,693

    Related question:

    In the Bryce install directory, there's a 'Temp' folder containing the per-job folders that each hold the consolidated results of the Lightning network render runs before they are converted to the final outputs (BMP/AVI/QT/etc.). I'm wondering if anyone knows what format these files are in.

    None of the file identification tools I've found on the web have any hints, and there's no header info of note in the files (via hex editor, etc.), so I'm guessing it's probably a raw pixel dump. The reason I ask is that it might be viable to salvage, or actually do, sub-range Lightning renders if these temp files can be grabbed and manually converted to the target format (BMP series in my case).

    Has anyone gone down this exploration path already? If not, a 2x2 pixel image sequence test series might make it painfully obvious what's going on...
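
    Along those lines, a quick throwaway sketch (Python; the frame filename is a placeholder, and nothing about the format is assumed) that dumps a frame file as hex so the layout of a tiny test render becomes obvious:

       # frmdump.py - print a .frm file as offset + hex bytes
       with open("00000001.frm", "rb") as f:
           data = f.read()

       for offset in range(0, len(data), 16):
           chunk = data[offset:offset + 16]
           print(f"{offset:06x}  " + " ".join(f"{b:02x}" for b in chunk))

    On a 2x2 test frame, anything that isn't pixel data should stand out immediately.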

    cheers,

    --ms

  • Horo Posts: 10,069

    mindsong - there's no list of what network rendering honours and what it doesn't. What I've found out so far came from comparing renders. Everybody stumbling over such a problem is very welcome to report it.

    How the temporary file nartemp.tmp (network assisted render temporary) is encoded, I have no idea. It may differ depending on whether you render a still or an animation. It is difficult to analyse because it is deleted once the render is finished. You can copy it while the render is running, but you will never have the complete file.

  • mindsong Posts: 1,693
    edited January 2017

    Thanks for the reply, Horo!

    What I'm referencing is on the master Bryce network render computer, where the network render is started and managed: there is a ...\Bryce7\Temp\ folder.

    In that Temp folder, there is a per-network-render-job folder named with an ugly Windows temp foldername like "6255d402ee84470cb4aa080124e1ac", e.g.

       C:\Program files(x86)\DAZ 3D\Bryce7\Temp\6255d402ee84470cb4aa080124e1ac\

    In that folder, as the job progresses, all completed frames are collected as a single file per frame, named sequentially in hex, like 000000f2.frm. They are slightly larger than the BMP files that result from a successful run.

    At render session completion, those files are then converted to BMPs or the movie file format specified at render-job-startup time, and deleted once converted.

    Whether the job completes correctly or not, while any job is running (or paused!), these files are visible/available to the user. When the job completes, these 'frm' files are converted to the final requested output (specified when creating the job), and deleted. As you know, this is not an unusual temp-file processing model.

    However, given the failure of the 'Working Range' renders (and perhaps any job failures), the per-job temp folders and frame files are all left behind (gigabytes in my case!).

    After all of that processing, I'd obviously like to convert these relic/orphan frames to BMP files and use them... but it occurs to me that it's also possible to use this 'trick' to actually render desired working ranges as a workaround to this apparent bug. These raw frames may have other uses as well (alpha information available?).

    To this end, the format of those frame files must be understood. Each frame is exactly the same size, so I don't think they are compressed. I'd guess it's 4 bytes per pixel, counting to (Width x Height) of the job dimensions. I can check this... I also think tools like ImageMagick (or a nice little Horo program!) can re-wrap such raw data into our favorite formats - lossless if done right.

    My reason for writing this detail is for future reference, and to see if anyone else has already worked this out...

    If not, as I pursue this, I'll detail my findings here (and perhaps we can update your handy document as well!)

    cheers, and anyone out there who may know any more about this, please chime in!

    --ms

  • mindsong Posts: 1,693
    edited January 2017

    Update:

    The temporary network render per-frame files that are collected in the above-mentioned directories are constructed thus:

      - 56 bytes of header that doesn't change per image and can be ignored.

      - Width x Height pixels, 4 bytes each (8 bits per channel), ordered Blue, Green, Red, Alpha (Alpha always 128?)

    Should you have cause to convert gigabytes of network rendered 'Working Range' images in these folders, a simple script to read and skip 56 bytes, then loop through the rest of the file re-ordering the Bryce BGRA to RGBA until the bytes are exhausted, would get you a raw, headerless RGBA image file. You still need to convert this to a more standard file format, but it's easier to find RGBA-to-XXX (BMP, jpg, png, etc.) converters than to write your own. You need to have your width/height information to do that conversion, and each pixel is still 4 bytes.
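
    For anyone who wants that spelled out, a minimal sketch of such a script in Python - assuming the layout above (56-byte header, interleaved BGRA) is right, using the Pillow library to write the standard file, and with the filename and the 1080x540 dimensions as placeholders for your own job:

       # frm2bmp.py - decode one Bryce .frm temp frame (assumed layout)
       from PIL import Image  # pip install Pillow

       WIDTH, HEIGHT = 1080, 540   # must come from your render job setup
       HEADER = 56                 # fixed-size header, contents ignored

       with open("00000001.frm", "rb") as f:
           f.seek(HEADER)                # skip the 56-byte header
           data = bytearray(f.read())

       # swap Blue and Red in every 4-byte BGRA pixel to get RGBA
       data[0::4], data[2::4] = data[2::4], data[0::4]

       img = Image.frombytes("RGBA", (WIDTH, HEIGHT), bytes(data))
       img.convert("RGB").save("00000001.bmp")  # drop alpha, write BMP

    Change the extension on the last line to .png for a lossless png instead.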

    I was able to get the ImageMagick toolkit (Win/Mac/Unix/... http://www.imagemagick.org/ ) to accomplish this and generate png, bmp, or jpg files via a two-step process: separate the initial frame file into 4 separations, skipping the first 56 bytes (I used png files as tempfiles), then glue them back together in the correct order as the target file you prefer.

    Assuming the ImageMagick toolkit is installed and available at a command line:

    Start with temp frame '00000001.frm' that is 1080 by 540 pixels (WxH):

      convert -size 1080x540+56 -depth 8 rgba:00000001.frm -channel RGBA -separate separation.png

    Which creates the 4 separate image channel files, one per file, skipping the 56 header bytes. It invents the separation filenames from the filename you used above, with -0, -1, -2, -3 added to the base name. You then re-order and recombine these files into your target file and are ready to use the file(s) via:

       convert separation-2.png separation-1.png separation-0.png separation-3.png -combine -alpha deactivate 00000001.BMP

    Which creates the 00000001.BMP output file from the newly created separation files.

    ETA: single command to accomplish the same:

       convert -size 1080x540+56 -depth 8 rgba:00000001.frm -channel RGBA -separate -swap 0,2 -combine -alpha deactivate your_new_image.BMP

    Just change the suffix of the target file, and that's the filetype you will get (cool magic, that!). I'd recommend a lossless type like BMP or png.

    I'd recommend doing all of this in a script that loops through your zillions of files and cleans up the temporary separation files in each pass (delete just the pngs...) to keep things sane in your workspace.
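
    As a sketch of what that loop could look like in Python - assuming ImageMagick's convert is on the PATH, using the one-shot command from above so there are no separation pngs to clean up, and with placeholder folder, size, and names:

       # batch_frm.py - convert every .frm in the current folder
       import glob
       import os
       import subprocess

       SIZE = "1080x540"  # your render job's width x height

       for frm in sorted(glob.glob("*.frm")):
           out = os.path.splitext(frm)[0] + ".bmp"
           subprocess.run([
               # on Windows with ImageMagick 7 you may need "magick" instead
               "convert", "-size", SIZE + "+56", "-depth", "8",
               "rgba:" + frm, "-channel", "RGBA", "-separate",
               "-swap", "0,2", "-combine", "-alpha", "deactivate", out,
           ], check=True)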

    Again, the big-picture goal here is: if you wish to network render a 'Working Range' or subset of your animation frames in a larger or long-running image sequence, this process may be a bit of a hack, but it can make a project viable, or allow for a fix/re-render of a small (or large) subset of an animation sequence that would otherwise take too long to do over in whole (meeting deadlines becomes possible!).

    Obviously, rendering the sub-sequence without the network system is easiest and works fine, but may not be fast enough for certain jobs. You could even copy the original Bryce job-file over to a series of machines and manually use Bryce to render sub-sections on each (kind of what lightning is doing for you...), but you'd need Bryce licenses for each host and would have to manually define each job and re-collect the outputs.

    If for some reason you wanted to do this, e.g. to do more processing of the intermediate files, you could 'pause' any network render job and run the temp files that had been created to that point through this process.

    Hope this helps someone in a bind...

    cheers,

    --ms


  • Horo Posts: 10,069

    Interesting findings. Since I'm not into animations, I could never have a look at this. You say the colour data is BGRA; does this mean one pixel has the three colours plus alpha (interleaved) or are there 4 layers (bbb ggg rrr aaa)? Stripping the 56-byte header then makes it a raw file which can be opened by many programs, but BGRA has to be swapped to RGBA. Or you can create a standard TIFF header with fixed tags (except width and height, which have to be defined - or you always use the same size), either for the Mac or the PC. Then read the TIFF header, add the raw image data, and save it.


  • mindsong Posts: 1,693
    edited January 2017

    Hi Horo,

    good questions - I was trying to be clear, but there are a lot of facets to this data business. Oops!

    The native Bryce temp files add the 56 byte header and order the pixels BGRA. 

    The data is really simple:

       - 56-byte header (same each file)

    followed by:

       - "Height" rows of "Width" pixels, each 4 bytes, ordered: Blue,Green,Red,Alpha (BGRA)

    And no footer. The last pixel ends the file:

    =>56 bytes=<

    BGRA BGRA BGRA ... times the width

    ... height rows

    BGRA BGRA BGRA ... times the width

    end-of file...

    So the pixels seem to go by width first and down height rows, the way we read text in the west.

    Alpha has always been set to 128 (decimal) in the tests I've done.

    And yes, remove the header bytes (56), and reorder the pixel bytes, and you have a headerless raw RGBA blob of image data, but no Width/Height information, so adding a simple TIFF header (or BMP maybe) would make the file notably more portable.  

    For future readers, note that the use of the term 'raw' in our case is distinct and quite different from the same term when applied to cameras that generate 'raw' images. Our use describes a series of 4-byte (RGBA) pixels that run Width first, exactly Height times. Using Google to find 'RAW' image information/converters is useless given the huge number of camera-based RAW formats, tools, and utilities. The following ImageMagick command can convert such a raw RGBA pixel series to a BMP file:

       convert -size 640x480 -depth 8 rgba:your_rgb_file.dat bmp3:your_new_file.BMP

    (I prefer png files, but Bryce is using BMP files already, so it seems fair to stick with that for this example.)

    This doesn't seem to be an 'endian' byte-order issue either, as the Alpha is last in both cases. I was hoping the ImageMagick toolkit had a simple 'swapbytes' sort of function as it streamed through the data, but I could not find any such option among the zillion options it has. Anyone who may know how to convert the raw Bryce files with another tool, or more easily, please let us know. It would be a welcome bit of info. If I were going to 'fix' these files, I would probably pick the simplest header I could find and tack it on the data and let the various follow-on tools take it from there.

    I figure anyone who is still reading this thread will now have enough information to decide whether to re-render or to hack together a 'recovery' script for their project(s) that might run into the same bug. I also assume that if they understand what we're discussing, they can probably write a small program/script to solve their problem.

    The ImageMagick toolkit can be used to do all of the real work, per the two commands in the previous post - just replace the filenames with your own.

    Again, thanks for keeping a helpful eye on this thread, Horo, and I hope this proves useful to someone in the future. Even with the research and write-ups, I've still saved many hours of (re)rendering time, and I've learned a bit...

    cheers,

    --ms


  • Horo Posts: 10,069

    Thank you for the clarification. So all there is to do is read the width and height from the original file header, skip the rest to the start of the data, swap each first and third byte (leaving the second and fourth in place), then create a standard file header and put the data after it.

  • mindsong Posts: 1,693

    I think that would do it, other than clarifying that the width and height info is not available from the Bryce temp file, but rather only from the person who set up the network rendering job to start with (document settings selection, etc.).

    Skip the header (56), swap the 1st/3rd bytes of all 4-byte pixels, and add a standard (tif?) header that describes the image width/height/bytes/depth, and you're good to go!
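
    Funny enough, if the standard header you tack on is a BMP header, the byte swap can be skipped entirely: a 32-bit uncompressed BMP stores its pixels as B,G,R,X anyway, and a negative height in the info header marks the rows as top-down. A minimal sketch of that idea in Python - assuming the 56-byte/BGRA layout described above is right, with placeholder dimensions and filenames:

       # frm_addheader.py - turn a .frm into a BMP by prepending a header
       import struct

       WIDTH, HEIGHT = 1080, 540   # must come from your job setup
       HEADER = 56                 # size of the Bryce .frm header

       with open("00000001.frm", "rb") as f:
           f.seek(HEADER)
           pixels = f.read(WIDTH * HEIGHT * 4)   # BGRA, kept as-is

       # BITMAPFILEHEADER (14 bytes) + BITMAPINFOHEADER (40 bytes)
       file_header = struct.pack("<2sIHHI", b"BM", 54 + len(pixels), 0, 0, 54)
       info_header = struct.pack("<IiiHHIIiiII", 40, WIDTH, -HEIGHT,
                                 1, 32, 0, len(pixels), 2835, 2835, 0, 0)

       with open("00000001.bmp", "wb") as f:
           f.write(file_header + info_header + pixels)

    Untested against Bryce's actual output, of course, but any viewer that understands top-down 32-bit BMPs should open the result directly.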

    good stuff!

    --ms


  • mindsong Posts: 1,693
    edited January 2017

    Ok. After another Google search, it looks like this is a one-shot method with the ImageMagick 'convert' command:

       convert -size 640x480+56 -depth 8 rgba:00000001.frm -channel RGBA -separate -swap 0,2 -combine -alpha deactivate bmp3:your_new_image.bmp

    where the original image (Bryce tempfile) is '00000001.frm', 640x480 (w/h), the output file is 'your_new_image.bmp', and the 'bmp3:' is a version specifier that matches Bryce's version.

    ETA: added this tidbit to the earlier command series

    Change the output file name to any suffix (png/jpg/tif/...) to generate that type of file.

    wow - too easy, and no interim temp-files to bother with!

    --ms

  • mindsong Posts: 1,693
    edited January 2017

    While this is 'fresh' in my mind: one other use of this 'recovery' trick is when a long network animation render (Bryce Lightning 7) has been started and you need to get some of the results 'now' (e.g. for network render job progress verification and/or truncating a job for time's sake...) - you can 'look' at or convert/extract the temp files at any time without waiting for the job to finish! Never cancel such a job or it will delete all of your progress - temp files and all.

    So to interrupt a job and collect/convert the output to that point, bring up the Network Render Manager, select and 'Pause' the job, and then you can sneak in and grab/convert the temp files as described above. Never 'cancel' a job you want the data from. And know that the temp files are deleted on jobs that complete normally. Working-Range jobs leave the relic files behind, but that's what got me going on this adventure, and I was happy they were there to recover!

    Note that because Bryce network rendering has 'run, pause, and cancel' as modes, you *cannot* interrupt a job halfway and get your results to that point! It's the whole job or nothing... As mentioned above, you can pause a job and play with the data to that point.

    Making use of the temp render files as described above can be useful - either to look at any samples of the progress, or to recover the progress to the interrupt point. Lastly, because the 'Working Range' doesn't work as promised (at least for me!), this is the only way I know of to do a sub-section animation render and collect the output using the lightning mechanism. Otherwise render it on a single (fast?) machine and you'll be fine.

    One last thought - I follow Horo's document recommendation and disable the 'Use Tile Optimization' setting (in the 'Network Render Settings' dialog) for my network animation sequence renders. That way each node does its own complete frames.

    I should probably move on with my life now! Cheers!

    --ms

  • Horo Posts: 10,069

    I did some tests. nartemp.bry is the Bryce file and it can be found in the Lightning folder on all participating clients. It is removed when Lightning is closed - or replaced by the next file rendered.

    In the temporary folder of the host, a new folder is created, and in this folder the frm-file (frame) can be found along with the file jobinfo (which was empty, 0 bytes in size, for my still renders; it may contain information for an animation), and the file source.br7, which is the source file. If the host is also a client, the Bryce file exists twice temporarily.

    In my still tests, alpha was C0h, i.e. 192, not 128 (80h), but who cares for a still? The whole process is a bit lost for stills anyway. You have to watch the percentage done and pause just prior to 100% to copy the frm-file away, because once the job is done, the folder and all 3 files disappear.

    The header seems to have 14 4-byte values; 8 are 0 and 6 have some values. All are about 4 times bigger than for an image of half the size (1/4 the surface).
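
    If anyone wants to poke at those 14 values themselves, here's a throwaway sketch (Python; little-endian is a guess - try ">14I" too - and the filename is a placeholder):

       # frm_header.py - print the 14 4-byte header values of a .frm
       import struct

       with open("00000001.frm", "rb") as f:
           header = f.read(56)

       print(struct.unpack("<14I", header))

    Comparing the output for two render sizes should show which values scale with the image.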

    The temporary file (.frm) can be converted with the free IrfanView:
    - File > Open As > RAW File, select file
    - Enter Img width, height and header size (56)
    - Select 32 BPP (4 bytes per pixel)
    - Under Options for 24 and 32 BPP select Color Order RGB (32 bit RGBA) and also Interleaved.
    - OK
    Now red and blue are still wrong.
    - Image > Swap Colors > RGB -> BGR
    - File > Save as ...

  • mindsong Posts: 1,693

    A nice variation! I like IrfanView as well, but never used it to open raw images. That feature is now added to my toolkit!

    Did you see the header change at all between render frames? I did not, and even between distinctly different tests (size and content), it did not seem to change for my animations. The differing alpha is interesting.

    I edited my 'recipe' to 'disable' alpha and to force the bmp format to bmp v3, as one of my viewers wouldn't open the default ImageMagick bmp format (v4?).

    I noticed that in the running job, the temporary Bryce (br7) file was slightly different from the original job file and did not carry the document size when copied out and opened with a fresh Bryce instance. Not an issue, really, but interesting.

    Thanks for taking a look. Some of this stuff is handy. I'm actually poking at my long-running jobs and checking frames every so often now, just because I'm curious and because ... I can. So it's kind of useful being able to keep an eye on things outside of Bryce's capabilities.

    To the original problem: I noticed that the animation render setup dialog will autofill the current scrubber frame. I always started at frame zero and did not notice this effect. I did not run a job with that setting to see if it works right/better as a 'working sub-range'. I have a long animation running that prevents that test right now.

    good stuff!

    --ms
