

I would imagine the problem wouldn't be downloading the frames so much as uploading the data …

Right, and I've been thinking about that a lot. So I've come up with a rendering application that sits on your computer just like any RenderMan-compliant renderer. You execute it and it behaves somewhat like a renderer: it looks for your shaders, texture images, and archive files. But instead of rendering on the spot, it uploads all these files for rendering, which makes things a lot easier on me.

There's also upload time, which may be what you were referring to. I have several methods to improve upload speed. First, the files are compressed before being sent. Also, the program is smart enough to know if you are uploading the same file twice: if the farm already has a copy of your file, it will use the one it has rather than upload a new one.
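Just to illustrate the compress-and-dedup step: here is a minimal sketch of how the client side could work, assuming content hashing is what detects duplicate files. The actual mechanism and protocol aren't described here, so `farm_has()` is a made-up stand-in for the server-side lookup.

```python
import gzip
import hashlib
import shutil
from pathlib import Path

def farm_has(digest: str) -> bool:
    """Hypothetical stand-in for asking the farm whether it already
    holds a file with this content hash; the real protocol isn't
    described in the thread."""
    return False

def file_digest(path: Path) -> str:
    """Hash the file's contents so identical files are recognized
    no matter what they're called or where they live."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def stage_for_upload(path: Path, staging: Path) -> Path | None:
    """Return a compressed copy ready to send, or None if the farm
    already has an identical file and no upload is needed."""
    digest = file_digest(path)
    if farm_has(digest):
        return None  # reuse the copy already on the farm
    staged = staging / (digest + ".gz")
    with path.open("rb") as src, gzip.open(staged, "wb") as dst:
        shutil.copyfileobj(src, dst)
    return staged
```

Keying on the content hash rather than the filename is what would make the "same file twice" case cheap: a texture shared by fifty frames costs one upload, however it's named on disk.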

That being said, there are still some pretty big files to upload and render. Mine generally aren't that large, but that brings up another question I probably should have asked: how big are the files you typically render? 10 MB, 100 MB, 1 GB?

One problem is, as playmesumch00ns pointed out, file size. Most VFX shots depend on a complex rendering pipeline, and this is increasingly true even in smaller places. I'd even go as far as to say that a place using a RenderMan-compliant renderer is considerably more likely to have some kind of rendering pipeline customization. And this might include plug-ins to the renderer that link against libraries etc. You can flatten any RIB (3Delight is great for this), but then maybe you can't render the result any more. Essentially, you need to turn every procedural DSO call into one or more (possibly nested) delayed read archives; otherwise the renderer will read everything at once. Try that with a herd of furry animals or a forest of pine trees …
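To make the delayed-read idea concrete: in RIB terms it amounts to replacing each `Procedural "DynamicLoad"` call (which runs a DSO at render time) with a `Procedural "DelayedReadArchive"` call pointing at a pre-expanded RIB file, keeping the original bound so the renderer defers reading each archive until its box is actually hit. A rough sketch under those assumptions; the expansion step is left as a placeholder (you'd capture it offline with your renderer's RIB-flattening mode), and the regex is only a line-oriented approximation of real RIB parsing:

```python
import re
from pathlib import Path

# Approximate match for a RIB procedural call, e.g.:
#   Procedural "DynamicLoad" ["mydso.so" "args"] [-1 1 -1 1 -1 1]
# (real RIB needs a proper parser; this is only a sketch, and it
# handles the flat case, not nested procedurals)
DSO_CALL = re.compile(
    r'Procedural\s+"DynamicLoad"\s+\[(?P<args>[^\]]*)\]\s+\[(?P<bound>[^\]]*)\]'
)

def expand_dso(args: str) -> str:
    """Placeholder: run the DSO offline and capture the RIB it emits,
    e.g. via the renderer's flattening mode."""
    raise NotImplementedError

def to_delayed_archives(scene: Path, out_dir: Path) -> str:
    """Rewrite every DynamicLoad procedural as a DelayedReadArchive
    pointing at a pre-expanded archive, keeping the original bound so
    the renderer only reads each archive when its box becomes visible."""
    out_dir.mkdir(parents=True, exist_ok=True)
    count = 0

    def rewrite(m: re.Match) -> str:
        nonlocal count
        archive = out_dir / f"chunk{count:04d}.rib"  # hypothetical naming
        archive.write_text(expand_dso(m.group("args")))
        count += 1
        return (f'Procedural "DelayedReadArchive" '
                f'["{archive.as_posix()}"] [{m.group("bound")}]')

    return DSO_CALL.sub(rewrite, scene.read_text())
```

The bound is the crucial part: without it, the renderer has to read everything up front, which is exactly the herd-of-furry-animals problem.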
Assuming you get your customers to only send you flattened data, this in turn implies that the size of the data goes up again. If I expanded our RIBs to flat ones (which I did while developing some of the procedural DSOs), they'd use 20 GB each on disk. Not counting textures etc.

Assuming you want to avoid that, you need to replicate the client's pipeline on your site. Good luck with that is all I like to say on this subject (and that's just the technical challenge, ignoring the fact that your clients might not want to give you access to stuff like fancy custom shaders or plug-ins at all, for starters). Which leads us to the next big obstacle: the legal one. Most stuff people work on is under some sort of NDA.
