Wednesday, November 28, 2007

PANIC MODE! WOO HOO!

I'm actually in quite a good mood despite the fact that I have about two weeks to finish my thesis when I really should have another year. What can I say, I attempted a really, really ambitious project. Renders are coming along fine: I've rendered out 3 shots, and 2 more are going overnight. I'm aiming to finalize 4 shots per day (the 3D work), which will leave me about 4 days for compositing. Not enough, of course, but it should be adequate to get something passable together.

I'm going through a marathon of effects: the splash, the fire, and the underwater shot all have new renders. The mrBatchAnim.mel script is saving my ass all over the place, and I've gotten pretty good at dealing with memory-heavy renders. One of the best tricks, besides using the command line whenever possible, is to end the explorer.exe process while Maya is rendering. For some reason, I find that prevents most of the following:

-fatal error, cannot allocate 23493409583094 bytes of memory
-mental ray memory allocation error, memory exception thrown
-memory usage has exceeded allocated amount, memory exception thrown
-the C++ runtime library has caused Maya to terminate in an unconventional way
-memory allocation error, mental ray may have become unstable, please restart Maya.

I've been working at home a lot now. Gone are the days when I could come home and relax. Now I finish my day's work and come home to start my night's work. I render overnight at the labs, during my day at the labs, at night at home while I sleep, and during the morning just before I leave the house. That's actually more than enough rendering power.

The really, really big problem is that ALL THE FRICKIN MONITORS ARE DIFFERENT. My Macbook at home is the absolute worst: any render that looks perfect there looks extremely dark on any other monitor. My old Fujitsu lifebook has no graphics card as far as I know, so the color depth is about 20 colors. Or that's what it seems like. Take a nice-looking render from my Mac, view it on the lifebook, and you can count the colors on the screen. Three shades of brown! Two shades of green! One shade of blue! And black!

I am seriously oscillating back and forth between feeling like I have more than enough time to finish and feeling like there's no hope of passing. These mood swings correlate very strongly with an effects shot looking decent or an effects shot looking like crap.

Some problems I solved this week so far: translucency for the dragon, putting the moon in the sky without a weird alpha fringe, getting a workable dragon splash effect, getting a first composite for the fire, and doing a first attempt at compositing in the ambient occlusion and depth passes. The depth pass is most troublesome because the depth fog tends to make everything flat. My plan right now is to make 4 AFX comps, one for beauty, one for depth, one for ambient occlusion, and one for additional special effects. I'll take all those comps and layer them on top of each other. Just to simplify things, hopefully.

I like talking to Adam Burr about my project. He always seems so optimistic about a project without sugar coating things. He tells me what's working and what looks bad, and yet I don't come away feeling like I just got beat up. Being in that dynamics class is kind of intimidating because many of the students in the class right after me are absolutely phenomenal artistically and technically. I think CADA has peaked with that class.

Keeping track of renders for 34 shots is not fun. My system of swapping files into folders called currentBeauty04, currentAmbOcc06, currentDepth23, etc., is working pretty nicely. It's still a lot of image sequences, however, and sometimes I can't find an image sequence that I'm sure I rendered sometime somewhere.

Today was decent, in terms of getting work done. I really have to be a machine from now to the end of the semester. Tomorrow, I'll get four more shots rendered, work on the ripple effects, start working with Realflow one way or another, and do a second pass of the fire effect. If Conrad has my page-turning animation, that will work out well because those can be the four shots I render.

Okay, it's 1 am, which means it's time to do an effect and then choose a shot to render.

Thursday, November 22, 2007

Final stretch

This is the final stretch! 20 days left to go, and I still feel only about 60% of the project is completed. To that end, I've outlined the tasks I need to complete every day in order to get this thing finished up to a passing level.

But first, a recap on some of the difficulties and problems solved over the past few weeks.

The nCloth book was actually quite a pain to deal with. In retrospect, I'm not sure if it was worth all the problems it has caused. The main difficulty with the simulation is that I am referencing the manual into animation files and attempting to use the same simulation in multiple shots. This requires, first of all, that the cloth be cached using an absolute path, not a local one, so that the cache can still be found as the current project changes. This was not too bad to deal with, as Maya creates an nCache node to store this information.

Setting up the textures for the book pages was relatively easy using quick select sets and assigning textures to specific vertices. The rotation had to be mirrored (on the 2D placement node, UV scale set to -1) for some textures, but all in all not too much of a hassle. At this point, the files can be swapped out depending on what is needed for the shot. I'm only waiting for Billy's hand animation; Conrad Turner is helping me with this shot, as well as the first shot with the manual.

More annoying was the actual simulation itself. I still have not been able to solve the problem of the book jiggling when it's supposed to be still. My workaround, at this point, is to duplicate the object at a specific frame and then do a visibility switch when I don't want the book to jiggle around. For some reason, keying the nucleus from enabled to disabled and back again screws things up; I think it might have something to do with how the nucleus interprets time.
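For what it's worth, the freeze trick is simple enough to script. This is just a sketch, not my actual scene: the object names (bookSim, bookFrozen) and the frame number are all hypothetical stand-ins.

```mel
// Duplicate the simulated book at the freeze frame, then key the two
// visibilities so they swap on that frame. All names are hypothetical.
int $freezeFrame = 120;
currentTime $freezeFrame;
duplicate -name "bookFrozen" "bookSim";  // static copy of the current pose
// Before the swap: sim visible, frozen copy hidden.
setKeyframe -time ($freezeFrame - 1) -attribute "visibility" -value 1 "bookSim";
setKeyframe -time ($freezeFrame - 1) -attribute "visibility" -value 0 "bookFrozen";
// From the swap on: hide the jiggling sim, show the still copy.
setKeyframe -time $freezeFrame -attribute "visibility" -value 0 "bookSim";
setKeyframe -time $freezeFrame -attribute "visibility" -value 1 "bookFrozen";
```

The duplicate inherits the exact pose at the freeze frame, so as long as the keys land on the same frame the swap is invisible in the render.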

The transform constraints work well for turning individual pages, but they were quite a pain when I realized I had created them incorrectly and wanted to redo the simulation. Unfortunately, I had already smoothed the object, and this smooth was done downstream of the actual cloth mesh.

The workaround for this problem made me feel for the first time that I actually have a good handle on how Maya works. I knew that all the history for the nCloth simulation was still there, so I graphed the Hypergraph connections. nCloth has a very easy way of operating on meshes: it simply accepts an inMesh and passes along an outMesh. Thus, all I had to do was bypass the smooth node and pipe the result of the nCloth directly into the shape node of the book geometry. This worked perfectly. I wrote a quick script so I could toggle between smooth-connected and smooth-disconnected versions of the nCloth. I was really glad I didn't have to set up the whole nCloth system again from scratch. Yay for nCloth: it's really robust, and quite easy to use once you understand how it works. Here are the Hypergraph images and the connection I had to change:
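The toggle script itself boils down to a couple of connectAttr calls. Here's roughly the idea, with hypothetical node names (the nCloth shape, polySmooth, and book shape nodes in a real scene will be whatever Maya named them):

```mel
// Swap what feeds the book's shape node: the smoothed mesh, or the raw
// nCloth output. All node names here are hypothetical stand-ins.
global proc toggleClothSmooth(int $useSmooth) {
    if ($useSmooth) {
        // Normal flow: the cloth result goes through the smooth node first.
        connectAttr -force "polySmoothFace1.output" "bookShape.inMesh";
    } else {
        // Bypass: pipe the cloth's outputMesh straight into the shape,
        // skipping the smooth, so the sim can be re-run on the raw mesh.
        connectAttr -force "nClothShape1.outputMesh" "bookShape.inMesh";
    }
}
```

The -force flag breaks the existing inMesh connection before making the new one, which is what makes it work as a toggle.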




I did my title effect using maya fluids. I used the fluid paint tool to paint density into my container from a file, the "mighty dragon" text that I put in my original 3D animatic. I did two simulations, and reversed the first one to form the title out of vapor clouds. The second was kind of serendipitous, as I was experimenting with pushing some of the fluid attributes beyond their normal ranges (setting negative densities, buoyancies, etc.). The result was a really nice looking explosion where the title becomes rapidly dispersing licks of flame.

The past few weeks I have been trying to organize the animation of my shots. I have Vanessa Weller, Conrad Turner, Seon Crawford, and possibly Danielle Hopzafel to help with my animation, and coordinating has been a difficulty unto itself. I'm wondering if it would have been easier to just do the animation all by myself, since none of them are doing more than 4 shots per person. Some of the animation is not quite what I wanted either, not because it wasn't good (they're all great animators), but since they don't know my piece as well as I do some of the emotions and acting choices didn't seem quite right to me.

The drop effect I accomplished in half a day, modeled nearly completely after the bandaid drop effect I did for Gavin's technical directing class. The blendshapes worked nearly flawlessly and it looks relatively good for half a day's work. I will probably keep this effect as is, since at this point I don't have time to perfect everything.

The ripple effect for the spices is taking longer than it should. Well, no, it's taking about the amount of time I expected it to, but it's frustrating because the effect is just for one shot that doesn't really help the story that much. It's quite a successful effect, though; I've had a bunch of people comment that it works. It is the shot during the montage in which Billy drops a handful of thyme into the well. The leaves used an nCloth simulation modeled off of Duncan Brinsmead's confetti tutorial. In order to get the cloth to collide with the well surface, I connected the well's surface to the ground plane of the nucleus for the spices. This looked really flat, so I needed a way to slightly tilt the water plane to create the illusion that the spices were falling on water and not a solid ground plane. The ground plane direction is controlled with a normal vector, so I connected the normal vector's X and Z to the X and Z transform of a locator. I created a plane and aim-constrained it to the locator to quickly visualize the rotation of the ground plane, and now it works wonderfully. An animator could even move the locator as needed to quickly and easily jiggle the fallen spices around, depending on the amount of turbulence in the water.
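The ground plane hookup is only a few lines. A minimal sketch, assuming the nucleus node is nucleus1 and the control locator is named groundTiltLoc (both names hypothetical):

```mel
// Tie the nucleus ground plane's normal to a locator so an animator
// can tilt the "water" just by moving the locator in X and Z.
spaceLocator -name "groundTiltLoc";
connectAttr -force "groundTiltLoc.translateX" "nucleus1.planeNormalX";
connectAttr -force "groundTiltLoc.translateZ" "nucleus1.planeNormalZ";
setAttr "nucleus1.planeNormalY" 1;  // keep the plane mostly horizontal
```

With Y pinned at 1, small X/Z offsets on the locator produce small tilts of the plane, which is exactly the jiggle control described above.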

To get the ripples, I actually rendered out the spices from the top view and created a blue transparency to represent the point at which the spices fell into the water. I took that top view render into after effects and created a ripple for each spice that fell in the water. At first, I thought I could get away with placing random ripples, but a render showed that it was obvious that the ripples were in the wrong place. The ripple render was then mapped to the wave height offset of the ocean shader, much like I did for the ripples from the single drop.

My underwater effect is coming along quite nicely. Gavin said he liked it except for the bubbles, which were gray and lifeless. So I have since created lots more bubbles to give the illusion that the bottle recently splashed into the water, and some kind of mystical bubbles that grow in size and swirl around the bottle. The major difficulty with this shot was matching the 2D fluid to the bottle, and preventing the bubbles from colliding with the bottle. The bubbles are particles, and particles collide with objects at their center point. This means that by the time a particle's center hits the bottle, nearly half the bubble has already penetrated the surface. To solve this, I created a non-renderable offset surface to use as the collision object. I also rendered out some ripples from an old ocean shader to represent caustics. Now the only elements missing are a depth pass for the bubbles and the bottle, and some kind of murkiness pass to further sell the fact that the shot is under the well surface. I might put little floaties in the murky pass, maybe use some live action footage.
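The offset collision surface is quick to set up. Here's a sketch with hypothetical names (and a made-up offset value), pushing a duplicate's vertices out along their normals by roughly one bubble radius:

```mel
// Build a fattened, invisible copy of the bottle and have the bubble
// particles collide with it instead of the render geometry.
// "bottle", "bubbleParticles", and the 0.5 offset are hypothetical.
duplicate -name "bottleCollide" "bottle";
polyMoveVertex -localTranslateZ 0.5 "bottleCollide";  // push out along normals
setAttr "bottleCollide.visibility" 0;                 // never renders
collision -resilience 0.2 -friction 0.1 "bottleCollide";
connectDynamic -collisions "bottleCollide" "bubbleParticles";
```

Because the particles' centers now collide with the fattened surface, the visible bubble sprites stop right at the real bottle instead of sinking halfway into it.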

I found this great ash effect while searching for particle tutorials online. The author didn't really give a tutorial; instead, the effect was used as an example of how the gravity and uniform fields differ (gravity, as in real life, does not take mass into account, while the uniform field does). In the effect, the author uses a moving black and white ramp to make a vase invisible while at the same time using the texture rate attribute of an emitter to emit particles (with randomized masses, affected by a uniform field). The result looks like the vase is disintegrating into dust particles that blow away quite nicely. I am trying to use it for the ash in my book, but the major problem is creating a nice looking ramp so that it doesn't just cheesily disintegrate from left to right. I might just stick to a top-to-bottom projected ramp, but this means I have to lay out the UVs of the manual perfectly.
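As far as I can tell, the core of the effect is one animated ramp driving two things at once. A sketch with hypothetical names (dissolveRamp, bookMesh, and the field settings are all placeholders):

```mel
// One animated ramp ("dissolveRamp") drives both where the surface fades
// and where ash particles are born. All names are hypothetical.
emitter -type surface -rate 800 -name "ashEmitter" "bookMesh";
particle -name "ashParticles";
connectDynamic -emitters "ashEmitter" "ashParticles";
// Emit only where the ramp is bright (the part currently disintegrating).
setAttr "ashEmitter.enableTextureRate" 1;
connectAttr -force "dissolveRamp.outColor" "ashEmitter.textureRate";
// Randomize mass at birth; a uniform field then scatters heavy and light
// flakes differently, which gravity alone would not do.
dynExpression -creation -string "mass = rand(0.3, 2.0);" "ashParticlesShape";
uniform -name "ashWind" -magnitude 4 -directionX 1 -directionY 0.2 -directionZ 0;
connectDynamic -fields "ashWind" "ashParticles";
```

The same ramp (or its inverse) would also be mapped to the surface's transparency so the geometry disappears exactly where the ash is being emitted.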

I converted my paint effects trees to meshes. This is both a blessing and a curse. It's a blessing in that I can get my paint effects, depth, and beauty all in a single pass, all rendered in mental ray. The side effect, though, has been catastrophic in that it very often causes the batch renderer to crash. The mrBatchAnim script doesn't even work that well. Most of the time, I have to say a prayer and hope that the script doesn't produce out-of-memory errors. Looking at the task manager, I can see Maya taking nearly 1.7 GB of memory, whereas when rendering without the trees it rarely gets above 1 GB. This is a major problem, and to me it is the one that will make or break the project (if I can't render, I obviously can't produce a finished product). To that end, I'm going to give in to nearly everyone's suggestion and put some of the trees on textured cards. I've always hated the look of textured cards and have done everything in my power to avoid them, but if I can't render I'm going to have to give in. Textured cards. Blech.

I tried to tweak BSP settings. My tweak caused the render to go from 4 minutes per frame to over 30 minutes without anything rendering. I must not understand BSP settings as well as I thought I did. It's sad because that's something I think could save a lot of time, but how much time should I invest trying to get the settings correct?

The dragon is finally complete! For the most part, anyway. After a few problems with skin weights flying apart and files corrupting themselves, Adrian delivered a finalized dragon model. He's really cute! It has made the initial splash shot more difficult because it's hard to make the cute dragon look large, but I'm happy with the model. Adrian helped rig the mouth, head, and neck, since I was having a lot of problems, having never done that kind of rigging before. I rigged the feet, claws, tail, and eyebrows. Besides a couple of skin weight problems, I'm quite happy with the rig. Steve Matthews helped me sort out some of the skin weight issues, and now most of the deformations are at least acceptable. I spent about a day blocking the new dragon into the beauty shots based on the old dragon's animation, and it's looking quite nice. Cidalia did a wonderful texture for him, one that I am planning on deriving a translucency map from. Hopefully that will produce a cool effect for the splash shot where the dragon comes out of the well.

Dragon texture:



I bought two new effects books and have been scouring them for solutions to my effects problems. Eric Keller's book on special effects is kind of interesting because it is project based, and uses standard Maya features in unconventional ways. The ambient occlusion effect used to create a scanning effect for a skull was quite clever.

The second book was the standard Maya special effects handbook from Autodesk, for Maya 2008. The project files are Maya 2008 files, but with 8.5's new ignore-version checkbox (under the Open Scene option box), they worked fine. There are some really, really cool effects in that book. It turned me on to the use of hardware particles as a viable solution. Previously, I had always thought of hardware particles as a nuisance: they look bad, and they require an extra pass because points, streaks, spheres, etc. cannot be rendered in Maya software. From what I've gleaned from the special effects handbook, two things are required for awesome-looking hardware effects. The first is motion blur, which is easy to set in the hardware render settings. The second is the color accum checkbox in the particleShape attributes. This wonderful attribute causes the color of hardware particles to be additive, which is awesome for fire effects, explosions, and any kind of energy effect. After seeing some of the demonstrations from the book, and creating a couple of my own, I'm totally sold on using them as a separate pass for the splash effect. Many people I've talked to said that the splash can have some kind of a magical quality to it, since story-wise it does involve magic.
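For reference, the additive-color half of that recipe is a single attribute on the particle shape (the shape name below is hypothetical); the motion blur half lives in the Hardware Render Buffer's own attributes, set through the UI.

```mel
// Make overlapping hardware particles add their colors instead of
// drawing over each other: this is what gives fire and energy effects
// their hot, bright cores. "splashSparksShape" is a hypothetical name.
setAttr "splashSparksShape.colorAccum" 1;
```

The densest part of the particle cloud then blows out toward white on its own, without any extra glow pass.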

I should also talk about the disastrous industry critique. This is the second of the three critiques we have to go through, the first being a faculty critique that for me was very successful. I was quite surprised that the first critique went so well, and I expected the second critique to be much harsher. But not THAT much harsher. First of all, our monitors are calibrated incorrectly, so everything appears too bright. Thus, after color correcting, everything turns out too dark. This was especially a problem on the projector, on which I couldn't even show my project, it was so dark. One of the industry panelists was a head TD for lighting. He basically said I didn't have enough time to finish, and that I needed to really focus on getting the effects to look right. It was especially discouraging when he said it would probably take the rest of the semester to get one of my effects up to C- level, never mind the rest of the project. Five weeks to do a splash effect? If anything, that's encouraging in that the time frame seems reasonable, unlike that Dreamworks presentation where the modeler said she churned out a NURBS noodle cart in half a day (something that would have taken me a month).

Oh well, at this point it's just a matter of bringing everything up to passable level. I'm confident I can do that, but what I'm not confident in is the ability of the computers to render.

That's all for now! I've got another journal entry planned in about 10 days or so, so hopefully I'll be on schedule till then. It's gonna be a big push to the end of the semester. Pray for my renders!

Monday, November 5, 2007

Getting to final countdown

The end of thesis feels so far away, and yet, IT ISN'T! With just about a month to go, I feel like I have gotten about 50% of the way through my project. Time to step it up! Big changes over the past few weeks:

-starting the underwater bottle animation
-getting a nice looking ripple effect
-setting up organized rendering folders
-getting a nice looking plant-adding animation
-working on the manual
-working on a lighting rig
-working on paint effect trees

The underwater bottle is a nightmare of an effect, even Gavin said so. He showed me a similar effect that he did by texturing swirling liquid footage onto a deformed plane that moved along with his object. This seems more like an additional pass rather than the full on effect, but no matter what I do there will be several passes.

My first attempt was with a 3D fluid emitted from inside the bottle, but that didn't work because I needed impossibly high voxel resolution to get the fluid to realistically flow out of the bottle.

My second attempt was with a 2D fluid, with collision planes animated to follow the sides of the bottle. This turned out to be awful, and was even worse because I spent a lot of time on it, thinking with each passing hour that it would only take a little while longer to get it to work. As it turns out, matching collision planes to the 2D fluid is extremely difficult. Maya doesn't look at where a surface intersects the fluid plane; it checks whether the object collides at all and then uses the whole object as a collision surface. Thus my first attempt at this method, which used a cylinder instead of two planes, failed miserably.

The breakthrough came when I decided to use the 2D fluid as its own pass, and take care of the fluid in the actual bottle using a moving ramp shader. That part of the effect now looks good, and I'm working on shaping the 2D fluid and getting in some particle bubbles. The bubbles are blobbies so I don't have to worry about reflections and refractions output from a hardware render.



The ripple effect was kind of fun to do. I started by working with the standard ripple effect from Maya's visor, which utilizes the pond wake. The only problem with this effect is the fact that it's not very portable at all. I can't really add anything on top of it because it's a fluid, and not a mesh.



My second attempt involved a keyed ramp on bump. With the ocean shader, this didn't work at all, because for some reason the ocean shader has trouble reading in bump. Not sure why. It did work, however, when I mapped the ramp to the wave height offset instead. Now it looks pretty good, if I do say so myself. The next component I need to add to this effect is the drop of fluid actually spreading out in the pond, then sort of fizzing away to nothing.
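The reroute itself is just one connection. A sketch with hypothetical node names:

```mel
// Feed the keyed ramp into the ocean shader's wave height offset
// instead of its bump chain. Both node names are hypothetical.
connectAttr -force "rippleRamp.outAlpha" "oceanShader1.waveHeightOffset";
```

Since the keys live on the ramp, the ripple animation carries over unchanged; only the destination attribute differs from the failed bump version.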



Another little irritation cropped up with both of these effects. Apparently, mental ray does not like heavy particle or keyframed shader animation. It refused to render some of my particles and my shader animation for the bump map. After many headaches, I finally came across a script to batch render multiple frames in mental ray from the image viewer, something I had been trying to write myself without success.

global proc mrBatchAnim (int $start, int $end, int $by) {
    int $frameNum;
    int $last = ($end + 1);
    print ("\n\nRendering Frames " + $start + " to " + $end + "\n\n");
    for ($frameNum = $start; $frameNum < $last;) {
        print ("\nRendering Frame: " + $frameNum + "... \n");
        currentTime $frameNum;
        // Forward slashes avoid MEL treating backslashes as escapes.
        string $filename = ("C:/Documents and Settings/patsiarisd/Desktop/jking/wellTest2." + $frameNum + ".tif");
        renderWindowRender redoPreviousRender renderWindowPanel1;
        //renderWindowSaveImageCallback renderWindowPanel1 $filename "image";
        $frameNum = $frameNum + $by;
    }
    print ("\n\nRendering Complete.\n\n");
};

Of course, this is not how I got the script. The script I downloaded apparently only works for Maya 5 or so. Maya 8.5 has done away with the renderView object, or at least that was the line that failed: renderWindowRender redoPreviousRender renderView. So I changed it to the render panel and it worked, but unfortunately the panel refuses to accept the $filename input (the renderWindowSaveImageCallback line only saved the same image into the same folder, instead of the $filename folder). Thus, I can only save to the image directory specified by the current project. A minor annoyance, compared to the more aggravating problem of not rendering particles at all.

I've organized my render folders on a single computer in the lab, since gramercy, our networked storage space, is getting pretty full. Each shot has its own folder, which in turn contains folders for each pass. Older passes are transferred to other folders within the same shot folders. I also changed the After Effects file to locate the image sequences within these subfolders. I think I will stick to After Effects for the basic composite, maybe even the edit, just for simplicity's sake.

The plant dropping animation was actually pretty fast for an effect. I basically tweaked Duncan Brinsmead's (yet again) tutorial for confetti. My knowledge of nCloth has since grown exponentially, methinks. This effect went quite fast, after a few tweaks with dynamic forces. I'm debating whether or not I should try to get some of the pieces to float on the water. I think I will add ripples for sure, but the floating on the surface dynamically is a little more difficult.



The manual requires a lot of nCloth tweaking, but the good news is I got it to model a pretty nice opened book. The page turning is turning out to be very problematic, however, and I'm considering doing the turning animation with blendshapes as I did for the animatic. Which is sad, since I set up the whole system in nCloth. But hey, whatever works.

Gavin discussed my lighting rig this week, and it's basically four lights: one coming from the street lamp (which has changed positions to better light Billy's face as he stands over the well), an opposing moonlight to serve as a colder fill, a bounce light only on Billy to simulate reflected light from the well, and finally a point light point-constrained to the camera to fill in any other dark areas. I will then add more lights as needed on a per-shot basis.

The paint effects trees were also difficult to work with. The hardest part to deal with is their conversion into polygons, and I'm debating whether I should do this or not. The benefit is that if they are converted they can be rendered in mental ray, and so they can use the raytraced shadows created by the spherical area light. Otherwise I have to create depth map shadows for the trees, and probably render them out as a separate software pass. Here's one of my latest incarnations; the branches are a little too thick and the trees overall are too bright, I think. Gavin said they look too much like broccoli stalks.



Some more stills showing the latest look, sans paint effects trees because they don't render in mental ray as of yet:






Well, that's the update, for now. I'm starting to integrate some of the effects into my animated shots, which is a big step. Now my biggest worry is the dragon, which Adrian says will be finished by this week. I then have to rig and do blendshapes for the dragon's face, and animate it as well. That's the biggest hold up, causing me to basically ignore the second half of my animation. If the dragon is finished within this week, hopefully by next week I will have near finalized animation in all shots, and at least a first attempt at a beauty pass for every shot in the piece. Besides the dragon, I'm feeling pretty confident about the rest of the animation. But ask me again in a couple weeks, we'll see ...