Thursday, November 22, 2007

Final stretch

This is the final stretch! Twenty days left to go, and I still feel only about 60% of the project is complete. To that end, I've outlined the tasks I need to finish each day in order to get this thing up to a passing level.

But first, a recap on some of the difficulties and problems solved over the past few weeks.

The nCloth book was actually quite a pain to deal with. In retrospect, I'm not sure it was worth all the problems it caused. The main difficulty with the simulation is that I am referencing the manual into animation files and attempting to reuse the same simulation in multiple shots. This requires, first of all, that the cloth be cached using an absolute path rather than a local one, so that the cache can follow along as the project changes. This wasn't too bad to deal with, since Maya creates an nCache node to store that information.

Setting up the textures for the book pages was relatively easy using quick select sets and assigning textures to specific vertices. The rotation had to be mirrored for some textures (on the 2D placement node, with the UV scale set to -1), but all in all it wasn't too much of a hassle. At this point, the files can be swapped out depending on what's needed for the shot. I'm only waiting on Billy's hand animation; Conrad Turner is helping me with this shot, as well as with the first shot featuring the manual.

More annoying was the actual simulation itself. I still have not been able to solve the problem of the book jiggling when it's supposed to be still. My workaround, at this point, is to duplicate the object at a specific frame and then do a visibility switch when I don't want the book to jiggle around. For some reason, keying the nucleus from enabled to disabled and back again screws things up; I think it might have something to do with the nucleus' interpretation of time.
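For the record, the duplicate-and-switch workaround boils down to a few commands. This is a minimal MEL sketch; the object names and the frame number are hypothetical stand-ins for my actual scene:

```mel
// Freeze the cloth by swapping to a static duplicate at frame 120.
// "bookClothMesh" stands in for the actual simulated mesh.
currentTime 120;
duplicate -name "bookStill" bookClothMesh;

// Key the visibility swap: cloth visible up to frame 120, the copy after.
setKeyframe -time 119 -attribute "visibility" -value 1 bookClothMesh;
setKeyframe -time 120 -attribute "visibility" -value 0 bookClothMesh;
setKeyframe -time 119 -attribute "visibility" -value 0 bookStill;
setKeyframe -time 120 -attribute "visibility" -value 1 bookStill;
```

The nice part is that the duplicate carries no history, so nothing downstream of the nucleus can make it jiggle.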

The transform constraints work well for turning individual pages, but they were quite a pain when I realized I had created them incorrectly and wanted to redo the simulation. Unfortunately, I had already smoothed the object, and that smooth was applied downstream of the actual cloth mesh.

The workaround for this problem made me feel, for the first time, that I actually have a good handle on how Maya works. I knew all the history for the nCloth simulation was still there, so I graphed the connections in the Hypergraph. nCloth has a very simple way of operating on meshes: it accepts an inMesh and passes along an outMesh. All I had to do was bypass the smooth node and pipe the result of the nCloth directly into the shape node of the book geometry. This worked perfectly. I wrote a quick script so I could toggle between the smooth-connected and smooth-disconnected versions of the nCloth. I was really glad I didn't have to set up the whole nCloth system again from scratch. Yay for nCloth; it's really robust and quite easy to use and understand. Here are the Hypergraph images and the connection I had to change:




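For anyone trying the same trick, the rerouting amounts to two connections. The node names below are hypothetical stand-ins for whatever the Hypergraph shows in your scene:

```mel
// Bypass the downstream smooth and feed the nCloth result
// straight into the book's final shape node:
disconnectAttr polySmoothFace1.output bookShape.inMesh;
connectAttr nClothOutputShape.outMesh bookShape.inMesh;

// To restore the smoothed version, reverse the bypass:
// disconnectAttr nClothOutputShape.outMesh bookShape.inMesh;
// connectAttr polySmoothFace1.output bookShape.inMesh;
```

My toggle script is just these two cases wrapped in a conditional on the current connection.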
I did my title effect using Maya fluids. I used the fluid paint tool to paint density into my container from a file: the "mighty dragon" text that I put in my original 3D animatic. I did two simulations, and reversed the first one so the title forms out of vapor clouds. The second was kind of serendipitous; I was experimenting with pushing some of the fluid attributes beyond their normal ranges (setting negative densities, buoyancies, etc.), and the result was a really nice-looking explosion where the title becomes rapidly dispersing licks of flame.

The past few weeks I have been trying to organize the animation of my shots. I have Vanessa Weller, Conrad Turner, Seon Crawford, and possibly Danielle Hopzafel helping with my animation, and coordinating them has been a difficulty unto itself. I'm wondering if it would have been easier to just do all the animation myself, since none of them is doing more than four shots. Some of the animation isn't quite what I wanted, either; not because it wasn't good (they're all great animators), but because they don't know my piece as well as I do, so some of the emotions and acting choices didn't seem quite right to me.

The drop effect I accomplished in half a day, modeled almost completely after the Band-Aid drop effect I did for Gavin's technical directing class. The blendshapes worked nearly flawlessly, and it looks relatively good for half a day's work. I will probably keep this effect as is, since at this point I don't have time to perfect everything.

The ripple effect for the spices is taking longer than it should. Well, no, it's taking about as long as I expected, but it's frustrating because the effect is just for one shot that doesn't really help the story that much. It's quite a successful effect, though; a bunch of people have commented that it works well. It is the shot during the montage in which Billy drops a handful of thyme into the well. The leaves used an nCloth simulation modeled off of Duncan Brinsmead's confetti tutorial. In order to get the cloth to collide with the well surface, I connected the well's surface to the ground plane of the nucleus for the spices. This looked really flat, so I needed a way to slightly tilt the water plane to create the illusion that the spices were falling on water and not a solid ground plane. The ground plane direction is controlled with a normal vector, so I connected the normal vector's X and Z to the X and Z translation of a locator. I created a plane and aim-constrained it to the locator in order to quickly visualize the rotation of the ground plane, and now it works wonderfully. An animator could even move the locator as needed to quickly and easily jiggle the fallen spices around, depending on the amount of turbulence in the water.
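As a sketch, the hookup is just a couple of direct connections plus a constraint. Names here are hypothetical:

```mel
// Tilt the nucleus ground plane by driving its normal
// from a locator's translation:
connectAttr spiceLocator.translateX nucleus1.planeNormalX;
connectAttr spiceLocator.translateZ nucleus1.planeNormalZ;

// A throwaway plane, aim-constrained to the locator, visualizes
// the tilt interactively in the viewport:
aimConstraint -aimVector 0 1 0 spiceLocator visPlane;
```

Since the normal's Y stays at 1 while X and Z follow the locator, small locator moves read as gentle rocking of the water surface.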

To get the ripples, I rendered out the spices from the top view and created a blue transparency to mark the points where the spices fell into the water. I took that top-view render into After Effects and created a ripple for each spice that hit the water. At first, I thought I could get away with placing random ripples, but a test render made it obvious the ripples were in the wrong places. The ripple render was then mapped to the wave height offset of the ocean shader, much like I did for the ripples from the single drop.

My underwater effect is coming along quite nicely. Gavin said he liked it except for the bubbles, which were gray and lifeless. So I have since created lots more bubbles to give the illusion that the bottle recently splashed into the water, plus some kind of mystical bubbles that grow in size and swirl around the bottle. The major difficulty with this shot was matching the 2D fluid to the bottle, and preventing the bubbles from colliding with the bottle. The bubbles are particles, and particles collide with objects at their center point. This means that by the time a particle registers a collision with the bottle, nearly half the bubble has already penetrated it. To solve this, I created a non-renderable offset surface to use as the collision object. I also rendered out some ripples from an old ocean shader to stand in for caustics. Now the only elements missing are a depth pass for the bubbles and the bottle, and some kind of murkiness pass to further sell the fact that the shot is under the well surface. I might put little floaties in the murk pass, maybe using some live-action footage.
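The offset collider is just a duplicate of the bottle pushed out along its normals by roughly a bubble radius. A rough MEL sketch (object names hypothetical):

```mel
// Build a slightly larger, non-renderable copy of the bottle so
// bubbles bounce before their centers reach the real surface:
duplicate -name "bottleCollider" bottle;
polyMoveVertex -localTranslateZ 0.2 bottleCollider;  // push out along normals
setAttr bottleColliderShape.primaryVisibility 0;     // hide from renders
```

The collider then gets hooked up as the particles' collision object in place of the render bottle.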

I found this great ash effect while searching for particle tutorials online. The author didn't really give a tutorial; instead, the effect was used as an example of how the gravity and uniform fields differ (gravity, as in real life, does not take mass into account, while the uniform field does). In the effect, the author uses a moving black-and-white ramp to make a vase invisible while simultaneously using the texture rate attribute of an emitter to emit particles (with randomized masses, affected by a uniform field). The result looks like the vase is disintegrating into dust particles that blow away quite nicely. I am trying to use it for the ash in my book, but the major problem is creating a nice-looking ramp so that the book doesn't just cheesily disintegrate from left to right. I might just stick to a top-to-bottom projected ramp, but that means I have to lay out the UVs of the manual perfectly.
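As best I can reconstruct it, the setup looks something like the following. Node names are hypothetical, the ramp animation is keyed separately, and I believe (but am not certain) these are the exact emitter attribute names:

```mel
// Surface emitter on the manual; the animated ramp gates emission
// so particles only appear where the mesh is currently dissolving:
emitter -type surface -rate 10000 manualMesh;
setAttr emitter1.enableTextureRate 1;
connectAttr dissolveRamp.outColor emitter1.textureRate;

// The same ramp drives the shader's transparency, hiding the surface
// at the same rate the particles are born.

// A uniform field, unlike gravity, scales with particle mass, so
// randomized per-particle masses drift apart convincingly:
uniform -magnitude 8 -directionX 1 -directionY 0 -directionZ 0;
```

The whole trick hinges on the transparency ramp and the emission ramp being the same texture, so the dust always appears exactly where the surface disappears.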

I converted my Paint Effects trees to meshes. This is both a blessing and a curse. It's a blessing in that I can get my Paint Effects, depth, and beauty elements in a single pass, all rendered in mental ray. The side effect, though, has been catastrophic: it very often causes the batch renderer to crash. The mrBatchAnim script doesn't work that well either. Most of the time, I have to say a prayer and hope that the script doesn't hit out-of-memory errors. Looking at the Task Manager, I can see Maya using nearly 1.7 GB of memory, whereas rendering without the trees rarely goes above 1 GB. This is a major problem, and to me it is the one that will make or break the project (if I can't render, I obviously can't produce a finished product). To that end, I'm going to give in to nearly everyone's suggestion and put some of the trees on textured cards. I've always hated the look of textured cards and have done everything in my power to avoid them, but if I can't render I'm going to have to give in. Textured cards. Blech.

I tried to tweak the BSP settings. My tweak caused the render to go from 4 minutes per frame to over 30 minutes without anything rendering. I must not understand BSP settings as well as I thought I did. It's sad, because that's something that could save a lot of render time, but how much time should I invest in trying to get the settings right?

The dragon is finally complete! For the most part, anyway. After a few problems with skin weights flying apart and files corrupting themselves, Adrian delivered a finalized dragon model. He's really cute! That has made the initial splash shot more difficult, because it's hard to make the cute dragon look large, but I'm happy with the model. Adrian helped rig the mouth, head, and neck, since I was having a lot of problems, never having done that kind of rigging before. I rigged the feet, claws, tail, and eyebrows. Aside from a couple of skin weight problems, I'm quite happy with the rig. Steve Matthews helped me sort out some of the skin weight issues, and now most of the deformations are at least acceptable. I spent about a day blocking the new dragon into the beauty shots based on the old dragon's animation, and it's looking quite nice. Cidalia did a wonderful texture for him, one that I'm planning to derive a translucency map from. Hopefully that will produce a cool effect for the splash shot where the dragon comes out of the well.

Dragon texture:



I bought two new effects books and have been scouring them for solutions to my effects problems. Eric Keller's special effects book is kind of interesting because it's project-based and uses standard Maya features in unconventional ways. The ambient occlusion trick used to create a scanning effect across a skull was quite clever.

The second book was the standard Maya special effects handbook from Autodesk, for Maya 2008. The project files are Maya 2008 files, but with 8.5's new Ignore Version checkbox (in the Open Scene option box), they worked fine. There are some really, really cool effects in that book. It turned me on to hardware particles as a viable solution. Previously, I had always thought of hardware particles as a nuisance: they look bad, and they require an extra pass because points, streaks, spheres, etc. cannot be rendered by the Maya software renderer. From what I've gleaned from the handbook, two things are required for awesome-looking hardware effects. The first is motion blur, which is easy to enable in the hardware render settings. The second is the Color Accum checkbox on the particle shape. This wonderful attribute makes the color of hardware particles additive, which is awesome for fire effects, explosions, and any kind of energy effect. After seeing some of the demonstrations from the book, and creating a couple of my own, I'm totally sold on using them as a separate pass for the splash effect. Many people I've talked to said the splash can have some kind of magical quality to it, since story-wise it does involve magic.
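The two switches themselves, for reference. The particle shape name is hypothetical, and I'm going from memory on the Hardware Render Buffer attribute names, so treat these as a sketch:

```mel
// Additive color blending for hardware particles:
setAttr particleShape1.colorAccum 1;

// Motion blur via multi-pass in the Hardware Render Buffer settings:
setAttr hardwareRenderGlobals.multiPassRendering 1;
setAttr hardwareRenderGlobals.motionBlur 3;  // blur length, if memory serves
```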

I should also talk about the disastrous industry critique. This is the second of the three critiques we have to go through, the first being a faculty critique that, for me, was very successful. I was quite surprised that the first critique went so well, and I expected the second critique to be much harsher. But not THAT much harsher. First of all, our monitors are calibrated incorrectly, so everything appears too bright; thus, after color correcting, everything turns out too dark. This was especially a problem on the projector, where my project was so dark I couldn't even show it. One of the industry panelists was a head lighting TD. He basically said I didn't have enough time to finish, and that I needed to really focus on getting the effects to look right. It was especially discouraging when he said it would probably take the rest of the semester to get one of my effects up to C- level, never mind the rest of the project. Five weeks to do a splash effect? If anything, that's encouraging in that the time frame seems reasonable, unlike that DreamWorks presentation where the modeler said she churned out a NURBS noodle cart in half a day (something that would have taken me a month).

Oh well, at this point it's just a matter of bringing everything up to a passable level. I'm confident I can do that; what I'm not confident in is the computers' ability to render.

That's all for now! I've got another journal entry planned in about 10 days or so, so hopefully I'll be on schedule till then. It's gonna be a big push to the end of the semester. Pray for my renders!
