
PART 3: RENDERING / EDITING

Rendering is the stage of animation in which each frame of a scene is processed and all of the frames are assembled into a video. It is a very time-consuming part of the process, especially if the computer doing the work has trouble handling a lot of 3D information. I happen to have a late 2013 iMac, which isn’t an animator’s ideal computer, especially for 3D rendering. However, it’s not the worst in the world. It can still process a long scene, only crashing on occasion if a file is corrupt or there is too much to compute. But it is slow and, by itself, not reliable enough to render an entire animated film. The more characters, vertices, lights, and effects a scene uses, the longer it takes to render.

Eevee is Blender’s first real-time render engine, which means you can see a semi-rendered scene in the viewport without rendering an entire frame. It’s a great choice for scenes that don’t need the accuracy and detail of the Cycles engine, which can take 100 times longer or more to render the same shot. Though Eevee processes a scene and its information much faster, it still takes a good amount of time to turn a scene into a video file, and if the computer is overloaded with information it can freeze for a while or crash altogether.

I began rendering solely on my iMac, which at first wasn’t too hard. The opening scenes don’t have much to process, and those 14 seconds took around a day to render. But when the scenes with far more complex meshes, characters, and lighting came up, my computer struggled to process all of their information. On some occasions a project would overheat the machine and force me to restart my system. Since I only had about a month left to render the entire animation, I knew I had to find ways to render more at once and to speed up the renders running at home. To do this without losing any information or detail, I had to find some solutions quickly.
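
For reference, the Eevee-versus-Cycles choice described above comes down to a single render setting, which can also be changed from Blender’s scripting tab. Here is a minimal sketch, assuming a Blender 2.8x-style Python API; the sample count is only an illustrative value, not the one used for the film.

    import bpy

    scene = bpy.context.scene

    # Use the real-time Eevee engine rather than the slower path-traced Cycles
    scene.render.engine = 'BLENDER_EEVEE'

    # Fewer samples render faster at the cost of some quality (illustrative value)
    scene.eevee.taa_render_samples = 64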

 

One extremely effective solution to part of this problem was finding more computers to process my renders. I looked into renting an online “render farm,” a collection of computers in another location that renders a scene for you and sends back the finished frames very quickly. However, render farms are difficult to set up, can be expensive, and not all of them support Blender. So I decided to find computers of my own. At first this seemed easy to accomplish, as I just needed to find more computers to use. But that was only the first step. I asked my Video Production teacher, Mr. Granbery, if I could use five or so computers in the computer lab to render my scenes after school each day for the next month. He agreed, and I began carrying my Blender files from home on a hard drive to load onto the five computers I was using to render.

However, many issues came up at the beginning of this process. The first was that I couldn’t install Blender on these computers: the security settings on the school iMacs blocked the installation of programs from unidentified developers. To work around this, I went inside the Blender program’s package and copied all of its contents. I then made a new folder, dumped the copied contents into it, renamed it to “Blender.app”, and made sure the folder would convert into a “.app” bundle (which an iMac reads as a program). Because I had assembled the bundle myself by pasting in its contents and creating the “.app”, the computers treated it as locally made and allowed the program to open. I sent this “.app” file to every computer I planned to render on, and Blender opened on all of them without issue.

With that resolved, I copied the scenes I planned to render onto each computer and set each machine to render a different interval of the animation. For example, if a scene was 1000 frames long, the first computer would render frames 1 to 200, the second 201 to 400, the third 401 to 600, the fourth 601 to 800, and the fifth 801 to 1000. Once these intervals were set up, I let each part of the scene render overnight. But when I came back to check the animation the next day, the scenes turned up totally pink. Buildings were pink, characters were pink, everything had this pink texture to it. The reason was that the textures I used were all image files I had downloaded or created on my home computer, and the school computers didn’t have those files. In Blender, when a texture is applied to a mesh but the image file can’t be found, the texture is replaced with a bright neon pink. This tells the animator, “HEY, THERE IS NO TEXTURE HERE.” I realized I had to start “packing” my Blender files at home, which embeds every external image inside the .blend file itself.
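
Here is a sketch of how one of those per-machine copies could be prepared from Blender’s scripting tab. It only illustrates the two steps described above, packing the textures and assigning a frame interval; the interval shown is the hypothetical slice for the second computer.

    import bpy

    scene = bpy.context.scene

    # Embed every external image inside the .blend file so other machines
    # don't show the neon-pink "missing texture" color
    bpy.ops.file.pack_all()

    # Give this machine its slice of the animation
    # (hypothetical interval: the second of five computers on a 1000-frame scene)
    scene.frame_start = 201
    scene.frame_end = 400

    # Save the packed, re-ranged copy to carry to school on the hard drive
    bpy.ops.wm.save_mainfile()
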
So I went home, packed all of my files, set the renders up again at school, and came back the next morning to see whether they had finished successfully. Yet there were still some issues. A few computers rendered their scenes without trouble, but others crashed before exporting the final video, and if a Blender file crashes before rendering the last frame, the completed portion of the video is lost. I addressed this in two ways. I started using more computers and set smaller intervals for each render, and I also changed how each render was exported: instead of exporting each section as a video file, I exported the scenes as PNG sequences. This meant that even if a computer crashed, every finished frame had already been saved as an individual “.png” image, and I only had to re-render the frames that didn’t make it past the crash. With all of these problems fixed, I now had my own render farm working on my files after school for about a month.
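
Switching the output from a video file to a PNG sequence is just a change to the output settings. A minimal sketch with a hypothetical output path, including the overwrite options that let a crashed interval pick up where it left off instead of starting over:

    import bpy

    render = bpy.context.scene.render

    # Write numbered image files instead of a single video, so a crash only
    # costs the frames that were never written to disk
    render.image_settings.file_format = 'PNG'
    render.filepath = '//frames/scene_'  # hypothetical output folder and prefix

    # On a restart, skip any frames that already exist on disk
    render.use_overwrite = False
    render.use_placeholder = True

    bpy.ops.render.render(animation=True)
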
Though I had an efficient render farm working, I still wanted to render on my computer at home and find ways to speed up the scenes processing in Mr. Granbery's room. To do so, I came up with methods to reduce the size and complexity of the files I was using. The first method was to duplicate a scene, with all of its objects and information, once for each shot, and set each copy to render a different interval of the camera’s movement. In each copy I would then delete every object and light that was never in view of that shot’s camera and render the shot individually. This gave the computer less information to process, since it only had to handle what the camera could actually see. I did this for almost all of the shots rendered on my computer.
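
Below is a minimal sketch of what that per-shot cleanup could look like as a script run inside the duplicated file; the object names and frame numbers are hypothetical stand-ins for a real shot list.

    import bpy

    # Hypothetical list of everything this shot's camera actually sees;
    # every other object and light is removed from this copy of the file
    KEEP = {"Camera", "Character", "Ground", "KeyLight"}

    for obj in list(bpy.data.objects):
        if obj.name not in KEEP:
            bpy.data.objects.remove(obj, do_unlink=True)

    # Limit this copy to the shot's frame interval (hypothetical numbers)
    scene = bpy.context.scene
    scene.frame_start = 120
    scene.frame_end = 260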

 

I also had to use more technical and creative problem-solving to speed up my renders. One example involves a scene I created in a graveyard, where the background took up so much memory and information that my computer would either crash or take days to render only a few frames. With only a week left to render this scene, I was struggling to find a way to finish it in time. Right as I was about to pay for an online render farm to finish the scene for me, I had an idea. In the scene, the girl and the tombstone are in focus while the background in the far distance, filled with trees, bushes, and rocks, is blurred by the shallow depth of field. Yet that background accounted for the majority of the vertices in the scene. So I duplicated the file and erased everything except the background meshes. I pointed the camera at the background with a high focal length, removed all of the real-time effects and nodes set up in the scene, turned on the setting in Blender that makes the world background completely transparent, and exported a picture of the trees, rocks, and bushes as a PNG. Then, in my original scene, I deleted the background objects, created a single-face plane, applied the exported image with its transparent background to it, placed it near where the background used to be, and rotated it to face the camera. Now, instead of using high-poly models as a background, I had a four-vertex plane carrying an image of that background. Because the background is so blurry, it’s almost impossible to tell the difference between the real meshes and the image plane. It works much like props in a play: instead of placing real trees on the set, which takes time, stage builders create painted cutouts of trees to stand in for them. It was extremely satisfying to find this workaround and execute it successfully.
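
The transparent-background export can be reproduced with a few render settings. A rough sketch, assuming Blender 2.8 or later (where the option lives under Render Properties, in the Film panel) and a hypothetical output filename:

    import bpy

    scene = bpy.context.scene

    # In the duplicated file that contains only the background meshes:
    # render them over a transparent world and keep the alpha channel
    scene.render.film_transparent = True
    scene.render.image_settings.file_format = 'PNG'
    scene.render.image_settings.color_mode = 'RGBA'
    scene.render.filepath = '//graveyard_background.png'  # hypothetical name

    bpy.ops.render.render(write_still=True)

The resulting PNG can then be mapped onto a single plane back in the original scene; Blender’s bundled “Import Images as Planes” add-on can set that up in one step.
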
With about a week left, I had nearly every scene rendered in its entirety, after redoing some scenes that had been processed incorrectly. I could finally move on to the editing stage of the film, in Final Cut Pro X. Editing is another very creative part of animation. Many would assume that animation doesn’t require much editing; however, it’s essentially the same as editing live-action film. You have a collection of scenes and shots you’ve never seen assembled, and you have to cut them together with the right timing and action. Color correction and visual effects are still necessary, as is adding sound and music.

 

To begin the editing process, I first added all of my rendered shots, in order, to the timeline. I watched the film’s animated scenes over and over, trimming and cutting each shot. Once that was done, I began adding music to each scene, taking care to place the songs so the cuts landed in exact sync with the beat. I then downloaded various sound effects, including ambience, cloth rustling, and other sounds that played a role in the movie. The ambience really helped reinforce the environment of each scene: outdoor ambience for exterior shots, soft interior noise for room tone, and even small foley like a ceiling fan whirring. The cloth rustling played whenever a character moved, which made the characters sound as if they were really moving. After watching the film and adding more cuts, I found problems with the instrumentals I used. The happier instrumentals worked, but I never felt the moodier ones were adding anything to their scenes. I tried making more variations, but I still couldn’t connect with them when watching the movie through. To my surprise, the scenes felt stronger when I took the instrumentals out and all that was audible was the soft ambience of the environment. So I decided to keep only the instrumentals that heighten the more exciting scenes, and during moodier moments to have, at most, soft pads playing in the background. After mixing the audio by EQing and leveling each sound and adding master compression, I began to edit the visuals. Before adding color correction and visual effects, I first had to fix some rendering issues with masking tricks and transitions: in some scenes I had to cover background problems or character mistakes by masking over parts of the clip with additional media. After fixing these small problems, I moved on to the visual effects.

 

I first gave each clip its own color correction. As I watched each scene, I kept adjusting its brightness, contrast, color, and saturation, and I continued revisiting the movie until every scene had a color palette that fit the mood of the story at that moment. Happier scenes got a warmer color temperature, while sadder and darker scenes kept a colder one. After finishing the color work, I began adding the visual effects. The first effect, and the one I had been most eager to add, was the noise filter. This filter lays white noise over a clip, and you can adjust how much noise you want and how it blends with the image. It gives a nice grain, like the look of videos shot on film. I layered this effect generously over the final movie, because the grain makes each shot look as if it were captured with an imperfect camera. I also added a prism effect to help the colors pop, a soft sharpen that adds to the imperfect-camera look, and a gentle saturation and contrast boost.

 

After watching the movie about 40 times and making small edits here and there, I finally had my finished film. It took about half an hour to export, and I delivered it to my Video Production teacher with 20 minutes left before the submission deadline. The film took six months to write, design, animate, render, and edit.
