And here is the matchmove video project that I demonstrated in class. Next week we will do a very cool, very original matchmove project! Don't miss it!
Video Compositing is a technique for blending computer-generated video with real video from a camera.
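If you're curious what that blend looks like in code, here's a minimal sketch in Python using OpenCV and NumPy. The file names are just placeholders, and it assumes the CG frame was rendered with an alpha (transparency) channel and is the same size as the live frame:

```python
# Minimal alpha-over composite of one CG frame onto one live frame.
# File names are placeholders; assumes the CG render has an alpha channel
# and both images are the same resolution.
import cv2
import numpy as np

cg = cv2.imread("cg_frame.png", cv2.IMREAD_UNCHANGED)   # BGRA render
live = cv2.imread("live_frame.png")                      # BGR camera frame

alpha = cg[:, :, 3:4].astype(np.float32) / 255.0         # 0 = transparent, 1 = opaque
cg_rgb = cg[:, :, :3].astype(np.float32)
live_f = live.astype(np.float32)

# Standard "over" operator: result = alpha * CG + (1 - alpha) * live
composite = alpha * cg_rgb + (1.0 - alpha) * live_f
cv2.imwrite("composite_frame.png", composite.astype(np.uint8))
```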
Camera Matching is a technique for matching the viewing angle and viewing distance of your computer-generated (CG) animation to the view in a photograph or video, so that when you composite your animated object and its shadows onto the photo or movie, all the angles of the objects and shadows look just right. Camera matching is important if you want your picture composite or video composite to look convincing. In class we just matched up our CG objects and shadows to the picture by eye, but 3D Studio Max has built-in tools for doing that.
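The same matching can be done numerically: if you measure the real-world 3D positions of a few points that are visible in the photo, you can solve for the camera position and angle that projects them to where they appear in the image. Here's a rough sketch of that idea in Python using OpenCV's solvePnP. The point coordinates and focal length are made up for illustration, and this is only the general idea, not the exact calculation Max's built-in tool performs:

```python
# Sketch of camera matching: recover camera position/angle from known points.
# All coordinates below are made-up example values.
import cv2
import numpy as np

# 3D positions of reference points measured in the real scene, in meters.
object_points = np.array([
    [0.0, 0.0, 0.0],
    [2.0, 0.0, 0.0],
    [2.0, 0.0, 1.5],
    [0.0, 0.0, 1.5],
    [1.0, 0.5, 0.75],
    [0.5, 0.0, 1.0],
], dtype=np.float64)

# Where those same points appear in the photo, in pixels.
image_points = np.array([
    [410.0, 620.0],
    [980.0, 640.0],
    [930.0, 410.0],
    [450.0, 400.0],
    [700.0, 350.0],
    [560.0, 470.0],
], dtype=np.float64)

# Approximate camera intrinsics: focal length in pixels, principal point at image center.
K = np.array([[1100.0,    0.0, 640.0],
              [   0.0, 1100.0, 360.0],
              [   0.0,    0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
print("camera rotation (Rodrigues vector):", rvec.ravel())
print("camera translation:", tvec.ravel())
```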
Matchmoving is a computer technique for analyzing a video clip from a camera to figure out the camera's viewing angle and its distance to the subject, and for tracking the camera's motion and shake over the course of the clip. All of that camera-tracking information is then fed into your CG animation so that the viewing motion and shake of your animation match up perfectly with the live video. In class next week we will use a free matchmoving program to do a very cool project.
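To give you a feel for what a matchmover does under the hood: the first stage is tracking distinctive feature points from frame to frame, and the camera's motion is then solved from how those points move. Below is a minimal sketch of just the tracking stage, written in Python with OpenCV. The file name is a placeholder, and this shows only the general idea, not what Voodoo itself does internally:

```python
# Sketch of the feature-tracking stage of matchmoving:
# follow corner points from frame to frame with KLT optical flow.
# "living_room.avi" is a placeholder file name.
import cv2

cap = cv2.VideoCapture("living_room.avi")
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Pick strong corner features to track in the first frame.
points = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                 qualityLevel=0.01, minDistance=10)

tracks = []  # per-frame feature positions; a matchmover solves camera motion from these
while True:
    ok, frame = cap.read()
    if not ok or points is None or len(points) == 0:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    new_points, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, points, None)
    good = new_points[status.ravel() == 1]   # keep only successfully tracked points
    tracks.append(good)
    prev_gray, points = gray, good.reshape(-1, 1, 2)

cap.release()
print("tracked features across", len(tracks), "frames")
```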
Here are the steps that I used to make the matchmove video shown above:
- I started out by making a short 10-second video of a rug in my living room using a home movie camera.
- Using 3D Studio Max, I created a biped skeleton animation and manually matched its lighting to the lighting in my video clip.
- Next, I rigged the Kim Possible character to the biped skeleton.
- Then I used a freeware program called "Voodoo Camera Tracker" to calculate the camera motion-tracking information from my video clip. I imported that motion-tracking data into 3D Studio Max and rendered out the animation, with shadows, to a video file with a black background.
- Then I used a video editor to composite the animation video onto the original live video, giving the final clip of the computer-generated character dancing on the carpet (a rough sketch of that per-frame blend follows this list).
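The video editor takes care of that last composite, but conceptually it is just the single-frame blend from earlier applied to every frame of the two clips. Here's a rough per-frame sketch in Python with OpenCV. The file names are placeholders, and because my render used a black background rather than an alpha channel, this sketch uses a simple brightness key, which cannot preserve the rendered shadows the way a real matte pass or the editor's own tools can:

```python
# Per-frame composite of a black-background CG render over live footage.
# File names are placeholders; assumes both clips have the same resolution
# and frame count. A brightness key is used here only for simplicity.
import cv2
import numpy as np

cg_cap = cv2.VideoCapture("character_render.avi")
live_cap = cv2.VideoCapture("living_room.avi")

fps = live_cap.get(cv2.CAP_PROP_FPS)
width = int(live_cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(live_cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
out = cv2.VideoWriter("final_composite.avi",
                      cv2.VideoWriter_fourcc(*"XVID"), fps, (width, height))

while True:
    ok_cg, cg = cg_cap.read()
    ok_live, live = live_cap.read()
    if not (ok_cg and ok_live):
        break
    # Anything brighter than near-black in the render counts as the character.
    luma = cv2.cvtColor(cg, cv2.COLOR_BGR2GRAY)
    mask = (luma > 10).astype(np.float32)[:, :, None]
    frame = mask * cg.astype(np.float32) + (1.0 - mask) * live.astype(np.float32)
    out.write(frame.astype(np.uint8))

cg_cap.release()
live_cap.release()
out.release()
```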
In next week's class, I'll walk you through all these steps so that you can do this on your own.
In our class on Thursday, we did a project to introduce you to camera matching and compositing. We used the "Matte/Shadow/Reflection" mental ray material in 3D Studio Max to let your 3D objects cast shadows onto a transparent plane, and then we oriented the camera view of your 3D object to roughly match that of your photo. Here are a couple of your rendered composite images.