I have two models that are identical other than the faces. However, when I apply a blendshape, in addition to morphing between the two faces, motion is created in the legs of the target. These are FBX files exported from MB 2011 into Maya 2011.
I was following a Digital-Tutors tutorial, Mudbox to Maya: Creating Facial Expressions. I'm trying to add my blendshapes manually the way they did it in the video, by applying the blendshape to a group, but it's not working at all. I get an error message that says blendshapes can't be applied to groups. How were they doing this in the video?
I'm trying to use the wrap deformer on an old rig (made with The Setup Machine 2 - non-joint influences). The new geometry I created has blendshapes applied to it. The problem I'm having is that anytime I change my blendshape values and even when just keying a value, I have to wait about 5 minutes before I can do anything else in Maya. Once I have placed just one key, any movement on the timeline has me waiting again. I cannot work like this at all, and goodness knows how it's going to affect batch render time.
Is there any workaround that doesn't involve me skinning my new geometry? I'm working on a tight deadline and I really can't waste any time.
How do I reorder vertices in a flipped head so blendshapes will work properly? (I've modelled left-side expressions and would like to just mirror them across, if that's possible.)
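A pure-Python sketch of the usual matching trick, assuming the head is symmetrical in its base pose: pair each vertex with the one at its X-mirrored position, then rebuild the sculpted offsets on the other side with the X offset negated. In Maya you would query and set the positions with `xform` or `pointPosition`; everything below is stand-in data and hypothetical names:

```python
# Hypothetical sketch: transfer a left-side blendshape to the right side by
# matching each vertex to its mirror partner across the YZ plane (X -> -X).

def mirror_map(base_points, tol=1e-4):
    """For each vertex index, find the index whose position is its X-mirror."""
    mapping = {}
    for i, (x, y, z) in enumerate(base_points):
        for j, (mx, my, mz) in enumerate(base_points):
            if abs(mx + x) < tol and abs(my - y) < tol and abs(mz - z) < tol:
                mapping[i] = j
                break
    return mapping

def mirror_shape(base_points, target_points, mapping):
    """Build the mirrored shape: each vertex takes its partner's offset, X-negated."""
    mirrored = []
    for i, (bx, by, bz) in enumerate(base_points):
        j = mapping[i]
        dx, dy, dz = (t - b for t, b in zip(target_points[j], base_points[j]))
        mirrored.append((bx - dx, by + dy, bz + dz))
    return mirrored
```

This sidesteps reordering vertices entirely: the map absorbs the index flip, so the mirrored target keeps the original mesh's vertex order and the blendshape stays valid.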
How do I create in-betweens for corrective blendshapes and reversed shapes?
For example, say we have 2 poses, "lidClose" and "browDown", with 3 in-betweens each. It's not hard to create the corrective and reversed blends, but it's very confusing to create in-betweens for the reversed shapes.
Although it looks simple logically, it's been impossible for me.
We have an animal character mesh with the body and head together; there's no good place to separate the head for blendshapes without showing an obvious seam.
Is there a way to copy over the polygons for only the head to model into blendshapes? I don't want the entire mesh to copy over for blendshapes; I'm concerned that making so many blendshapes from the entire mesh would take up a lot of memory and resources.
I want to translate facial motion-capture data (basically 2D point clouds) into blendshape weights. I finished rigging my character and want to drive my blendshapes by the relative positions of my tracking data, e.g. the distance between the corners of the mouth drives the "wide" blendshape, and so on...
Hope YOU know a solution for this. I believe blendshapes are THE way to create realism in tracked animation, but I was not able to find ANY reference to something like that. I'd like it to be customizable and not too expensive, so the Image Metrics Faceware service won't do for me.
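The core mapping is simple enough to script yourself. A minimal sketch, assuming you have measured a rest distance and a maximum distance on calibration frames (the names and numbers are placeholders); the resulting 0..1 value is what you would pipe into the blendShape weight:

```python
# Hedged sketch: map the 2D distance between two tracked points to a
# 0..1 blendshape weight. rest_dist / max_dist come from calibration
# frames (neutral face and widest smile) and are assumptions here.

import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def wide_weight(left_corner, right_corner, rest_dist, max_dist):
    """Mouth-corner distance normalized to a 0..1 'wide' blendshape weight."""
    d = distance(left_corner, right_corner)
    w = (d - rest_dist) / (max_dist - rest_dist)
    # clamp; in Maya this value would be set on e.g. blendShape1.wide per frame
    return max(0.0, min(1.0, w))
```

Per frame, you would evaluate this for each tracked measurement and key the result onto the corresponding blendshape target weight.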
I was wondering if there is a way of getting random camera movements? For example, I am doing a fly-through of an earthquake scene, and as the camera animates along the street it randomly shakes with the earthquake shocks.
I know I could animate it by hand but I wondered if there was a way (perhaps an expression) to make it more simple?
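An expression is indeed the usual route (Maya's expression language has a `noise()` function for exactly this kind of thing). A plain-Python sketch of the same idea, generating repeatable, lightly smoothed per-frame offsets you could bake onto the camera's translate channels; the amplitude and seed are arbitrary assumptions:

```python
# Sketch: deterministic per-frame camera shake. Seeding makes the shake
# repeatable across runs; the 3-tap average keeps it from being pure
# per-frame jitter. Amplitude/seed are placeholder values.

import random

def shake_offsets(n_frames, amplitude=0.2, seed=42):
    rng = random.Random(seed)
    raw = [rng.uniform(-amplitude, amplitude) for _ in range(n_frames)]
    smoothed = []
    for i in range(n_frames):
        window = raw[max(0, i - 1): i + 2]   # neighbours, clipped at the ends
        smoothed.append(sum(window) / len(window))
    return smoothed
```

You would generate one stream per axis (different seeds) and key the values additively, e.g. on an animation layer, so the shake sits on top of the fly-through path.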
I've got a Locator attached to a Curve with a Motion Path - the Motion Path is animated so the Locator moves nicely along the Curve. I now want to swap the original Curve with a new one and have the Locator follow it instead.
Is there a way to simply switch curves and tell the Motion Path to use a new Curve while keeping the animated keys in the Motion Path node?
I've tried playing around with the Connection Editor but can't manage to connect the new Curve to the Motion Path node's geometryPath attribute.
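For what it's worth, a sketch of forcing the swap from the Script Editor (Maya only; node names like `motionPath1` and `newCurveShape` are assumptions). Connecting with `force=True` replaces the old curve's connection while the keys on the motionPath's uValue stay put:

```python
import maya.cmds as cmds

# Feed the new curve's worldSpace output into the motion path; force=True
# breaks the old curve's connection. Keys on motionPath1.uValue are untouched.
cmds.connectAttr('newCurveShape.worldSpace[0]',
                 'motionPath1.geometryPath', force=True)
```

Note the connection must come from the curve's shape node's `worldSpace[0]` output, which may be why the Connection Editor attempt didn't take.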
I'm currently working on an advertisement for a client and I need to create a kind of ribbon that wraps around like a present. I thought the best approach would be making the ribbon and attaching it with a motion path, but also adding a gravity field to the ribbon so it moves more like a ribbon.
Also, I have made the motion paths and connected my ribbon, but for some reason it doesn't move until the last frame, where it just snaps into position?
To create an animation similar to the one shown on page 225 of the tutorial, I created a curve, but I am not too sure that was the correct way to create it.
1. Selected the Surfaces Menu.
2. In the front view, using the CV Curve Tool, created the first half.
3. Using Duplicate in the Edit Menu, I created the other half.
4. In the Edit Curves Menu, selected Attach Curves and connected two curves.
5. I then selected the Animation Menu, created an object to be used as an aircraft, followed the same steps shown in the tutorial, and it worked nicely.
I am trying to attach an object to a motion path, but when I do, the object goes to the beginning of the path. When I move the time slider nothing happens, and when I move the object it isn't attached to the path.
There is no motion in my meshes when using set key or auto key.
So let's say I have a cube: I set a keyframe at 1, move it a little to the right, and set a keyframe at 13. Between 1 and 13 there's no motion; it just jumps straight to 13. I hope that makes sense. I've tried going to the Animate > Set Key options and checking "current manipulator handle", but it didn't work.
Trying to find the closest thing to 3ds Max's trajectory, I'm using a motion trail. Once the motion trail is deleted, the specific controller it was used on becomes very, very sluggish, taking over 20 seconds to respond.
I tried deleting all the connections in the Hypershade and using the delete command, but nothing works.
A few things involving cameras. My assignment was to take the plane that we modeled with polys, animate it on a motion path (using the CV Curve Tool), and have a camera above it follow it. I got that down perfectly.
But what I don't have a clue about is: how do I attach a camera to the same motion path as the plane and get a camera angle right behind the spinning propeller? Like a cockpit view.
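One hedged sketch (Script Editor, Maya only; the curve name is an assumption): attach a second path animation for the camera on the same curve, with the follow axes set so the camera looks down the path, then nudge its uValue keys or parent it under an offset group to sit just behind the propeller:

```python
import maya.cmds as cmds

# Create a camera and attach it to the same curve the plane uses.
# follow=True orients the camera along the path; the axes are guesses
# and depend on how your scene is oriented.
cam = cmds.camera(name='cockpitCam')[0]
cmds.pathAnimation(cam, c='planePathCurve', fractionMode=True,
                   follow=True, followAxis='z', upAxis='y',
                   startTimeU=1, endTimeU=120)
```

Offsetting the camera's uValue keys slightly behind the plane's puts it trailing the propeller along the same route.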
I was wondering if there is a way in Maya 2013 to import something like 20 takes at the same time, work on those 20 takes in one Maya file, and export them simultaneously, just like MotionBuilder with its "export all takes, save one take per file, use take name" option.
I'm asking because I am thinking of switching to Maya, but this native MotionBuilder feature is essential to me.
Is there any way to evenly distribute locators on a motion path along a NURBS circle? It's for a mouth/lips rig, and I need it to be symmetrical without manually inputting values.
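The spacing math is easy to script. A plain-Python sketch: with the motion path in fraction mode, uValue runs 0..1, so the evenly spaced values below could be set directly on each locator's motionPath node (counts and radius are placeholders):

```python
# Sketch: N evenly spaced parameters around a closed curve, and the
# corresponding symmetric positions on a circle for sanity-checking.

import math

def even_u_values(n):
    """N evenly spaced fraction-mode uValues, starting at 0."""
    return [i / n for i in range(n)]

def circle_positions(n, radius=1.0):
    """Matching XZ positions on a circle of the given radius."""
    pts = []
    for u in even_u_values(n):
        a = 2 * math.pi * u
        pts.append((radius * math.cos(a), 0.0, radius * math.sin(a)))
    return pts
```

Because the parameters are exact fractions, opposite locators land at exactly mirrored positions, which gives you the symmetry without typing values by hand.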
Once I set up a motion path for my object to follow, it sets the object to follow it over the entire timeline. What if I just want the object to start later on? I want it to rise up, hover over to the path, and then be attached to it.
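A sketch under assumed names (Maya Script Editor only): `pathAnimation` accepts a start and end time, so the path keys only cover that range. The object is still constrained outside the range (the uValue just holds), so to animate it freely beforehand you would typically key its translate channels and let Maya blend them with the path via the pairBlend it creates:

```python
import maya.cmds as cmds

# Attach only over frames 40..120; before frame 40 the uValue simply holds.
# Keying the constrained translate channels before frame 40 prompts Maya
# to insert a pairBlend, which lets the free animation and the path blend.
mp = cmds.pathAnimation('pCube1', c='curve1', fractionMode=True,
                        follow=True, startTimeU=40, endTimeU=120)
```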
In previous versions of Maya, I have been using the drivenTime node to create walk cycles that follow motion-path curves, where the Out Time is connected to some attribute via a unitToTimeConversion node.
When I try to connect anything to the Out Time of a new drivenTime node, the connection never works, not even keyframes.
I have a list of motion-capture clips; the mocap didn't track the fingers, so the hands are always open. In some of these clips I need the biped's hands to be closed. I've been looking for a solution for about a day now. I think I'm on the right track, which is to create a biped animation layer where the hands are closed and link the hands of that layer to the original layer; that way I could save the layer and apply it to every mocap animation that needs to be readjusted. The only problem is that I'm not sure it's the best way, and I don't know the procedure.
I just installed the demo version of Maya 2012 (32-bit) to test the new editable motion trail, which is what I was waiting for, but I couldn't manipulate the in and out tangents for a specific key; they seem to be frozen and can't be moved.
If I have "dummies" 1-3 with positions animated, is there a controller I could add to the curved object to rotate and position it with respect to the dummies? I guess the dummies are acting like motion-capture points.
I am tracking points on a moving object; it doesn't deform, and it tracks nicely, using MatchMover. Back in Max, it is treated as a static object with the camera moving around it.
Is there any way to invert the animation between a camera and an object so the other one is the one that moves, but they retain the same relative animation? Did that make sense? Right now the object (represented by a group of 3D tracked points) is still while the camera moves; I want the opposite: the camera stays still, the tracked points move, and it looks the same through the camera.
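It did make sense. Per frame, the relationship is just object-in-camera-space: M_rel(t) = M_cam(t)^-1 * M_obj(t). Freeze the camera at the origin, give the baked M_rel to the point group, and the view through the lens is unchanged. For translation-only motion that inverse collapses to a subtraction; a toy sketch of the idea:

```python
# Toy sketch, translation-only case: the object's position in camera
# space is its world position minus the camera's. Baking this per frame
# and zeroing the camera reproduces the same relative motion.

def object_in_camera_space(cam_pos, obj_pos):
    return tuple(o - c for o, c in zip(obj_pos, cam_pos))
```

With rotation involved you need the full 4x4 matrix inverse per frame, but the principle is identical; most packages can bake world-space transforms so you can apply the inverted camera motion to the group.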
I have an animation of a wine glass breaking. I simulated it using MassFX and Particle Flow. I will import it into After Effects, add finishing touches, and render out a super-slow-mo clip like you would see on Discovery HD or Time Warp. In essence, I want it to look like it was shot on a high-speed camera.
I tried a test render out of Max at 960 fps and importing into AE at 25 fps, but the results aren't quite what I want.
My question is, what frame rate do I render the animation out inside 3ds max and what frame rate do I import/interpret it in After Effects?
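Not an authoritative answer on which rates to pick, but the underlying arithmetic is: slow-down factor = render fps / interpretation fps. A toy check (960 and 25 are just the test values from above):

```python
# Sketch: how many times slower than real time the footage appears when
# rendered at render_fps and interpreted at playback_fps in AE.

def slowdown(render_fps, playback_fps):
    return render_fps / playback_fps

factor = slowdown(960, 25)   # 1 s of simulation -> factor seconds on screen
```

So rendering at 960 fps and conforming to 25 fps stretches each simulated second to 38.4 seconds of footage; picking the rates is just choosing the factor you want.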
I'd like to scale my biped animation. I have a 200-frame mocap animation in which every frame has keys, and I want to scale it to 72 frames. I found out how to scale the keys, which works, but after scaling, the Bip01 bone jumps back and forth.
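The retiming itself is just multiplying every key time by 72/200 = 0.36. The jump is often a symptom of the scaled keys landing on sub-frame times, so snapping them to whole frames is one common workaround; a toy sketch of that idea:

```python
# Sketch: scale key times by new_len/old_len and optionally snap the
# results to whole frames (sub-frame keys after scaling are a frequent
# cause of jitter on dense mocap data).

def scale_keys(key_times, old_len, new_len, snap=True):
    s = new_len / old_len
    scaled = [t * s for t in key_times]
    return [round(t) for t in scaled] if snap else scaled
```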
I have rigged a character and made it into a character set in which I locked the scaling attribute, then saved the file. I then referenced this file, did a walk-cycle animation, and saved that file. I have since realised that I need the scaling attribute now that I've added other parts to my scene. Is there a way to unlock the scaling attribute in my walk-cycle animation file, or will I have to go back, change it in the rig file, create a new character set, and do the walk-cycle animation again?
My current scene has 10 animation layers for a character, and seeing every f-curve from all layers (although greyed out and locked) when I only need to see the f-curves in the layer I'm working on really clutters the Graph Editor. My current workaround is selecting my active layer in the Layer Editor every time I select a rig control. Simple enough, but a pretty inefficient way of working. I can't seem to find a setting that lets me see ONLY the f-curves in the active layer. I imagine some of you out there may have hundreds of animations for a single game character.
I'm trying to animate a moving rope based on simulation data. I know how to import my motion data for individual objects. I've seen lots of tutorials where one uses simulations or soft bodies with IK to create a rope, but I don't know how to use all those features, and I want the rope to follow my coordinates exactly. I was hoping I could do something like the following:
1. Create points in Maya which correspond to my data points.
2. Connect the points with a curve.
3. Loft a circle using the curve as a path in order to make a rope.
4. Move the points; due to History, the rope position is updated.
Is there a way to do something like this in Maya? I've tried, and it looks like EP points do not have keyable channel data, so I can't create animations for them separately. I'm using .mov data-file imports to generate X, Y, and Z motion keys for the points.
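A hedged Script Editor sketch of the four steps above (Maya only; the point data and names are placeholders). As noted, EPs aren't directly keyable, but a cluster per CV gives you a transform you can keyframe or drive with the imported .mov data:

```python
import maya.cmds as cmds

pts = [(0, 0, 0), (1, 2, 0), (2, 3, 1), (4, 3, 2)]   # stand-in simulation data

path = cmds.curve(ep=pts)                 # EP curve through the data points
profile = cmds.circle(radius=0.1, normal=(0, 1, 0))[0]
# Sweep the circle along the curve (tube extrude); with construction
# history on, deforming the curve updates the rope surface.
rope = cmds.extrude(profile, path, extrudeType=2,
                    fixedPath=True, useProfileNormal=True)

# One cluster per CV yields a keyable handle for each data point:
handles = [cmds.cluster('%s.cv[%d]' % (path, i))[1] for i in range(len(pts))]
```

Keying (or .mov-driving) each cluster handle moves its CV, the curve follows, and history pushes the change through the extrude, which matches the workflow sketched in steps 1-4.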