When building a rig, utility nodes like multiplyDivide or reverse are a handy way to set up the functionality of controls. But after you've set up an IKFK blend for what feels like the hundredth time, you start looking for a way to automate connecting up all those attributes.

If you go to the pymel docs and search for how to create these nodes, you probably won't find what you were hoping for. While the latest docs have information on the nodes themselves, there aren't any good examples like there are for most of the other pymel commands.

Since the docs aren't helping, let's go back to Maya and look at what gets output to the script editor when we make these nodes by hand. Open up the Hypershade or Node Editor and make some of the nodes we want. You should see something like this:
shadingNode -asUtility reverse;
shadingNode -asUtility multiplyDivide;
shadingNode -asUtility vectorProduct;
Looking at these, they are all made with the shadingNode command. If we go back to the pymel docs, we can find a matching command under rendering/shadingNode. The examples there are still pretty sparse, but essentially we use the command like this:
import pymel.core as pm
pm.shadingNode('[NODETYPE]', [NODECLASSIFICATION]=True)
For [NODETYPE] we enter the node we want as a string, in our case 'reverse' or 'multiplyDivide'. [NODECLASSIFICATION] is needed to tell Maya where to put the node in the Hypershade. We use asUtility=True since we are making utility nodes; if you are making a light or shader, use asLight or asShader respectively, per the docs. When we fill those in, our code looks like this:
utility = pm.shadingNode('reverse', asUtility=True)
The command returns a PyNode for the node it made, which we can store in a variable, in this case utility. With our node safely in a variable, we can use it however we want.
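For example, here's a quick sketch making both nodes for an IKFK blend and giving them readable names (shadingNode also takes a name flag; the names here are just my picks):

import pymel.core as pm

# make the utility nodes for an IKFK blend with readable names
ikfk_reverse = pm.shadingNode('reverse', asUtility=True, name='ikfk_reverse')
ikfk_mult = pm.shadingNode('multiplyDivide', asUtility=True, name='ikfk_mult')

print(ikfk_reverse)  # a PyNode prints as the node's name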

Now that we have our utility nodes, how do we hook them up?

To do that we have to look into pymel's attribute system, which I'll be getting into next time.
 
 
So I'm going back to SIGGRAPH this year as a student volunteer again, and this time I have more stuff to show while I'm there. Not only do I have my short film and the start of a TD reel, but a visualization I programmed is going to be displayed outside the Emerging Technologies and Art Gallery room. How cool is that!
hey look a screenshot
So all the booths in that room have QR codes next to them that people can scan with their phones, and I took that data and tried to figure out where people are moving. The screenshot is just running on random data since the conference hasn't started yet, but I imagine the real thing should look just like that, only more awesome, because it will be at SIGGRAPH!

It's written in Processing, which I hadn't used before, but it was pretty easy and fun since it's just a flavor of Java with more graphics going on. I had fun doing it, and it was a nice review since I got to use some programming concepts I haven't used since sophomore or junior year at ISU.
 
 
I wrote this tool so I wouldn't have to keep doing the same things over and over when making my blend shapes.
It's pretty easy to add new stuff: just make a new subclass, implement its 3 methods, and add a button to the right side for it. It's on my maya scripts bitbucket as blendshapetoolkit.py.
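Roughly, a new operation ends up looking like this (the base class and method names here are simplified for illustration, not the exact ones in the file):

import pymel.core as pm

class Operation(object):
    # base class every tool operation subclasses

    def label(self):
        # text to show on the button
        raise NotImplementedError

    def gather(self):
        # grab the current selection/settings the operation needs
        raise NotImplementedError

    def execute(self):
        # do the actual work on the shape
        raise NotImplementedError

class Mirror(Operation):
    def label(self):
        return 'Mirror'

    def gather(self):
        self.shape = pm.selected()[0]

    def execute(self):
        # duplicate the shape and flip it across X
        mirrored = pm.duplicate(self.shape)[0]
        mirrored.scaleX.set(-1)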
The things the tool does are:

Shrinkwrap: 
The first part I wrote, because I wanted my blend shapes to keep the cylindrical nature of the helmet. It has the option to affect only the verts I select.

Arrange:
In order to keep things organized, it moves the newly created shape to the next spot.

Materials:
Puts a new material on the object, with the option to only affect certain faces.

Mirror:
For symmetrical meshes, mirrors the shape node and moves it to -X.

Blendshape:
Adds the shape to the blendShape node of the target object.

Not shown in the video are the ability to rearrange the order of the list and to save and load it.
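For the Blendshape part specifically, the hookup is basically pymel's blendShape command in edit mode, something along these lines (all the node names here are just examples):

import pymel.core as pm

base = pm.PyNode('head_geo')               # the target object
blend_node = pm.PyNode('head_blendShape')  # its existing blendShape node
new_shape = pm.PyNode('head_smile')        # the sculpted shape to add

# find the next free target index and add the shape there at full weight
index = pm.blendShape(blend_node, query=True, weightCount=True)
pm.blendShape(blend_node, edit=True, target=(base, index, new_shape, 1.0))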
 
 

Not quite getting the range I was hoping for. I used a log scale to get the values into a reasonable range in Maya, and it kind of killed the peaks, but we'll see how it works when I get the audio it's going to be used on.
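For the curious, the scaling is just a log on the amplitude before the value gets set on the attribute in Maya, roughly like this (the +1 offset and the multiplier are just numbers I've been tweaking):

import math

def scale_amplitude(amp, mult=10.0):
    # squash the huge amplitude range with a log; the +1 keeps an
    # amplitude of 0 mapping to 0 instead of negative infinity
    return math.log(amp + 1.0) * mult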