The cage node allows us to track which polygons of an object are outside another object. In the provided example video the list of polygons is piped into a deleteComponent node for procedural deletion.

example video


Voronoi procedural 3D texture node.


Using the (geometry) retarget node we can relatively transfer the shape of one geometry object onto another. There are multiple built-in methods that solve many general and specific cases. The node can be used as a standard or "relative" wrap deformer, a uvBlendShaper, or a mixture of both.

In the example video the yellow shirt and blue pants on the walking character are a cloth sim. The green shirt and blue pants on the dancing model show basically the same cloth motion, relatively transferred to accommodate the different body motion.

The retarget node can also be used, for example, for transferring facial blendShapes between two heads with different topology and proportions while preserving the facial expressions and their specific details.
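The core of such a transfer can be pictured as per-point deltas: subtract the source rest shape from the source deformed shape and add the result on top of the target rest shape. Below is a deliberately minimal sketch of that idea; it assumes matching point order (which the real retarget node does not require), and `transfer_deltas` is a hypothetical helper, not part of SOuP:

```python
# Minimal "relative transfer" sketch: per-point deltas between a source rest
# shape and its deformed shape are added on top of a different target shape.
# Assumes matching point order between source and target.

def transfer_deltas(src_rest, src_deformed, tgt_rest):
    return [
        tuple(t + (d - r) for t, d, r in zip(tp, dp, rp))
        for tp, dp, rp in zip(tgt_rest, src_deformed, src_rest)
    ]

src_rest     = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
src_deformed = [(0.0, 1.0, 0.0), (1.0, 0.5, 0.0)]   # source pushed upward
tgt_rest     = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.0)]   # a differently placed body
print(transfer_deltas(src_rest, src_deformed, tgt_rest))
```

The target inherits the same upward motion at its own location, which is the essence of "relative" transfer; handling different topology is where the node's multiple built-in methods come in.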

example video


The smooth node smooths geometry using a Laplacian algorithm. There is an option for boundary preservation and two methods for volume preservation (fast and accurate).

The fast method is useful for objects with "simple" topology - notice in the third video that some verts in the eye corners and ears start to misbehave. The accurate method will take care of that, but at the price of additional calculations.
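For intuition, here is a minimal, self-contained sketch of Laplacian smoothing with boundary preservation faked by pinned points; volume preservation (the node's fast/accurate methods) is omitted, and `laplacian_smooth` is an illustrative helper, not SOuP's implementation:

```python
# Minimal Laplacian smoothing: each point moves toward the average of its
# neighbours. Boundary preservation is faked by pinning points; volume
# preservation is omitted.

def laplacian_smooth(points, neighbours, iterations=1, weight=0.5, pinned=()):
    pts = [list(p) for p in points]
    for _ in range(iterations):
        new = []
        for i, p in enumerate(pts):
            if i in pinned or not neighbours.get(i):
                new.append(p[:])                     # pinned or isolated point
                continue
            nbs = neighbours[i]
            avg = [sum(pts[n][a] for n in nbs) / len(nbs) for a in range(3)]
            new.append([p[a] + weight * (avg[a] - p[a]) for a in range(3)])
        pts = new
    return pts

# a polyline with a spike in the middle; the end points are pinned
points = [[0.0, 0.0, 0.0], [1.0, 2.0, 0.0], [2.0, 0.0, 0.0]]
neighbours = {0: [1], 1: [0, 2], 2: [1]}
print(laplacian_smooth(points, neighbours, pinned={0, 2}))
```

The spike relaxes toward its neighbours while the pinned boundary stays put - repeated iterations also shrink the overall shape, which is exactly why the node adds volume preservation on top.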

example video 1
example video 2
example video 3
example video 4


Maya PaintFX is a very powerful L-system but lacks the ability to instance custom geometry to its elements. To fill the gap we can use the pfxToArray node to extract the paintFX data for further modification and custom usage. In the first video we selectively read subsets of points. In the second video, we pipe the extracted data into a geometry instancer using arrayToDynArrays nodes.

example video 1
example video 2


The rayProject node projects point clouds (meshes, curves, surfaces and particles) onto mesh objects. There are multiple options for precise control over what gets projected, where and how. A subset of the effects that can be produced with this node is also known as shrink-wrapping.

In the first example a paintFX tree gets projected onto a mesh sphere.
In the second example - game-engine-like shadows - a set of polygons projected onto the ground surface resembles the shape of a moving character. An interesting twist is the part of the animation where the shadow polygons morph into the character and then go back into shadow mode.
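The underlying operation can be pictured as casting each point along a ray direction until it hits the target. With a ground plane at y = 0 the intersection math reduces to one line; a toy sketch (a plane instead of an arbitrary mesh, `project_to_ground` being a hypothetical name):

```python
# Toy rayProject: cast every point of a cloud along a ray direction and snap
# it to the ground plane y = 0. The real node projects onto arbitrary meshes;
# a plane keeps the intersection math to a single line.

def project_to_ground(points, direction):
    dx, dy, dz = direction                  # dy must be non-zero
    out = []
    for x, y, z in points:
        t = -y / dy                         # ray parameter where y reaches 0
        out.append((x + t * dx, 0.0, z + t * dz))
    return out

character = [(0.0, 2.0, 0.0), (1.0, 1.0, 1.0)]
# a slanted "light" direction produces an offset, game-engine-like shadow
print(project_to_ground(character, (1.0, -1.0, 0.0)))
```

Higher points land further along the ray direction, which is what makes the flattened silhouette read as a cast shadow.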

example video 1
example video 2
example video 3
example video 4


The peak deformer can do miracles if you need to make a blobby-looking nParticles mesh more liquid-like. On the left side we have a standard nParticles mesh; on the right side is the same geometry but with a peak deformer applied to it.

In case liquid sims are not your "forte" and you are still not clear on the role of the peak node in the example above - here is another (simpler) one for you where the shrinking effect is exaggerated - lower image. Again, a standard nParticles mesh on the left and peak+smooth on the right.
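Conceptually the peak deformer just pushes every point along its averaged normal by a weight; a negative weight shrinks the blobby surface inward. A minimal sketch of that offset (unit normals assumed, `peak` being an illustrative helper):

```python
# Peak-style offset: push every point along its averaged normal by a weight.
# A negative weight shrinks a blobby particle mesh toward its core.

def peak(points, normals, weight):
    return [tuple(p[a] + weight * n[a] for a in range(3))
            for p, n in zip(points, normals)]

points  = [(0.0, 1.0, 0.0), (1.0, 0.0, 0.0)]
normals = [(0.0, 1.0, 0.0), (1.0, 0.0, 0.0)]    # unit normals of a sphere
print(peak(points, normals, -0.2))               # shrink by 0.2 units
```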

example video
SPH sim by Ivan Turgeon - PFVE.


Morph is a fast, multithreaded and memory-efficient blend shape deformer that can handle thousands of targets with ease and without degrading performance. It can be used for pre-deformation blend shapes and post-deformation corrective shapes.

Key features:

  • Fast, multithreaded computation
  • Primary, inbetween and combination targets
  • Interactive per-frame caching for even better performance
  • Per-target world, dag_node and surface transformation spaces
  • Per-target inMesh connections
  • Paintable target weights
  • Comprehensive Python API
  • Streamlined GUI

Kwai Bun provided some excellent videos showcasing some of the main Morph features:

video tutorial - general workflow
video tutorial - procedural modifiers
video tutorial - performance

video tutorial - debug mode and schematic view (nodal graph)


Shot-modeling is an important part of every project that involves character animation. For complex character shows, it often becomes one of the most important pivots in production.
This GUI and the underlying API are designed to simplify and streamline the shot-modeling workflow from both artistic and pipeline standpoints.


A whole new approach to working with blend shapes. This is a comprehensive toolset that can handle huge amounts of blend shapes with ease.

Key features:
  • Loading/saving of targets and split maps to/from disk
  • Inverse targets
  • Auto-generation of "derivatives" for combinations
  • Targets presented in 2 sections - primaries and secondaries (inbetweens and combinations)
  • Flexible workflow with hotkeys and mouse actions via overloaded Maya widgets
  • Associate split maps at any time to any targets
  • Powerful tools for transferring of targets between objects with different shape and topology
  • Compile data and bake it to a Morph deformer for high-performance
  • Complete Python API
  • Plug-in API for seamless integration of custom tools into an existing pipeline
video tutorial


The words "delta mush" are popular these days. Here is how it is done the SOuP way - using smooth+morph nodes.

video tutorial


The scatter node can generate point clouds on the surface of mesh geometry or inside its volume. Here a scatter node creates points inside the volume of a deforming mesh geometry. This point cloud can be used in conjunction with a pointCloudFluidEmitter to emit fluid from the entire volume of the given geometry and not just from its surface.

Same for pointCloudField - we can affect the dynamic properties of objects using the entire volume of an object, not just its surface points. Don't forget that we can transfer point attributes from the mesh surface to the point cloud using the attributeTransfer node - things like point colors, point velocities, etc.
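For intuition, volume scattering can be sketched as rejection sampling against a containment test, optionally restricted to a shell near the surface (a range feature the scatter node also exposes). Here a unit sphere stands in for the mesh volume and `scatter_in_sphere` is a hypothetical helper:

```python
import random

# Volume scattering via rejection sampling. A real scatter node tests
# candidates against the mesh volume; a unit sphere stands in here. An
# optional range keeps only points within a given distance from the surface.

def scatter_in_sphere(n, surface_range=None, seed=0):
    rng = random.Random(seed)
    pts = []
    while len(pts) < n:
        p = tuple(rng.uniform(-1.0, 1.0) for _ in range(3))
        r = sum(c * c for c in p) ** 0.5
        if r > 1.0:
            continue                     # outside the volume: reject
        if surface_range is not None:
            lo, hi = surface_range
            if not (lo <= 1.0 - r <= hi):
                continue                 # outside the near-surface shell
        pts.append(p)
    return pts

volume = scatter_in_sphere(200)                          # fill the volume
shell = scatter_in_sphere(200, surface_range=(0.0, 0.2))  # thin shell only
print(len(volume), len(shell))
```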


We can block out areas by painting weight maps (vertex colors) on the source mesh geometry; this way we can control where the scattered points go. In this example the scatter node creates points on the surface of a deforming object. Notice how the point cloud forms only around the white areas.





Here we have a mesh cube with two faces deleted. The scatter node still figures out what the volume of the object is like and does the right thing. The scatter node also has a feature that allows us to generate points only within a specified range from the geometry surface.






In general, geometry is never prepared for fluid emission. Modelers model things based primarily on rigging, animation and lookdev needs. So we end up with too many, too few, or irregularly placed points.

In this example we have a box with 8 points only. If we decide to use the standard Maya fluid emitter, we have two options:
- emit from these 8 points - pretty useless
- emit from the entire surface

Here we can use fractal or bitmap textures to control the emission process, but they do not allow for localized control and do not react to other events in the scene. The solution is simple - we can use the scatter node to resample the geometry. The result is regularly placed points on the surface of the object, inside its volume, or both.

Then the point cloud can be piped directly into a pointCloudFluidEmitter node, or first go through an attributeTransfer node that can assign additional bits of data such as emission rate, density, fuel, temperature, color, etc., for more precise control over the emission process. Notice how the cube gets filled with points and the emission happens from the entire volume, not just the poly faces or vertices. Also, there is a local override of the color emission in the right corner.

scatter + projectors

Using texture-based distribution we can precisely shape the scattered points in many different ways, including "boolean" operations from multiple projection planes, textures and UV sets.
As you may already know, the scatter node can be used to directly drive particles, geometry instancers and procedural shatter nodes. With texture-based distribution we gain complete control over the scattered points and, in this way, over the systems mentioned above.
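The masking idea can be pictured as follows: project each scattered point into a uv space and keep it only where the texture evaluates to "white". A minimal sketch with a procedural checker and a trivial top-down XZ projection (both are illustrative assumptions):

```python
# Texture-driven distribution sketch: a scattered point survives only where
# a procedural checker returns "white". The projection is a trivial top-down
# XZ mapping into 0..1 uv space.

def checker(u, v, tiles=4):
    return (int(u * tiles) + int(v * tiles)) % 2 == 0

def mask_points(points, tex=checker):
    kept = []
    for x, y, z in points:
        u, v = (x + 1.0) / 2.0, (z + 1.0) / 2.0
        if tex(u, v):
            kept.append((x, y, z))
    return kept

grid = [(x / 10.0, 0.0, z / 10.0) for x in range(-9, 10) for z in range(-9, 10)]
print(len(grid), "->", len(mask_points(grid)))    # roughly half survive
```

Booleans fall out naturally from this picture: combine several such masks with and/or/not before deciding whether a point survives.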



Basic projection








Boolean projections








Boolean projections + texture masking (checker + grid in this case)








Source geometry uv based

The scatter node has an inPositionPP attribute that can be used to supply a custom point cloud to it, bypassing the internal generation of points. Many interesting effects can be achieved by supplying vertices, particles, voxel or pfx data to the scatter node for post-processing - for example, uniform filling of objects (as shown here).
Also, the scatter node provides distance-to-surface data for each point - notice how in the two provided examples the voxel colors turn yellow when deep inside the object and dark when close to the surface.




Data flow:
fluidAttributeToArray extracts voxel positions from the fluid container and passes them to the scatter node. The scatter node strips all points outside the mesh object. The remaining point positions and distance-to-surface data get passed to a pointCloudFluidEmitter node that uses them to emit fluid properties into the container.
In the provided examples the pointCloudFluidEmitter is in attribute transfer mode, which forces the container to resemble the shape of the input geometry.

With this technique we can easily achieve the best case scenario for fluid emission - always in the center of the voxels.

example video 1
example video 2

Scatter nodes can be used to drive particles in a procedural manner. This way you don't have to rely on dynamic simulation if you want to, for example, stick particles to geometry. You can freely scrub the timeline back and forth and things will just work.
Here a baked point cloud drives meshed nParticles to create the effect of mud sticking to the character. A peak deformer is used to offset the points of the generated mesh along their averaged normals to make it look more like liquid.

A computeVelocity node calculates the velocities of the baked point cloud, then an attributeTransfer node passes them to the mud geometry. If you render with motion blur turned on, you will see that even though the mud geometry is changing all the time, the motion vectors stay consistent.
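The velocity computation itself is a simple finite difference: the per-point position delta between consecutive frames divided by the frame time. A minimal sketch (assuming consistent point count and order in the input cloud; `compute_velocity` is an illustrative name):

```python
# ComputeVelocity-style finite difference: velocity is the per-point position
# delta between the previous and current frame divided by the frame time.

def compute_velocity(prev_points, cur_points, dt):
    return [tuple((c - p) / dt for c, p in zip(cp, pp))
            for pp, cp in zip(prev_points, cur_points)]

prev = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
cur  = [(0.0, 0.5, 0.0), (1.0, 0.0, 0.5)]
print(compute_velocity(prev, cur, dt=1.0 / 24.0))   # units per second at 24 fps
```

Because the velocities come from the stable source cloud, they can be transferred onto a rebuilt mesh by proximity, which is what keeps the motion vectors consistent.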

A basic example showing instancing of "sprites" to a scattered point cloud on the surface (left) and inside an object (right). An attributeTransfer node is used to properly orient the instances along the normals of the box vertices.

example video 1
example video 2


Using the shatter node we can shatter mesh geometry, be it static or deforming.
It relies on an input point cloud generated by a scatter node, particles or a nurbs curve.
Voronoi cells get calculated and the geometry is cut on their boundaries.
In the example scenes (as in the shown videos) the shatter nodes are in "auto evaluate" mode, but generally you will be using the "bake result" button located inside the shatter node's AE. This way we get the shattered geometry only when needed. The shatter node can generate solid or surface shards.
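The heart of the voronoi step can be sketched as nearest-seed labeling: each sample belongs to the cell of its closest seed point, and the geometry is then cut along the boundaries between differently labeled samples. The labeling part in minimal form (`voronoi_cells` is a hypothetical helper):

```python
# Nearest-seed labeling, the heart of a voronoi shatter: every sample belongs
# to the cell of its closest seed; geometry is then cut along the boundaries
# between differently labeled samples.

def voronoi_cells(samples, seeds):
    def d2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return [min(range(len(seeds)), key=lambda i: d2(p, seeds[i]))
            for p in samples]

seeds   = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
samples = [(0.2, 0.0, 0.0), (1.9, 0.1, 0.0), (1.1, 0.0, 0.0)]
print(voronoi_cells(samples, seeds))
```

This also explains why denser seed clouds give finer shards: more seeds means smaller cells.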

example video 1


Here I first animate the number of points within the point cloud - the output reacts accordingly on the fly. Then I animate the distance between the different shards.
Finally I increase the resolution of the sphere.

Additional nodes can be used to further refine the shape and distribution of the scattered point cloud. This way we can precisely place or remove points.
In this example attributeTransfer and boundingObject nodes influence the positions of the scattered points. The red points are the original point cloud, the blue ones are the post-modified positions. Notice how the shards react to that - the closer the points, the finer the shards. Only one bounding object is used here, but you can use more if needed.

example video 2


Shattering of deforming geometry. Notice how the shards stick to their relative positions. The trick here is to pre-cache the input point cloud coming from the scatter node. Inside the scatter node's AE there is a button that allows you to bake the point cloud to a nurbs curve. Then you can deform that curve with the geometry and feed it into the shatter node.

The nMaxCutPP attribute drastically improves performance by limiting the lookups needed to create a new shard to the closest n points. The denser the input pointCloud, the bigger the performance improvement.
Lowering the value of this attribute too much may lead to artefacts such as overlapping shards. In this particular example, setting nMaxCutPP to 30 resulted in 2.5x shorter time needed for shattering the entire geometry.

example video 3

Shattered geometry in action.

example video 4






Again we use a combination of baked shatter objects from SOuP: convert them to an nCloth mesh using default settings and then transform-constrain all the vertices so the nCloth remains static in space. Now we can feed the mesh through an attributeTransfer node with 2 bounding objects - set one to envelope the whole nCloth with a weight of 1, and the second one to a weight of minus 1. This one will be used to break the per-vertex constraint. Now we connect outWeightPP to the per-vertex glue/strength on the nConstraint.





This is a more involved example - here we have local shattering of geometry that grows over time. We split the data flow into two separate streams and combine them at the end.
The first stream is used to remove all ground faces that do not interact with the dancing character.
The second stream is used to generate a scatter mask (point colors) so we get points only where the character touches the ground. Notice that here we use the original ground geo, not the one from the first data stream, where we remove faces at each evaluation step. This way we ensure static shards. Then we plug the scattered point cloud and the remaining faces into a shatter node to get the desired result.

example video 5


ComputeVelocity calculates the point velocities of the dancing character and stores them in an array. ArrayToPointColor converts this array to point colors. AttributeTransfer transfers the colors from the character to the ground plane (hidden here) based on proximity between their points. PointCloudFluidEmitter emits fluid properties only in the area where the character contacts the ground, and the fluid is colored accordingly.

example video 1



ComputeVelocity calculates the velocity vectors of each point of the geometry before it gets modified (the original moving teapot). An attributeTransfer node transfers these values to the final geometry, so even though the point count changes over time, we still get consistent motion vectors. Using the remapArray node we post-modify the velocity data. The velocity vector array gets converted to a set of pointColors by the arrayToPointColors node; in this particular case the colorSet is named "velocity". Finally, the modified teapot mesh's "motionVectorColorSet" attribute points to that "velocity" colorSet and passes it directly to the render.

example video 2


How to render changing point count geometry with proper motion blur? Easy.
In this example we have particles falling over a moving teapot. A boundingObject passes the particle positions and radii to a group node. The group node collects the face ids around the contact points where particles collide with the teapot surface.
This componentsList gets passed to polySmoothMesh and deleteComponent nodes.
The polySmoothMesh subdivides the faces to get more resolution, so when the deleteComponent node does its thing we get round holes.

example video 1


How to render changing point count geometry with proper motion blur? Part 2.
This is a more involved version of the example above. In addition to everything from the teapot setup, here we have group nodes that collect the boundary faces of the tearing surfaces. They pass the inverted componentsLists to deleteComponent nodes that are plugged into separate meshShapes - so we always get the boundary faces no matter what is happening to the upstream geometry. Then we emit particles from these faces. This way we get blood particles only where and when the geometry gets torn. Using this simple approach we can eliminate a lot of the tedious work by hand needed to ensure proper particle emission from the right place at the right time.

example video 2


Notice in the rendered video how even though the point count and order change, we still get everything properly motion blurred. I used only one collision sample here, that's why some pieces get stuck inside the knives, and the blood could look a lot better. Good enough for a fast'n'dirty example.

example video 3

interactive caching system

The interactive caching system (ICS) is designed to improve the viewport performance of deforming geometry with consistent point count over time.

Once applied to deforming objects, it automatically begins to operate - tracking input conditions and internally caching geometry data for each frame we step on, without any further intervention by the user. The result is a fluid workflow.
If the input conditions don't change when we later step on the same frames, dependency graph evaluation is bypassed and the internally cached data is used instead.

The system is ideal for complex rigs, heavy geometry and slow-to-evaluate nodal networks, because it does not cache to disk (slow) but uses the system memory instead.

example video

  • In the first part the raw rig performance is shown
  • ICS gets applied to the rigged geometry and hooked to the rig controls
  • After the first pass through the frames the performance improvement is over 7x
  • One of the controls gets an animation change; the frames affected by the modified animation curves fall back on raw DG evaluation, but the second time we step on them things are fast again


A boundingObject reads the particles' positionPP, rgbPP and radiusPP and feeds group and attributeTransfer nodes with them. The group node has an option to store componentsList and objectGroup data for previous and current states (by default it considers only the current state). This data gets passed to a deleteComponent node that deletes faces from the leaves geometry. An attributeTransfer node slightly attracts the leaves around each particle and recolors them (in red - all particles in this example are red). As a result we get an "acid rain" effect.

Mind, there is no transparency hack or anything like that. It is all procedural geometry manipulation.

example video 1

Procedurally delete geometry. A group node collects the face ids inside the bounding object and passes them to a deleteComponents node.

example video 2






A boundingObject in pointCloud mode reads the particle positionPP and rgbPP attributes. An attributeTransfer node transfers them to the ground surface. The alpha channel is modulated by the "alpha" ramp attribute located on the boundingObject node - that's how we get multiple circles around each particle. Transferring point positions produces the "swimming" effect - each particle attracts the ground points around itself.

example video 3


A point node randomizes grid points in the XZ plane and assigns random colors to them. An attributeTransfer node transfers the colors to another plane. The result is a Voronoi noise.
Here we "project" it on a flat plane, but it can be used for things like fracturing objects with complex topology.

example video 1




You can achieve the same result by simply spraying particles around.

example video 2


The bound node creates a sparse voxel grid around static or deforming geometry with consistent or changing point count and order - a walking character in this case.
The blue wireframe is actually a mesh shape with "display shading" turned off.

Bound nodes can be used for effortless "down-rezing" of complex objects for simulation purposes. Example video 1 shows an out-of-the-box simulation of the proxy geometry (1300 points) generated from the original tree (31000 points). Example video 2 shows a simulation of the original geometry. Notice the frame rates.

example video 1
example video 2
example video 3


A polyCylinder is deformed by wave deformers and its position is animated. A pointAttributeToArray node passes the point positions and tangents to a pointCloudField node. The tangent vectors are interpreted as velocities and are applied to the particles. A second pointCloudField node attracts the particles around each mesh vertex so they do not escape when pushed by the first pointCloudField.
Using pointCloudFields, any geometry or custom arrays can control dynamic objects in ways that are hard to achieve otherwise.

example video


This node measures how much the geometry stretches or contracts. There are multiple color coding methods. In this case red is compression, green is neutral, blue is stretching. You can use these color maps to control wrinkle, muscle, veins and whatever other maps you may need for your characters or other things.

There are two modes - distance based (shown here) and in-between-angle based. The first method measures distances between points (edge lengths); the second one measures angles between edges - this is useful when we have deformations without stretching/contraction - for example bending of a skinny elbow - points get closer, but their edges keep the same length.
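The distance-based mode can be pictured as comparing each edge's current length with its rest length and averaging the ratios per point: values below 1 mean compression, above 1 stretching, and a color ramp then maps that to red/green/blue. A minimal sketch (`tension` is an illustrative helper, not the node's actual implementation):

```python
# Distance-based tension: compare each edge's current length with its rest
# length and average the ratios per point. Below 1.0 = compression, above
# 1.0 = stretching; a color ramp then maps that to red / green / blue.

def edge_len(pts, e):
    a, b = pts[e[0]], pts[e[1]]
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def tension(rest_pts, cur_pts, edges):
    ratios = [[] for _ in rest_pts]
    for e in edges:
        r = edge_len(cur_pts, e) / edge_len(rest_pts, e)
        ratios[e[0]].append(r)
        ratios[e[1]].append(r)
    return [sum(r) / len(r) if r else 1.0 for r in ratios]

rest = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
cur  = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(tension(rest, cur, edges=[(0, 1), (1, 2)]))
# first edge stretched to 1.5x, second compressed to 0.5x
```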

example video


Manage your python and mel scripts the easy way - execute, source/import, edit, tweak attributes in the UI, argument presets, etc.


This emitter node brings lots of flexibility to the table. Extract point clouds from meshes, curves, surfaces, particles, paintFX, fluids, etc., supply them to the input of the node, and fluid attributes will be emitted according to the data on the input.

In the provided example videos: particles move through a fluidContainer; the pointCloudFluidEmitter reads their positions and emits fluid properties into the voxel grid.
In this example only the particle positions are used, but in addition you can feed the pointCloudFluidEmitter with per-point radius, density, heat, fuel and color (optionally - from specified colorSet). The node can use pointCloud (arrays), swept geometry or regular mesh, surface, curve or particles as input. As mentioned - in this example we keep things simple - just positions.

example video 1

To make things better we slap on a pointCloudField node that uses the particle positions, radii and velocities to push the fluid in the desired direction. As a result the fluid looks a lot more like a flamethrower. On a similar note - the pointCloudField can use pointClouds (arrays), swept geometry, meshes, surfaces, curves or particles as input.

example video 2



Using this node and a bit of creative thinking we can apply deformers to fluids much like any other geometry in Maya.

example video 3


Texture-based fluid emission is good, but often we need more precise control over what we emit and where. In this example a textureToArray node converts an animated ramp texture to point colors. An attributeTransfer node uses a boundingObject to override the texture colors in a specific area of the surface. In this case we do it for colors, but it can be anything else - density, fuel, etc. The pointCloudFluidEmitter picks up the final colors and emits them into the voxel grid.

Using similar techniques we can build very precise and flexible fluid emission systems.
For example, we can emit fluids based on the tensionMap values from the example above, or if you look at the example below, we can procedurally apply multiple textures based on proximity to (complex) geometry or pointCloud and then emit fluids based on that. Throw some extra boundingObjects in the mix to override/block/edit things and you get some pretty interesting stuff going on.

On a similar note:
Take a look at the fluidAttributeToArray example - there one fluidContainer is used to emit properties into another fluidContainer.

example video


The textureToArray node converts an animated ramp texture to point attributes (in this case - per-point weight); this weight data is passed to a peak deformer. The peak deformer offsets points along their averaged normals. This effect can be used for many things - static or animated wrinkles, liquidish-looking deformations, bulging flesh, etc.

In the first example we have a local override of the weight map calculated by the textureToArray node, so we don't get bulging for the points that are inside the boundingObject.

example video 1

TextureToArray feeds a peak deformer with pixel values from a noise procedural texture. A second peak deformer makes the blobby "mushroom" effect. AttributeTransfer adds point colors.

example video 2


Using the trajectory system you can non-destructively manipulate animation paths directly in the viewport. "Non-destructive" means that you can work simultaneously in the graph editor and with the trajectory's manipulators in the viewport, and the animation curves will always stay intact. If you change something in the graph editor, the trajectory will automatically update in the viewport, and the opposite - if you edit the path in the viewport, the animation curves in the graph editor will update accordingly.

The displayAttributes node allows you to display attribute values in the viewport.
This is very handy when you want to debug things during playback or when you simply want to display things around.

example video


We can cook complex objects (meshes, curves, surfaces, etc.) at different times - effectively offsetting them in time. In this example we have a running character; inserted between the skinCluster and the visible geometry is a timeOffset node which has its offset value animated.

example video


The fire component of the simulation exists in the small fluidContainer only. A fluidAttributeToArray node extracts the voxel properties from there (in this case position + density only) and passes them to a pointCloudFluidEmitter emitting smoke into the big fluidContainer.
Using this technique we can split the main elements of a fluid simulation (in this case - fire and smoke) between different containers for more precise and independent control over simulation and shading.

example video 1


Always wanted to be able to "voxelize" geometry and render it that way?
ComputeVelocity calculates the point velocities of the dancing character and passes them to a pointCloudFluidEmitter node in attributeTransfer mode. At each step the pointCloudFluidEmitter will empty the fluidContainer before emitting fluid properties, effectively transferring attributes from the input pointCloud or geometry to the fluid.

example video 2





Basic stuff. Painted point colors get emitted into the fluidContainer by the object.

example video 3


MultiAttributeTransfer feeds a cluster deformer with point weights based on proximity between the character and the ground geo. The closer they are, the stronger the weight.
Point radius, falloff ramps and other attributes can be controlled globally for the entire set of points, or through weight maps for localized control.
The cluster handle is translated on -Y - that's how we get the ground deformations. You can use the peak node to offset points in the same manner for complex geometry
(it will do it based on point normals instead of globally for the entire object like the cluster).

example video 1


Similar to the example above, but in this case we "remember" the contact areas between the character and the ground plane. Mind, this is not the regular soft body trick - it is all procedural - no dynamic simulation involved.

example video 2






MultiAttributeTransfer allows for localized control over deformer weight maps.
In this particular case we have 4 blendShape targets applied to a head geometry.
Each boundingObject is connected to a multiAttributeTransfer node that controls the point weights of one of the four targets. The same result can be achieved by using attributeTransfer and arrayToMulti nodes - that's why there are two example scenes supplied.

example video 3

Notice that blendShape weightMap attributes (much like the skinCluster's) do not react to "dirty" flags. That's why there is a point node at the very end of the chain that has 4 getAttr lines in its pre-loop section to force-refresh the blendShapes.

Credit goes to the guys at Cinemotion for providing the head geometry and blendShape targets for this example.

Component texture attachments in Maya are based on object groups. Using the group node we can interactively control that otherwise implicit system.
In this example we have a ramp texture with a cranked-up noise attribute assigned to a couple of objects. There is a character walking around them that has a different texture assigned. Based on proximity we "transfer" the ramp texture from the different objects to the character geometry.

example video


A basic "Summer and Autumn leaves" example where an attributeTransfer node transfers point colors from boundingObjects to the leaves geometry.

example video

As you may know, it is very difficult to query scene data from within particle expressions - basically nobody is doing this because the performance hit is huge. There are no out-of-the-box tools that bridge particles with the rest of the scene other than colliders and force fields. Using SOuP nodes you can easily do any of that.

In this example particle positions and velocities get altered by the point normals from another object in the scene. As a result the instanced geometry orients along the vertex normals. Using this approach we can make interesting effects by making particles play nice with the objects surrounding them.

example video

A combination of point, peak, arrayDataContainer and computeVelocity nodes can produce interesting motion-based deformations.
This example shows the very basics of the idea, which can be easily extended to achieve much more complex and refined results.

example video


Using arrayDataContainer nodes we can create interesting effects like wetmaps or accumulated damage. The generated data can be used to drive blendShapes (example above) and texture maps.
An attributeTransfer node transfers per-point weights from the fighter geo to the static guy.
This data gets passed to an arrayDataContainer node and then to an arrayToPointColor one. A mentalRay vertexColors texture pipes it into a shading network where it is used for blending between two textures.

example video 1


The arrayDataContainer node has an attribute called "sink". At each evaluation step it sinks a little bit of the data stored in the node, creating a "wetmap" effect.
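The accumulate-and-sink behavior can be sketched in a few lines: each step keeps the larger of the stored and incoming weights, then decays the stored values a little so wet areas slowly dry out (the `sink` fraction and `max` blending are illustrative assumptions, not the node's exact math):

```python
# Wetmap sketch: a per-point buffer persists across frames, keeps the larger
# of stored and incoming weights, and "sinks" (decays) a bit each step so
# wet areas slowly dry out.

def step_wetmap(stored, incoming, sink=0.1):
    return [max(s * (1.0 - sink), i) for s, i in zip(stored, incoming)]

wet = [0.0, 0.0, 0.0]
wet = step_wetmap(wet, [1.0, 0.0, 0.0])   # a drop hits point 0
wet = step_wetmap(wet, [0.0, 1.0, 0.0])   # next frame, a drop hits point 1
print(wet)                                 # point 0 has started drying
```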

example video 2






PositionPP, radiusPP, rgbPP and weightPP get transferred from the particles to the water surface. As a result, ripples form around every particle that hits the water. A peak deformer displaces the ripple points along Y. Also, a pointCloudFluidEmitter emits fluid properties from the white areas of the ripples.

example video 3






Here particles transfer weight over to the nCloth meshes via the arrayDataContainer, which maintains the values over time, allowing fluid emission. The pointCloudFluidEmitter gets its positionPP from a pointAttributeToArray node and the inDensityPP comes from the arrayDataContainer.

example video 4





Very similar to the method above, except the reverse is happening here with the weight transference. The emitting mesh already has a weight of one, but as particles land on its surface the contact points turn black, which prevents fluid emission - hence we can "put out the fire", so to speak.

example video 5

With the ability to invert our weight transference, we can now pipe a boundingObject's weight value through the nComponent node of a dynamic constraint and use it to control a weld's weight-per-vertex attribute. So we could zip and unzip things, or cause breakages in constraints using particles, for example.

example video


Maya provides a simple way to "emit" geometry using particles, but there is a nasty cycling that happens to the geometry when the particles start dying. Also, there is no way to propagate per-particle attributes to the geometry points.

Here is how we create this effect the right way:
PointAttributeToArray nodes extract particle positions and map them to the idIndex arrays. PointCloudToCurve nodes take this data and create nurbsCurves. A loft node creates a polygonal surface, and attributeTransfer maps the particle colors onto the polySurface (optionally we can also transfer velocity for proper motion blur). A ramp controls the opacity of the surface along its length. Render :)
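The key trick is keying each trail by particle id, so a dying particle never shifts the CVs of the others. A tiny sketch of that bookkeeping, with hypothetical names (this is not SOuP's API, just the idea):

```python
# Illustrative sketch of the trail idea behind the pointAttributeToArray ->
# pointCloudToCurve -> loft network: keep a per-particle position history
# keyed by particle id, so dying particles never cause CVs to "cycle".
# All names here are hypothetical.

def update_trails(trails, ids, positions, max_len=5):
    """Append this frame's positions to each particle's history."""
    for pid, pos in zip(ids, positions):
        trails.setdefault(pid, []).append(pos)
        del trails[pid][:-max_len]  # keep only the newest CVs per curve
    return trails

trails = {}
update_trails(trails, [0, 1], [(0, 0, 0), (1, 0, 0)])
update_trails(trails, [0, 1], [(0, 1, 0), (1, 1, 0)])
update_trails(trails, [1], [(1, 2, 0)])  # particle 0 died; its trail stays intact
```

Each history would then become one nurbsCurve, and lofting across the curves produces the trail surface.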

example video


Auto-generate multiple nurbs curves from a provided point cloud, with live garbage collection, etc.


This is a more complex example showing procedural instanced feathers. A scatter node generates points randomly placed on the character's geometry. AttributeTransfer nodes adjust their normals, which are later used to orient the instanced feathers.
The data gets collected and passed to the instancer node by arrayToDynArrays nodes.
As a result we get a guy fully covered with feathers.

There are actually two of these systems in the scene - one for the body and another for the scalp (the big feathers). Notice how, unlike instancing to particles, you can scrub the timeline back and forth and things just work.

Using a similar approach we can easily create things like objects built from lego bricks. Finally, don't forget that the instancer node has built-in LOD, where we can display the full-res geo, bounding boxes only, or nothing. Very useful when things start getting heavy.

example video 1

Using arrayToDynArrays nodes we can build kDynArrayAttrsData structures to control the geometry instancer nodes in a procedural manner without the need to go through particles and expressions.

In the next example a fluidToArray node extracts the fluid properties and passes them to a few arrayToDynArrays nodes that feed an instancer node. As a result we instance geometry to fluid voxels. We can map voxel properties to instances in many different ways. In this case I tried to keep things simple:
voxel density - instance scale
voxel velocity - instance aimDirection
If a voxel is empty (density = 0) the related instance gets hidden for better performance.
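The mapping above is simple enough to express as a small table-building function. A plain-Python sketch, with attribute names mirroring the text rather than SOuP's actual plugs:

```python
# Minimal sketch of the voxel-to-instance mapping described above
# (density -> scale, velocity -> aim direction, hide empty voxels).
# Illustrative stand-in; the dict keys echo the text, not the node's API.

def voxels_to_instances(densities, velocities):
    instances = []
    for d, v in zip(densities, velocities):
        instances.append({
            "scale": (d, d, d),     # voxel density drives instance scale
            "aimDirection": v,      # voxel velocity drives aim
            "visibility": d > 0.0,  # empty voxel -> hidden instance
        })
    return instances

inst = voxels_to_instances([0.0, 0.5, 1.0],
                           [(0, 0, 0), (0, 1, 0), (1, 0, 0)])
```

In the real network the arrayToDynArrays nodes package exactly this kind of per-instance array data for the instancer.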

example video 2

This time using a 3D fluidContainer and instancing multiple objects. If you check the example scene, pay attention to how the multiple instances get randomized using a fractal texture and a textureToArray node.

example video 3

Not sure how to name this effect, but for now it goes by the name of sparse convex wrap.

example video 1
example video 2


The rayProject node can be very useful for creating permanent collision deformations. In addition, we use a point node to apply vertex colors based on the amount of deformation.

example video


Instancing made easy! SOuP provides powerful tools for geometry instancing, but the workflow is demanding. InstanceManager wraps it all in a simple GUI and a straightforward workflow.

example video

video tutorial


Python scripting as an integral part of the dependency graph.
Programmatically and/or procedurally manipulate transform objects, much like how SOuP operates at the geometry level.

example video 1
example video 2
example video 3


There is a simple way to turn any particle shape into a point cloud container that reacts to input events. Here one particle shape influences the size of another.
All examples by Sergey Tsyptsyn

example video 1





Transfer point colors from geometry to particles.
Remember how hard it was to make particles react to the surface properties of the geometry surrounding them? Well, not anymore.

example video 2






An attributeTransfer node influences the radius of particles passing through a boundingObject.

example video 3







PfxToon color transfer to particles.

example video 4







Nucleus lacks one very useful feature we enjoyed in the old rigid body solver - collision detection. SOuP brings it back online.

example video 5






Another example of procedural control over particles from external events - notice how the particle colors always match the animated texture of the surface underneath.

example video 6

audioToArray (maya audio node)

Drive transform nodes or procedural networks in Maya with data from audio files (wav, aiff, aifc, snd, stk). Particles, fluids, geometry generation or deformation, etc. can all benefit from this versatile node.
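Conceptually the node exposes per-band amplitudes for the current chunk of audio. A naive stdlib-only DFT sketch of that idea (the function and its output shape are illustrative, not what audioToArray actually computes internally):

```python
# Hedged sketch of what audioToArray conceptually provides: per-band
# amplitudes for a chunk of samples. Naive DFT for clarity, not speed.
import math

def band_amplitudes(samples, num_bands):
    """Split the DFT magnitudes of `samples` into num_bands averages."""
    n = len(samples)
    mags = []
    for k in range(n // 2):  # first half of the spectrum is enough for real input
        re = sum(s * math.cos(-2 * math.pi * k * i / n) for i, s in enumerate(samples))
        im = sum(s * math.sin(-2 * math.pi * k * i / n) for i, s in enumerate(samples))
        mags.append(math.hypot(re, im) / n)
    per_band = max(1, len(mags) // num_bands)
    return [sum(mags[b * per_band:(b + 1) * per_band]) / per_band
            for b in range(num_bands)]

# A pure 4-cycle sine in 16 samples concentrates its energy in one band.
tone = [math.sin(2 * math.pi * 4 * i / 16) for i in range(16)]
bands = band_amplitudes(tone, 4)
```

The resulting band values are what would then drive curve CVs, transforms, deformers and so on in the examples below.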

Interpolate a nurbs curve through the audio bands, feed the result into a peak deformer that offsets the points of the curve, then revolve a nurbs surface to visualize it in 3D.

example video 1



Represent the audio bands with scaling transforms.

example video 2






Deform a polygonal sphere with audio data. Colorize vertices according to amplitude.

example video 3

mapToMesh & meshToMap nodes

Using these two nodes we gain ultimate control over the UV points. Convert the UVs to a mesh, apply deformers, animate, reposition vertices by hand or using other procedural approaches, then convert the final result back to UV points.


ArrayToTexture2D converts array data on the fly to a standard 2D texture that can be plugged directly into any shading network. This gives us the ability to drive shaders interactively and make them react to events happening at the geometry level.
In the example images an attributeTransfer node generates a point weight map using a boundingObject. The arrayToTexture2D node converts that to texture data feeding a displacement node.
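The array-to-texture step boils down to splatting per-point values into pixels via the points' UVs. A hypothetical, stdlib-only stand-in for that conversion:

```python
# Sketch of the arrayToTexture2D idea: splat per-point weights into a
# small 2D pixel grid using the points' UV coordinates. Illustrative
# stand-in for the node, not its implementation.

def weights_to_texture(uvs, weights, size=4):
    """Return a size x size grayscale grid (rows of floats)."""
    tex = [[0.0] * size for _ in range(size)]
    for (u, v), w in zip(uvs, weights):
        x = min(int(u * size), size - 1)  # uv in [0,1] -> pixel index
        y = min(int(v * size), size - 1)
        tex[y][x] = max(tex[y][x], w)     # keep the strongest weight per pixel
    return tex

tex = weights_to_texture([(0.1, 0.1), (0.9, 0.9)], [0.5, 1.0])
```

The real node handles filtering and resolution properly; the point is simply that geometry-level weights become shader-level texels.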


A particle emitter node that uses point cloud data as the source for emission. This approach provides all the freedom, flexibility and precision one may need. Much like with the pointCloudFluidEmitter node, we supply point cloud data (extracted from meshes, curves, surfaces, particles, fluids, paintFX, etc.) on the input and emit particles according to it. The emitter node simplifies the often tedious job of managing PP attributes using standard methods like expressions and/or ramps. It can directly propagate rate, position, velocity, mass, lifespan, radius, rotation, color, opacity, 5 user scalar and 5 user vector attributes - all inherited from the supplied input data.

Control particle emission using a bounding object. Vertex normals are used as velocities. Particles inherit the vertex colors.

example video 1


Particle emission from mesh surface with color inheritance.

example video 2





Emit particles from a fluid voxel grid. Particles inherit voxel velocities. Voxel densities control the emission rate. Densities are also piped to the particles as a userScalar1PP attribute that controls a ramp attached to the rgbPP attribute - that way we recolor particles according to density.
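Both mappings in that setup are simple remaps. An illustrative sketch (the function names and the two-color ramp are assumptions for the example, not the node's attributes):

```python
# Sketch of the two mappings described above: voxel density scales the
# emission rate, and a simple two-color ramp remaps the density
# (userScalar1PP stand-in) to a particle color. Illustrative only.

def emission_count(density, base_rate=10):
    return int(round(base_rate * density))  # denser voxel -> more particles

def ramp_color(t, low=(0.0, 0.0, 1.0), high=(1.0, 0.0, 0.0)):
    """Linear blue-to-red ramp over t in [0, 1], like rgbPP driven by a ramp."""
    t = min(max(t, 0.0), 1.0)
    return tuple(a + (b - a) * t for a, b in zip(low, high))

counts = [emission_count(d) for d in (0.0, 0.4, 1.0)]  # empty voxel emits nothing
mid = ramp_color(0.5)                                  # halfway between blue and red
```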

example video 3





Making force shield effects is much simpler now.

example video 4


Easily create and manage large numbers of animation sliders and selectors.
These objects can be used to control any element or group of elements in your scene. Facial animators often use similar tools to streamline their workflow. This tool adds lots of additional features and flexibility. For more information read the help tab.

Included is a "takes" system that allows making of "snapshots" for all/selected sliders and applying back a percentage of the takes to the sliders.


Maya's version of zSpheres.
How it works - select a joint and run the "bmesh" command from the SOuP shelf. All joints under the "root" one will be used to form a continuous mesh surface. The mesh generator is fully interactive - manipulating or deleting existing joints or adding new ones instantly updates the mesh. The "volume" of each joint is controlled by its radius attribute.
Big credit goes to Michael Tuttle for sharing his working Maya version based on Justin Ardini's open source project.
The modified open source code is included in the SOuP archive.

example video

An example combining group sets and delete nodes together with particle and fluid emission to create interesting effects.
Examples provided by Jeremy Raven.

example video 1





Point cloud FX emission combined with a peak deformer and Maya's standard cluster, ideal for hero wave FX, can be achieved in a fraction of the time it takes to run a Bifrost simulation.

example video 2





Similar to the above method, except now SOuP is used to imitate mesh collisions.
In addition, the pointCloudField uses a subset of mesh points to drive the FX emissions along the vertex velocities of the wave mesh. The particles then have the luxury of resting on the ocean surface when not in range of the pointCloudField.

example video 3


A complex node for dealing with mesh shells. It can extract per-shell data - points, normals, colors, weights, radii, component ids, bounding box.
In addition it can control mesh shells with a point cloud supplied on the input, by remapping each point's attributes to the corresponding mesh shell.

example video


Draw connections between points based on proximity. The node provides a lot of control over how the connection lines are drawn - input point cloud attributes can be mapped to the lines: color, transparency, thickness on both sides and along the lines, etc.
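The core of proximity linking is just "connect every pair of points closer than a threshold". A brute-force sketch of that idea (the node itself is far more capable; this is only the concept):

```python
# Minimal sketch of proximity-based linking: connect every pair of
# points closer than a threshold. O(n^2) brute force for clarity; the
# real node adds per-line color, transparency, thickness controls, etc.
import math

def proximity_links(points, max_dist):
    links = []
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= max_dist:
                links.append((i, j))  # one line segment per close pair
    return links

pts = [(0, 0, 0), (1, 0, 0), (5, 0, 0)]
links = proximity_links(pts, 1.5)  # only the first two points are close enough
```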

example video 1




Precise control over thickness and color along the length of the lines.

example video 2






Offset ramps allow displacing the links along their length independently in XYZ. This way we can create spiderwebs and other interesting shapes like the one shown in the image on the left.

video tutorial


Blend between different targets based on surface tension.

example video


Most high-quality head rigs need a system that mimics sticky lips. The solutions usually end up as cluttered, messy networks of nodes, constraints, expressions, deformers, etc.
This stickyLips system provides an alternative that consists of just one generic node (stickyCurves) that takes care of and hides all the involved complexity. As a result, viewport performance gets a boost and technical artists have one less thing to worry about.
A simple API is provided to allow for easy integration into any scripted rig system.
Included is a GUI that streamlines the interactive workflow.

example video


Similar to the tensionMap node, but here we can provide an explicit list of point pairs to calculate tension from. Each tension value can be remapped by a corresponding ramp. Additional options are included for limiting, absolute values, etc.
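Per-pair tension is essentially stretch relative to a rest length. An illustrative sketch under that assumption (names and the clamp option are hypothetical, not the node's attributes):

```python
# Sketch of per-pair tension: stretch of each point pair relative to
# its rest length, optionally clamped to [-1, 1]. Illustrative only.
import math

def pair_tension(rest_pts, cur_pts, pairs, clamp=True):
    out = []
    for a, b in pairs:
        rest = math.dist(rest_pts[a], rest_pts[b])
        cur = math.dist(cur_pts[a], cur_pts[b])
        t = (cur - rest) / rest  # > 0 stretched, < 0 compressed
        if clamp:
            t = min(max(t, -1.0), 1.0)  # the "limiting" option from the text
        out.append(t)
    return out

rest = [(0, 0, 0), (1, 0, 0)]
cur = [(0, 0, 0), (2, 0, 0)]  # the pair doubled in length
tension = pair_tension(rest, cur, [(0, 1)])
```

Each value in `tension` would then be pushed through its corresponding ramp for remapping.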


The bound node can generate a mesh cage around any geometry, but mesh generation is computationally expensive. It does not make sense to go through that if all we need is a point cloud representation of the (sparse) voxel grid.
The combination of voxelGrid + pointsOnMeshInfo can generate dense voxel grids quickly and easily. Like everything else in SOuP, this is a live data generator that provides many options for managing the data flow. For example - trim voxels located away from the base geometry, or extract surface properties like colors, normals, uvs, etc. and propagate them to the voxels.
Using an arrayToDynArrays node, voxels can be piped to Maya's instancer or SOuP's copier and rendered as arbitrary geometry.

example video 1
example video 2


Copier is an extremely powerful node that allows us to copy-stamp mesh geometry. It supports per-instance time offset and handles vertex colors, uvs, soft/hard edges, shader attachments, etc. The node outputs data as a single mesh object or as instancer data.

video tutorial

example video 1
example video 2
example video 3


Another very useful node provided by Alex Smolenchuk. It performs uniform scattering of points on mesh objects. These points carry normals, tangents and other properties inherited from the underlying surface.


This node offers plenty of options for constrained or conforming tetrahedralization, cellularization, triangulation, convex hulls, and tessellation of arbitrary mesh geometry or point clouds.

video tutorial 1
video tutorial 2
video tutorial 3
video tutorial 4
video tutorial 5
video tutorial 6


Shape the silhouette of mesh objects from given viewport perspective in an intuitive and effortless way.

example video


A powerful toolset for geometry collisions.

Key features:

  • High-quality collisions and bulging
  • Speed - extra care was taken to ensure maximum performance in every case
  • Complete control over every aspect of the workflow - global settings, per-deformed object overrides, per-collision object per-deformed objects overrides, list goes on
  • Tight integration with the rest of SOuP nodes - normal/membership/weightmap modifiers, etc
  • Well structured UI makes all features convenient and easy to use

And the best part - you will rarely need any of them! An intelligent algorithm takes care of all the details in most cases.

example video 1
example video 2

video tutorial 1
video tutorial 2

reaction diffusion

Generate organic patterns on the surface or inside the volume of mesh objects.

video tutorial

point cloud builder

Quickly and effortlessly create and edit point clouds.

video tutorial


Geometry reconstruction of arbitrary mesh geometry and of oriented or unoriented point clouds.

example video 1
example video 2
example video 3
example video 4
example video 5

Some exquisite graphic examples from Firas Ershead using the remesh node.

smart connect

Highly efficient, comprehensive and flexible connection editor.

video tutorial


Resolve mesh self intersections with ease.

video tutorial


Find the shortest path between mesh vertices.
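The output of such a node can be pictured as plain Dijkstra over the mesh edge graph: vertices are graph nodes and edge lengths are weights. A hedged stdlib sketch of that idea (not SOuP's implementation):

```python
# Dijkstra over a mesh edge graph: vertices as nodes, edge lengths as
# weights. Illustrative stand-in for a shortest-path query.
import heapq

def shortest_path(adj, start, goal):
    """adj: {vertex: [(neighbor, edge_length), ...]}. Returns a vertex list."""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, v = heapq.heappop(heap)
        if v == goal:
            break
        if d > dist.get(v, float("inf")):
            continue  # stale heap entry
        for n, w in adj.get(v, []):
            nd = d + w
            if nd < dist.get(n, float("inf")):
                dist[n], prev[n] = nd, v
                heapq.heappush(heap, (nd, n))
    path, v = [goal], goal
    while v != start:
        v = prev[v]
        path.append(v)
    return path[::-1]

# Tiny quad: 0-1-2 along short edges, 0-3-2 the long way around.
adj = {0: [(1, 1.0), (3, 2.0)], 1: [(0, 1.0), (2, 1.0)],
       2: [(1, 1.0), (3, 2.0)], 3: [(0, 2.0), (2, 2.0)]}
route = shortest_path(adj, 0, 2)  # takes the short route through vertex 1
```

Non-uniform growth (as in the second example video) would simply use varying edge weights instead of geometric lengths.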

example video 1 (uniform growth)
example video 2 (non-uniform growth)
video tutorial


Quickly and effortlessly create and modify tiled geometry.
This toolset provides all the bells and whistles one would expect from a modern-day solution - fully procedural, tiles decoupled from grids, randomized position/rotation/UVs, conforming to non-flat walls, "cutters", a Python API, etc.

video tutorial

mesh fitting

A powerful solver and the flexible toolset built around it provide the ability to deform any given mesh to closely approximate (fit) the input geometry, regardless of their topology.

example video
video tutorial


Iterate on any data set and accumulate the results over multiple evaluation steps of the dependency graph.
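The accumulate-over-steps pattern can be shown in miniature: each evaluation's output is fed back in as the next input, so effects build up frame after frame. A hypothetical stand-in for the node, not its API:

```python
# The solver's feedback-loop idea in miniature: this frame's result
# becomes next frame's input, so changes accumulate over evaluations.

def evaluate(state, step_fn, steps):
    history = [state]
    for _ in range(steps):
        state = step_fn(state)   # this evaluation's result...
        history.append(state)    # ...is stored and fed into the next one
    return history

# Example step: "grow" a value by 10% each evaluation of the graph.
growth = evaluate(1.0, lambda s: s * 1.1, 3)
```

The extrude, resample and grow-on-surface examples below all follow this shape: a geometry operation is the `step_fn`, and the solver keeps feeding its output back in.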

example video: solver + group + extrude
example video: solver + group + extrude
example video: solver + resample + resolvePointCollisions
example video: solver + resample + resolvePointCollisions
example video: grow on surface

Check out some other cool solver animations by Peter Larson here.


Simple, yet very powerful tool for hard-surface modeling. It provides the ability to adjust the affected components (vertices, edges, faces, CVs, etc.) of the applied geometry modifiers at any stage of the modeling process. Think of it as a hands-on procedural modeling workflow.

video tutorial

human anatomy

Included in the "scenes" downloadable archive is a complete, high-quality human anatomy asset with multiple LODs that can be used as a starting point for muscle simulations, reference, etc.
Credit goes to the Database Center for Life Science, Research Organization of Information and Systems.

surfaceFlow / pointCloudFlow

These two nodes estimate the local differential quantities of mesh surfaces and point clouds. Generated data can be used to control instances, hair, dynamic effects, etc.

David's head
hair - model the cap, instance, done
scales that don't twist or flip during deformation
Monet's impressionism


Procedural generation of skeletons from arbitrary geometry.

example video


An alternative viewport selection feedback system that is easier to work with in many cases compared to the standard one.

example video

soup-dev LLC