No keyframes were used to create this: it’s a MIDI file recorded from my keyboard, stuffed into a mad CHOPs network in Houdini to handle all the timings and animation automatically. Depending on how loud a note is supposed to be, the timings are shifted to match how a player would actually play. Louder notes usually mean the key is hit faster, so the animations have to be slightly shorter. The key releases – how long the animation of each key “returning” to its upper position takes – are based on a number of factors, but mostly weighted toward what the overall frequency of notes is at that point. So slower passages have a more relaxed feel visually. And so on and so on… 😉
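None of the actual CHOP logic appears in the post, but the velocity-to-timing idea can be sketched in plain Python. The duration ranges and curves below are invented for illustration; they're not the values in the real network:

```python
def key_press_duration(velocity, min_dur=0.02, max_dur=0.12):
    """Map MIDI velocity (1-127) to a key-press animation length in seconds.

    Louder notes mean the key is hit faster, so the press animation
    gets shorter. The range and linear curve here are made-up
    illustration values, not the ones used in the actual CHOPs network.
    """
    t = max(0.0, min(1.0, velocity / 127.0))
    return max_dur - (max_dur - min_dur) * t

def key_release_duration(notes_per_second, slow=0.30, fast=0.08):
    """Weight the key-return animation by local note density:
    sparse, slower passages get a lazier, longer release."""
    density = max(0.0, min(1.0, notes_per_second / 10.0))
    return slow - (slow - fast) * density
```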
A corporate project from a few years back: a narrative animation, character-driven, with cel-rendered characters within a 3D world. Several times we used crash cutaways to help explain parts of the story: little vignettes as if taken from other animations, all with very different visual styles.
All created in Blender, After Effects and Illustrator. The animation is for corporate use, so I can’t show it here, but I can show some snaps:

Send a camera (or other animated object) from Houdini to Fusion. This is a work in progress, but it works, so I figured it’s worth making available in case anyone’s desperate. If nothing else it’s useful example code for anyone trying to export data from Houdini.
A handy little depth of field visualiser. Hang it off the bottom of a camera object, and it’ll create a couple of frames in the scene showing what’ll be in focus.
The calculation is based on a circle-of-confusion (well, maximum blur) of 1 pixel, so a deeper field may appear to be in focus, but it’ll depend on the subject matter. You’ll have to render a frame to see exactly how the focus looks and feels, but this is a particularly handy way to check what the camera’s focussed on – especially if the camera’s moving, or if you’re pulling focus between objects.
The visualiser lets you know when you’ve hit your virtual lens’s hyper-focal distance – the nearest focus-distance at which objects at infinity first come into focus. Handy when you want everything in the background in focus, but the foreground as defocussed as possible. Whack up the camera’s focus distance, then pull it down until the “Far” distance is just hitting infinity.
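The maths behind this is standard thin-lens depth of field. A minimal sketch, assuming a one-pixel circle of confusion on the sensor as described above (the function and parameter names are mine, not the asset's):

```python
def dof_limits(focal_mm, fstop, focus_mm, sensor_mm, res_x):
    """Near/far limits of acceptable focus for a thin-lens camera.

    Uses a circle of confusion of one pixel on the sensor, as the
    visualiser does. All distances in millimetres. Returns
    (near, far, hyperfocal); far becomes infinite once the focus
    distance reaches the hyperfocal distance.
    """
    coc = sensor_mm / res_x                     # 1-pixel blur circle
    f = focal_mm
    hyper = f * f / (fstop * coc) + f           # hyperfocal distance
    near = focus_mm * (hyper - f) / (hyper + focus_mm - 2 * f)
    if focus_mm >= hyper:
        far = float('inf')                      # "Far" hits infinity
    else:
        far = focus_mm * (hyper - f) / (hyper - focus_mm)
    return near, far, hyper
```

With a 50mm lens at f/2.8 on a 36mm-wide sensor rendering 1920px across, the hyperfocal distance comes out around 47.7m: focus anywhere past that and the far limit is infinity, which is exactly the point the visualiser flags.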
- The asset has no controls; it picks up all the resolution and lens information automatically from the camera it’s linked to
- The visualiser doesn’t work with Camera Switcher objects, but you can drop a separate visualiser onto each camera if you like
- The frames and numbers won’t appear in renders, but once you’re happy with the camera’s focus you may want to turn off its display flag to stop its calculations from slowing down preview playback.
Download (CC-0, use and abuse): com_howiem__dof_visualiser__1_0.hdalc
Unlikely anyone but me needs this, but just in case: a Houdini script to copy camera animation to the clipboard as a Blender-flavour Python script. Select a camera (or camera switcher) in Houdini, run the script, go to Blender, create a new text block and hit Paste. Execute it and Boom! there’s your camera, all animated an’ stuff.
Suspect most folk wanna go the other way, but I’ve a stupidly complex object already in Blender that wouldn’t be trivial to export, and my scene’s in Houdini. Lots of zany animated texture stuff going on as well… why recreate it if Blender can render it happily? Just need the camera to match up with the rest of the scene.
I’d been going round the houses, exporting to AE first, then from AE to Blender, but focal-length and DoF settings weren’t making it through, so this is an improvement.
Always weird, though, writing a Python script that generates … a Python script. ‘Specially when you’re running it in one 3D package, with its own data structures/methods, and it has to produce a script for a different package with different names and concepts for everything.
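A heavily stripped-down sketch of the pattern, not the actual gist: gather data on the Houdini side, then emit the Blender script as text. The Blender API calls in the generated output are real, but the structure and handling here are invented for illustration (the real exporter also deals with transforms, DoF and so on):

```python
def blender_camera_script(name, keys):
    """Emit a Blender Python script that recreates a camera and
    keyframes its focal length. `keys` is a list of
    (frame, focal_length_mm) pairs pulled from the Houdini camera."""
    lines = [
        "import bpy",
        "cam_data = bpy.data.cameras.new(%r)" % name,
        "cam = bpy.data.objects.new(%r, cam_data)" % name,
        "bpy.context.collection.objects.link(cam)",
    ]
    for frame, focal in keys:
        # set the value, then keyframe it -- the usual Blender idiom
        lines.append("cam_data.lens = %.6f" % focal)
        lines.append("cam_data.keyframe_insert('lens', frame=%d)" % frame)
    return "\n".join(lines)
```

The output string is what ends up on the clipboard, ready to paste into a Blender text block.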
hey ho… github gist linky
New page: a useful wrangle for when you need to render lots of tiny glowing particles, sparks, fireworks, UI stuff, star-fields etc.
Thought I’d better try writing this up as it took me bloody ages to get my head round it.
Situation: you have an object that you want lots of instances of, all with different looks. Say, one tree object, but you want lots of instances with variations in their colouring.
The approach differs depending on whether your tree object is in the scene file somewhere (i.e. instanced using s@instance="/obj/my_tree") or if it’s a file on disk (s@instancefile="$HIP/geo_trees/my_tree.rs").
Instancing a scene object
You can use either point attributes or stylesheets to poke new values into the material parameters: see https://docs.redshift3d.com/pages/viewpage.action?pageId=12419786&product=houdini for guidance.
Instancing a proxy file from disk
This is slightly trickier: you must use stylesheets, and you can only poke new values into parameters that are exposed at the material’s top-level VOP node. I’ve created a hip file to illustrate:
Download: RS per-instance colours_001.hiplc
To test it, just make sure you “render” the Proxy ROP first, to create an .rs proxy file for the scene to use. It’ll create an RS proxy containing this single plane with some nasty colouring:
Then you can go to Render view and render an image.
You should see this lurid mess:
But that means everything is working. Every plane has a different hue shift going on, even though they’re all identical instances. If it wasn’t working, you’d see this:
This is how it works: to vary the material, I’ve stuck an RS Color Correct node in the material, and an RS Multiply node to multiply an incoming 0-1.0 value by 360 (turns out the hue shift parameter on a colour correct node wants a value in degrees, which caught me out for a while):
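In plain terms, here's a Python stand-in for the point attribute plus the Multiply node (nothing from the actual hip file; the attribute generation is just an illustrative stand-in for whatever wrangle stamps @hue_shift onto the instance points):

```python
import random

def make_hue_shifts(num_points, seed=0):
    """Stand-in for stamping a normalised 0-1 @hue_shift attribute
    onto each instance point."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(num_points)]

def hue_degrees(hue_shift):
    """What the RS Multiply node does: the Color Correct node's hue
    shift parameter wants degrees, so scale the 0-1 attribute by 360."""
    return hue_shift * 360.0
```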
That parameter name is important—hue, in this case—as that’s the thing we can bind and alter on an instance-by-instance basis using a stylesheet override.
So—into the stylesheets: to get to them, open a Data Tree pane, and choose “Material Style Sheets” from the dropdown. I’ve added a Style Sheet Parameter, and a Style, a target (“Point Instances”) and two overrides:
The first override sets the material to the one that’s in the scene (as opposed to any that may be saved in the proxy file itself). You can only override materials present in your scene, not ones within the proxy (as far as I can tell).
The second override is an “Override Script” – even though it’s not really a script: you can choose “Material Parameter” and “Attribute Binding” from the dropdowns. The Override Name is the parameter that’s exposed on the material—”hue”—and the Override Value is the point attribute you want to bind it to. In this case, my Instance object has a bunch of points to instance to, each with an @hue_shift attribute.
And that’s it. No reason to only use this for shifting hues: you could create a material with a shader switch, or a texture, and as long as you expose the switch parameter or the texture filename string in the top VOP level of the material, you can poke new values in at render time like this.
One caveat/limitation: Redshift is blazingly fast at rendering loads of instances. But if you try tweaking thousands of instances’ materials like this, you may find the pre-render processing, as Redshift builds the scene, becomes quite time-consuming. I’m guessing that behind the scenes Redshift is creating a new shader for each tweaked material. So for huge scenes with loads of instances, you may need to take a different approach to adding variety. Stay tuned.
Should be called “Point Looker” really. Super simple: just a time-saver if you keep needing certain looks, which I found I did. Plug some points in the top and this will create Alpha / Cd / pscale attributes. Set colour from a ramp, or from a base colour and then hue/sat/val variance; Alpha over age or from a custom attribute; size based on mass, or a ramp-based distribution.
And a “Create specials” option, which takes some proportion of your particles (best not too many) and gives you some extra controls. Handy for making a few odd differently coloured or extra bright particles to spice up the mix, like in the top image.
Colours can be cycled through the ramps too, to add a bit of life.
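As a rough illustration of the “base colour plus hue/sat/val variance” mode, here's the idea in pure Python (parameter names are mine, not the asset's, and the real tool works per-point inside Houdini):

```python
import colorsys
import random

def vary_colour(base_rgb, hue_var, sat_var, val_var, seed):
    """Nudge a particle's colour randomly around a base colour
    in HSV space, then convert back to RGB. All channels 0-1."""
    rng = random.Random(seed)
    h, s, v = colorsys.rgb_to_hsv(*base_rgb)
    h = (h + rng.uniform(-hue_var, hue_var)) % 1.0   # hue wraps around
    s = min(1.0, max(0.0, s + rng.uniform(-sat_var, sat_var)))
    v = min(1.0, max(0.0, v + rng.uniform(-val_var, val_var)))
    return colorsys.hsv_to_rgb(h, s, v)
```

Seeding per particle id gives each point a stable, repeatable variation from frame to frame.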
No guarantees, may not function as advertised, may burn down your house. But it’s reasonably well documented if you dive inside, and most of the parameters have hover-text help. If you’re happy with that, download: com.howiem__h_particle_look__1.8.hdalc