Netflix’s animated series Arcane is a phenomenally constructed visual tale. It left me quite emotional at the end, so I wrote a short piece of music to try and deal with it. Orchestral / short / build-up.
No keyframes were used to create this: it’s a MIDI file recorded from my keyboard, stuffed into a mad CHOPs network in Houdini that handles all the timings and animation automatically. The timings shift depending on how loud a note is supposed to be, to match how a player would actually play: louder notes usually mean the key is hit faster, so those animations have to be slightly shorter. The key releases – how long the animation of each key “returning” to its upper position takes – are based on a number of factors, but mostly weighted by the overall note frequency at that point, so slower passages have a more relaxed feel visually. And so on and so on… 😉
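Not the actual CHOPs network, obviously, but the mapping it implements can be sketched in ordinary Python. All the function names and constants here are made up for illustration; the real values came from eyeballing footage of pianists:

```python
def press_duration(velocity, base=0.12, min_dur=0.03):
    """Louder notes (higher MIDI velocity) mean the key travels faster,
    so the press animation gets shorter."""
    # Standard MIDI velocity is 0-127; scale toward min_dur as it rises.
    v = max(0, min(127, velocity)) / 127.0
    return base - (base - min_dur) * v

def release_duration(note_times, t, window=2.0, slow=0.5, fast=0.1):
    """Key-release length weighted by local note density: sparse,
    slow passages get a lazier, more relaxed release."""
    # Count notes within `window` seconds either side of time t.
    density = sum(1 for nt in note_times if abs(nt - t) <= window) / (2 * window)
    # More notes per second -> quicker release, clamped to at least `fast`.
    return max(fast, slow / (1.0 + density))
```

A fortissimo hit (velocity 127) gets the shortest press animation, and a lone note in a quiet passage gets the full, languid release.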
Made this a while back, playing with AR stuff. It’s silly. Yes, you’ll need to grant camera/gyro access, but it doesn’t do anything beyond being silly.
If you’re on your phone already, click here: https://howiem.org/ar/a
On a desktop, or another phone, point your phone camera at this:
A corporate project from a few years back: a narrative animation, character-driven, with cel-rendered characters within a 3D world. Several times we used crash cutaways to help explain parts of the story: little vignettes as if taken from other animations, all with very different visual styles.
All created in Blender, After Effects and Illustrator. The animation is for corporate use, so I can’t show it here, but I can show some snaps:
Send a camera (or other animated object) from Houdini to Fusion. This is a work in progress, but it works, so I figured it’s worth making available in case anyone’s desperate. If nothing else it’s useful example code for anyone trying to export data from Houdini.
A handy little depth of field visualiser. Hang it off the bottom of a camera object, and it’ll create a couple of frames in the scene showing what’ll be in focus.
The calculation is based on a circle-of-confusion (well, maximum blur) of 1 pixel, so a deeper field may appear to be in focus, but it’ll depend on the subject matter. You’ll have to render a frame to see exactly how the focus looks and feels, but this is a particularly handy way to check where the camera’s focussed – especially if the camera’s moving, or if you’re pulling focus between objects.
The visualiser lets you know when you’ve hit your virtual lens’s hyper-focal distance – the nearest focus-distance at which objects at infinity first come into focus. Handy when you want everything in the background in focus, but the foreground as defocussed as possible. Whack up the camera’s focus distance, then pull it down until the “Far” distance is just hitting infinity.
- The asset has no controls; it picks up all the resolution and lens information automatically from the camera it’s linked to
- The visualiser doesn’t work with Camera Switcher objects, but you can drop a separate visualiser onto each camera if you like
- The frames and numbers won’t appear in renders, but once you’re happy with the camera’s focus you may want to turn off its display flag to stop its calculations from slowing down preview playback.
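For the curious, the underlying maths is the standard thin-lens depth-of-field calculation. A quick Python sketch (since the asset uses a 1-pixel circle of confusion, `coc_mm` here would be the sensor width in mm divided by the horizontal resolution):

```python
def hyperfocal(f_mm, N, coc_mm):
    """Hyperfocal distance in mm: focus here and everything from half
    this distance out to infinity is acceptably sharp."""
    return f_mm * f_mm / (N * coc_mm) + f_mm

def dof_limits(f_mm, N, coc_mm, focus_mm):
    """Near and far limits of acceptable focus for a given focal length
    (f_mm), f-stop (N), circle of confusion and focus distance."""
    H = hyperfocal(f_mm, N, coc_mm)
    near = focus_mm * (H - f_mm) / (H + focus_mm - 2 * f_mm)
    # Once the focus distance reaches the hyperfocal distance,
    # the far limit hits infinity -- which is the moment the
    # visualiser flags up.
    far = float("inf") if focus_mm >= H else focus_mm * (H - f_mm) / (H - focus_mm)
    return near, far
```

So a 50mm lens at f/2.8 focussed at 5m gives you roughly 4.3m–6m of acceptable focus; wind the focus distance out past the hyperfocal point and the far limit jumps to infinity.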
Download (CC-0, use and abuse): com_howiem__dof_visualiser__1_0.hdalc
Unlikely anyone but me needs this, but just in case: a Houdini script to copy camera animation to the clipboard as a Blender-flavour Python script. Select a camera (or camera switcher) in Houdini, run the script, go to Blender, create a new text block and hit Paste. Execute it and Boom! there’s your camera, all animated an’ stuff.
Suspect most folk wanna go the other way, but I’ve a stupidly complex object already in Blender that wouldn’t be trivial to export, and my scene’s in Houdini. Lots of zany animated texture stuff going on as well… why recreate it if Blender can render it happily? Just need the camera to match up with the rest of the scene.
I’d been going round the houses, exporting to AE first, then from AE to Blender, but focal-length and DoF settings weren’t making it through, so this is an improvement.
Always weird, though, writing a Python script that generates… a Python script. ‘Specially when you’re running it in one 3D package, with its own data structures and methods, and it has to produce a script for a different package with different names and concepts for everything.
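If you’re curious what a script-that-writes-a-script looks like, here’s a stripped-down sketch in plain Python. It’s not the actual gist, and the Houdini sampling side is stubbed out so it runs anywhere (in Houdini you’d fill `samples` with `hou` calls, e.g. `cam.parm('focal').evalAtFrame(f)`):

```python
def blender_camera_script(name, samples):
    """Emit a Blender-flavour Python script that recreates a camera
    and keyframes its transform and focal length.

    `samples` is a list of (frame, (x, y, z), (rx, ry, rz), focal_mm)
    tuples, pre-sampled from the Houdini camera."""
    lines = [
        "import bpy",
        "cam_data = bpy.data.cameras.new('%s')" % name,
        "cam = bpy.data.objects.new('%s', cam_data)" % name,
        "bpy.context.scene.collection.objects.link(cam)",
    ]
    for frame, loc, rot, focal in samples:
        # Set the values, then key them -- one keyframe per sample.
        lines.append("cam.location = (%r, %r, %r)" % loc)
        lines.append("cam.rotation_euler = (%r, %r, %r)" % rot)
        lines.append("cam_data.lens = %r" % focal)
        lines.append("cam.keyframe_insert('location', frame=%d)" % frame)
        lines.append("cam.keyframe_insert('rotation_euler', frame=%d)" % frame)
        lines.append("cam_data.keyframe_insert('lens', frame=%d)" % frame)
    return "\n".join(lines)
```

The output is just text; copy it to the clipboard, paste into a Blender text block, execute, and Boom.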
hey ho…. github gist linky
Oh, Apple, oh Apple.
Not often do I feel the need to write about this kinda thing, but on a day (WWDC 2019 Keynote) when you managed to confound my expectations in so many positive ways, one of your decisions has left me in a state of bewilderment: that bloody monitor stand. I just don’t get what you’re trying to do.
Big changes at h Manor.
First, a client needed me to re-render some old projects – big projects (dome projection, 4K x 4K, around 10,000 frames), and half were created in Blender, half in Houdini / Redshift.
Second, I’ve moved most of my pipeline over to Linux. Mostly because Apple and nVidia really aren’t getting along, and that’s causing huge problems for people whose pipeline depends on nVidia GPUs. But also because Linux seems to be the OS of choice for larger studios, so it makes sense to get my head round it.
Timescales on the project are fairly tight, which means there’s pressure to deliver fast, but there’s also pressure to not screw up – and to not let the hardware / software screw things up. So, backups, fault-resilience, fast QC etc., all suddenly Very Important.
And because the project involves huge amounts of data (several TB), anything I can do to speed up the pipeline is good.
First step: I bought a server from eBay: a second-hand Dell PowerEdge R510 rack-mount monster; it was about £200, and it’s got 14 drive bays, two redundant power supplies, and it sounds like a jet plane is about to take off when you switch it on. I’m in love.
I don’t have a rack to mount it in, so it’s just here on the floor, sat sideways, but it’s working happily; it’s got 3 pairs of drives, each in a RAID-1 “mirrored” config, so a drive can fail without me losing anything, and when I plug in a replacement, the server will rebuild the data on it.
Yep – there’s an orange “warning” light on one of the bays – one of the drives was failing from new, but it turned out to be a good way to teach me how to rebuild a RAID set from the command-line. Though it’s a Dell server with a Dell RAID controller, there’s a package called MegaCli that lets you remotely administer things. Lots of command-line switches to learn, but it’s sorted now, and apart from physically pulling out the dead drive and plugging a new one in, I did it all from downstairs. Freaky.
The server’s running Linux Mint (like everything else here). Not the ideal choice for a server, as it’s got a GUI / graphical desktop that I can’t actually see or use, since I don’t have a monitor attached, but it’s good enough for now. And it turns out £200 buys you a lot of grunt if you don’t mind the industrial-size case it comes in: it’s got 32GB of RAM and two quad-core Xeon processors (the same family as my Mac Pros).
But I need GPUs: the renderers I use for Blender (Cycles) and Houdini (Redshift) use graphics cards for their processing, which makes them less flexible but much faster at churning out the frames. So I needed to set up some render nodes to actually do the rendering.
I dug out some bits and pieces from various junk boxes and managed to put together two machines; they’re both fairly under-powered, CPU-wise (Core i5), haven’t a lot of memory (16GB each) but they can handle a few GPUs:
A bit of a mish-mash: two GTX 1080 Tis, two GTX 1060s, two GTX 690s, and an RTX 2060; plus there’s another two GTX 1080 Tis in the main workstation downstairs. I did have three GTX 690s, but two of them died in (thankfully) different ways, so I managed to cobble together a single working one out of their bones.
For someone who works with images, it’s kinda weird spending a couple of weeks looking at command-lines, setting renders going by typing a command, rather than clicking a button, but you get used to it. Gives you a strange sense of power, too. Rather than watching the frames materialise on screen, I get to watch the nodes’ progress reports in text. Strange.
Both Blender and Houdini were a pain in the arse to get going on Linux, though; I could get a Mac set up in about half an hour if pushed, but on Linux – and with a noob at the helm – they took a couple of days to sort out. Blender needed nVidia’s CUDA stuff installing, which largely consisted of installing and uninstalling and swearing and roaming help forums and more installing and uninstalling. But I managed in the end, and all without actually plugging a monitor into a computer; all done remotely by ssh.
Houdini and Redshift were a pain in a different way: you can install them perfectly easily from a command prompt, but unlike Blender they’re commercial products and need their own license servers installing and setting up too before they’ll work. And Redshift – really, guys? – won’t let you actually activate one of your licenses from a command prompt: the licensing tool only works in a GUI. So in the end I had to dig out a monitor and keyboard. And find a VGA cable… I know I’ve got a bunch of them somewhere in here:
Finally found one, plugged it all up, spent about 20 seconds licensing Redshift, and disconnected it all again. At last, everything in the attic seemed to be talking to each other successfully… and even more thankfully, I now don’t have to actually go up there much; I can control it all from downstairs.
So: a server, two dedicated render nodes, three workstations, an old laptop acting as a queue manager, and everything working together; two of the workstations still running MacOS (for compositing and editing and admin/email) while everything else is on Linux.
It’s been quite a month. But the outcome is this:
… I can queue up render jobs, they’ll get farmed out automatically to machines as they become free, and I no longer have to get up in the middle of the night to set the next one going. At least, as far as the Houdini stuff goes; I’m still setting Blender renders going manually (albeit remotely via ssh) so I’ve got to sort out some scripts to do that bit a little more cleverly.
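The missing Blender bit doesn’t need to be clever, either; something like this minimal sketch would do. The hostnames and render commands are placeholders, and the `runner` hook exists purely so the logic can be exercised without actually ssh-ing anywhere:

```python
import queue
import subprocess
import threading

def run_farm(jobs, nodes, runner=None):
    """Farm shell render commands out to whichever node is free.

    Each job is a command string to run remotely; `nodes` are ssh
    hostnames. One thread per node pulls jobs off a shared queue, so a
    fast node naturally picks up more work. Returns {job: exit_code}."""
    if runner is None:
        runner = lambda node, job: subprocess.call(["ssh", node, job])

    pending = queue.Queue()
    for j in jobs:
        pending.put(j)

    results = {}
    lock = threading.Lock()

    def worker(node):
        while True:
            try:
                job = pending.get_nowait()
            except queue.Empty:
                return  # queue drained -- this node is done
            rc = runner(node, job)
            with lock:
                results[job] = rc

    threads = [threading.Thread(target=worker, args=(n,)) for n in nodes]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Call it with something like `run_farm(["blender -b shot.blend -f 101", ...], ["rendernode1", "rendernode2"])` and each node chews through frames as fast as it can, no middle-of-the-night alarm required.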