
NAB 2014 Report


Hello VFX/PDX…

I’m fresh off a trip to Vegas for NAB 2014, and thought I’d share some of the more interesting things I saw, especially as it relates to VFX, Color Grading & Finishing. (NAB is the National Association of Broadcasters, and their annual convention is like SIGGRAPH, but for the entire industry, not just VFX.)

For those who don’t know me, I’m a Smoke VFX artist/colorist and general technologist at a post facility in Portland. The bulk of our work is TV Commercial based, so that’s the “lens” I’m looking through as I looked around the show.

(NOTE: Any opinions are those of my own, and are not endorsed by any of my clients, employer, or any of the vendors mentioned. Also – feel free to comment anything I may have gotten wrong – possible I may have missed some details or misunderstood something…)

4K

This was absolutely the year of 4K. Unless you’ve been under a rock (or sequestered in a darkened room with a workstation), you’ve probably been bombarded with press releases and industry news about 4K. 4K is the catch-all term for the next generation of high-resolution imagery – with four times the pixel count of 2k/1080p HDTV. In broadcast, it’s also called UHDTV – Ultra High Definition TV.

(Note: technically, UHDTV and 4K are slightly different formats, like 2k and HDTV are slightly different – but everyone is referring to it as “4k” since it’s sexier and easier to say than UHDTV. You may also hear 2160p.)


I am shocked at how quickly and thoroughly the production and post industries have embraced 4k. There are still quite a few hurdles to overcome before it goes mainstream – but almost every single camera at the show was 4k (or greater). And almost every single edit platform is now 4k ready – Premiere, FCP-X, Sony Vegas, and also more finishing systems like Smoke/Flame are now 4k ready (Scratch & Resolve have been for a while), and Avid says (or doesn’t actually say out loud, but whispers in your ear) they’ll have something by the end of the year.

Personally, I still think it will be a niche product to actually finish and deliver in 4k for quite a while – it will be like the 2007 days, where some jobs had an HD finish, and some were SD – but just a few short years later, everything we do now at our shop is in Full HD 1080p. My guess is that 4k will take quite a while to catch on with the public, but it’s only a matter of time until all TVs are 4k – just like all smart phones now have high-resolution “retina” screens.

[Booth photos: Sony’s 4K display at NAB 2014, another NAB 2014 booth, and Panasonic’s “4K World” slide.]

(This last “4k World” slide was a cheat – it’s actually from CES – but it’s interesting that 4k is being pushed so hard at the consumer show as well.)

The big question is how does it get to the viewer? Netflix is now streaming House of Cards in 4k – but very few TVs are ready for it, and most home internet connections could not handle the bandwidth, which is about double what you need for HD (not 4x) thanks to better codecs – H.265, or HEVC, is the new standard for streaming 4k video.

This model is probably what will lead the adoption of UHDTV – the internet and streaming providers can move much more quickly than the broadcast industry, which would have to spend billions to upgrade their pipelines.

YouTube has allowed 4k uploads for a while, and they even have a 4k “Channel” – but again, sufficient bandwidth and H.265-capable TVs are still quite rare.

Blackmagic showed 2 new 4k cameras in the $6000 range, and AJA had the big surprise of the show by releasing a camera of their own, the CION – very similar to the Alexa, but in 4k, and at only $9000, compared to $60k for an Alexa (which is not even a 4k camera – hard to believe we once thought that looked good, eh? Wasn’t that what you were thinking when you saw all of those Oscar movies shot on an Alexa? “12 Years a Slave was ok, but I really think it needs another K or two.”)

Want to see the CION?

While 4K definitely feels (to me) like a solution in search of a problem, I don’t think it’s a fad like 3DTV. I think it will be more like 5.1 audio – sure, we can do it if we want to, but for many projects, it’s really not necessary – stereo is fine. It might turn out that 4k is the same way – high profile spots and cinema releases – sure, finish in 4k. But for most work? Probably not necessary – at least in the near future. It will probably continue to be broadcast/streamed in 1080, and then the 4k TVs will “uprez” it to 4k, and if their screen is over 75″, maybe they’ll see a difference. Now, if only the cable companies would give the stream enough bandwidth to look decent, it might actually be worthwhile… But I digress.

The real crazy part about the new UHDTV standard is not even the number of pixels – that’s relatively simple to deal with – but there are also plans to transition to higher frame rates, higher contrast ratios, and higher bit depths for more colors (10 & 12 bit) – that’s when multi-format delivery will become quite a challenge. The new color space (BT.2020 is the new Rec.709) is actually even broader than the Digital Cinema space. The good news is that there will never be another interlaced format to deal with ever again. The new “spec” calls for frame sizes up to 8k, and frame rates up to 120p, but there are no “i” formats in the new standards at all. (Yeah! From a broadcast finisher’s perspective!)

[Diagram: the BT.2020 (Rec.2020) color gamut.]

EDITORIAL

Adobe Creative Cloud has really taken off. There was literally no one showing anything to do with FCP 7, and probably 50% of the edit systems on display at vendors all across the show were Premiere and CC. There was still a fair number of Avids, and a few FCP-X, and a few PC only platforms (Vegas/Edius) – but Premiere was by far the most common. It certainly helps that it’s on both Mac and Windows, and it doesn’t cost $1500 to start using it – just $50 per month for most users – that’s for every app Adobe makes – Photoshop, After Effects, Illustrator, etc.

Adobe is also pushing a new tool for collaborative workflows called Adobe Anywhere. Imagine editing from a remote location, over wifi, not with lo-rez proxies, but with full resolution media. That’s what Adobe Anywhere promises. The idea is that the facility runs a very beefy central server, with multiple GPU cards and fast RAID storage or SAN (Storage Area Network).

[Diagram: Adobe Anywhere workflow.]

All of the edit systems become remote clients to the central server – and it’s the server that does all of the processing, not the workstation – the workstation is just giving commands to the server for how to build the edit, and the server builds it and renders an H.264 stream on the fly to your edit system’s viewer window. The better your bandwidth, the better that stream looks, so on a decent internet connection or a LAN, it looks great. And on a low bandwidth wifi – well, it scales to look as good as it can – but since you’re always using the full rez media, you can pause on a frame and it will instantly update to full rez (when paused) to better judge the quality of a shot. And the “Anywhere” server handles all of the project files & permissions among multiple editors, etc.

It’s still pretty limited in many ways – you can’t link After Effects projects in the timeline (which is one of the best parts about Premiere), exporting files or XMLs is a real chore – but it’s an exciting development for sure. It’s also not cheap. For a group of 10 editors, I was told to expect to spend about $80,000 in server hardware (not including storage), and each user account is $1000/year. So – it’s certainly not for everyone, but possibly a glimpse into a new world of remote collaboration.

HP had a pretty big presence, showing they are still committed to the big box workstation and all of the power and flexibility that comes with it. Every single machine at the Avid booth was running on HP hardware, no Macs at all. Some vendors had Premiere systems with the exact same hardware as Flames (z820), and claim they far outperform the new Macs. Those workstations are not cheap – but with most apps now being cross platform (other than FCP-X and Smoke on Mac), it’s nice to know you can still build a powerhouse system if you need to.

[Photo: HP Z820 workstation.]

The new Mac Pro (trash can/cylinder model) was also pretty prominently featured at many booths – including the DaVinci Resolve booth, which used to run their hero demo machine on a beefy Linux box – but this year was on the Mac Pro. It’s pretty awesome, and will only get better once more software can really take advantage of the power in that little tube and its dual graphics cards.


ProRes

ProRes is being pretty clearly adopted as the de facto standard delivery format for the broadcast industry. More and more systems (PC & Linux) can now create legit ProRes files. And while many of the new cameras are embracing “raw” shooting modes, many of them can now shoot directly into ProRes format. Funny to think that FCP has diminished in stature, but ProRes is flying higher than ever. Thank goodness those unintended gamma shifts are very rare these days…


I Have Seen The Future, And It Is… Slow

1

Posted on by

Deep data happy dance

With today’s formal announcement by ILM & Weta Digital that OpenEXR 2.0 has finally been pushed out for mass consumption, we can finally (there’s that word again!) do the VFX version of the Icky Shuffle.  Or whatever Deep data touchdown happy dance you’ve been working on since you saw the Deep data demo with The Foundry back at the Hinge Digital VFX/PDX blowout last fall.

If you were in hibernation at the time and missed it, or still haven’t had much exposure to them thar Deep renders – in a nutshell, Deep finally gives us a usable Z channel.  Old depth channels have always been a bit of a hack, plagued by nasty per-pixel sampling.  Even once anti-aliased and cleaned up, you still commonly had to split renders into pieces and/or render holdouts to get everything jiving and edges that behaved correctly when composited.  And things that should just work, like Z defocus, would instead wreak havoc and have you walking through a minefield of broken edges, pops, sizzles, bleeps & blunders.  Deep data to the rescue!  Deep allows you to render layered CG uninhibited, in its full, juicy glory, and then let the Deep Z information take care of your holdouts and which layers sit in front of which – and it does this both correctly and (usually) flawlessly.

Simply put: on a complex film like Avatar, where traditionally you may have had characters running through the forest, you had to render those characters with holdouts here and holdouts there… and then (zing!) a few frames of animation change in the character pass – previously you’d have to rerender EVERYTHING, because the holdouts changed as well.  As of today, those days are in the rear view.  Deep compositing solves those issues, and everything now works like it should.  You rerender the changed character pass, DeepMerge it with the existing forest renders, and you’re off to the races.  ILM and Weta were all over this because it’s the only way they could have finished a film of the scale and scope of Avatar.  If they hadn’t brought back Colin Doncaster & co. to finally nail down what they’d started back on ‘Rings, they’d probably still be working on Avatar here a full 3 years after release.  No jokin’.  The fact that this is finally getting pushed out into the mainstream is pretty darn exciting for everyone outside of Weta and ILM.
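For the Nuke-inclined, here’s a minimal sketch of that rerender-and-DeepMerge step in Nuke’s Python API – the file paths below are made up for illustration, but DeepRead and DeepMerge are the stock deep nodes doing the lifting:

# Minimal sketch of the DeepMerge idea described above.
# File paths are hypothetical; DeepRead/DeepMerge are stock Nuke deep nodes.
import nuke

forest = nuke.nodes.DeepRead(file='/shots/jungle_010/deep/forest.%04d.exr')
hero = nuke.nodes.DeepRead(file='/shots/jungle_010/deep/hero_v02.%04d.exr')  # the re-rendered character pass

# No hand-built holdouts needed - DeepMerge sorts out, per sample, what sits in front of what
merged = nuke.nodes.DeepMerge(inputs=[forest, hero])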

What does this mean to us groundlings?  First of all, by no means will the words “instant gratification” come anywhere near this post.  This release means things can finally be standardized and the different workflows across software will start to come in line, given another round of point releases or two.  Deep data has been available for a while, and Renderman + Nuke paved the way, but there were still some inconsistencies as other software caught up to what Weta and ILM were pioneering.

Renderers will now formalize support, some faster than others.  (Houdini’s) Mantra, Arnold, and VRay have all had support to some extent already, but then you have a look across the way at Mental Ray and they seem to be lagging far behind and (according to the guys at Hinge Digital) Deep data doesn’t appear to be a blip on the MR radar yet.  At some point in the near future, all will come around to rendering EXR 2.0 rather than dtex or whatever format was being rendered before.

Nuke is the first and only compositing app out of the gate to have Deep technology, and rightly so, having developed the tools directly with Weta and ILM.  Eyeon Fusion will probably get this in there and I bet After Effects will also come around eventually, most likely with this being added to the ProEXR toolset for immediate use with plugins hot for the technology, and eventually by the stock Adobe Z tools themselves.

In Nuke, other than the initial batch of Deep nodes that were released in v6, you’ll see many nodes and tools start to become Deep compatible – for example, you’ll soon see a “DeepKeymix” and nodes like that start to appear as these things pop up in production.  Even the current set of Deep nodes will change, as Dr. Peter Hillman & co out at Weta continually push things forward.  They seem to have made the perfection of the Deep workflow not only a necessity for the coming films, but it’s been elevated to almost “personal mission” status.  With the Hobbit and Avatar sequels looming, this is more than justified!  At some point it will make sense to have ALL nodes be Deep aware in Nuke and for it to be tossed around as easily as a Z channel is now, but that is a ways off and you’ll see this duality exist for a while (Keymix vs DeepKeymix, etc).


Just like the baby in the “deep” end of the Nirvana Nevermind album cover, Z Channels are all grown up now.

As far as the Deep workflow goes – I love it, but I hate it.  Your first shot with it, you’re immediately hit with the “wow that’s amazing” new car scent as you plug in that first DeepMerge and everything clicks.  But the luster soon wears off when you realize the huge amount of additional processing overhead and network traffic associated with Deep renders.  It may be sweet images, but you take the slow boat getting there!  It’ll bring your system to its knees quickly, and your compositing momentum will start to resemble that banana slug you almost stepped on out on your front porch this morning.  You might as well install a coffee machine at your desk, you’ll be taking so many breaks.

Case in point:  on many shots for Man of Steel, I had volumetric cloudbox renders that were up in the 500-800 MB per frame territory.  This is not a tax bracket you want to be in.  Ultimately, whether you eventually gravitate towards a DeepMerge style of comping or flip it and go with DeepHoldouts, you’re going to want to use the Deep renders to generate your layering and then precomp them out and get them outta the stream as fast as possible, so you can return to “normal” RGBA interactivity and creative flow.  Comps are supposed to be quick – you lighters can keep your excruciatingly slow little render tile windows, thank ya very much.
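One hedged sketch of that “get it out of the stream” precomp step, again in Nuke Python – node names, paths, and the frame range here are hypothetical:

import nuke

deep_merge = nuke.toNode('DeepMerge1')                 # your existing Deep layering
flat = nuke.nodes.DeepToImage(inputs=[deep_merge])     # flatten back to a regular 2D image

precomp = nuke.nodes.Write(inputs=[flat])
precomp['file'].setValue('/shots/jungle_010/precomp/bg_precomp.%04d.exr')
precomp['file_type'].setValue('exr')
nuke.execute(precomp, 1001, 1050)                      # render the precomp range

# Downstream, read the precomp back in and leave the heavy Deep branch alone
nuke.nodes.Read(file='/shots/jungle_010/precomp/bg_precomp.%04d.exr')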

The hitch becomes nodes like DeepDefocus (currently unreleased, but you can use the Bokeh plugin from Peregrine) and others that are applied further down the tree – and for that, you’ll get used to dialing values in and then getting them (again) out of your script – and disabling them with the $gui expression.  All in all, the workflow takes some getting used to, but it’s a small price to pay for the flexibility and power of a Z channel that actually works.  And things can only get faster & better from here as they experiment with new levels of downsampling the accuracy and compressing the renders.
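For anyone who hasn’t used the $gui trick: it’s an expression on a node’s disable knob that evaluates to 1 in the interactive GUI and 0 on a command-line/farm render, so the heavy node stays off while you work but kicks in for the real render.  A quick sketch (the node name is hypothetical):

import nuke

defocus = nuke.toNode('Bokeh1')            # or whatever heavy node lives far down the tree
defocus['disable'].setExpression('$gui')   # disabled in the GUI, enabled on the farm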

The Foundry Creative Specialist Deke Kincaid put out a great collection of links awhile back to help get everyone up to speed on all things Deep.  Digg ’em:

Original deep shadow paper: http://graphics.pixar.com/library/DeepShadows/paper.pdf

Other must-reads:
Houdini docs on it:
PRMan docs on it:
Videos on deep image compositing:
Basic intro one:
Johannes Saam’s Vimeo channel on deep image tools he wrote for Nuke, long before we had a native toolset for this inside Nuke
Rise of the Planet of the Apes Nuke video:
From Prometheus:

The Foundry Releases Assist for NukeX


With the release of NukeX 7.0v6, The Foundry is including two copies of its new Assist product, a stripped down version of Nuke that only “includes tools for the tasks of roto, paint, and tracking.”

This is a value-added move to try to make the pricing hit of a NukeX license a bit easier to swallow for smaller shops.  Historically, companies like Eyeon offered limited versions of their software (in that case, “Rotation” to complement Fusion) with the hopes of unseating Flame and the Flame assistant’s license of Flint/Flare/Combustion/Silhouette/AE in commercial-heavy pipelines.  On a base level, it makes a lot of sense to parcel these out when even boutique VFX shops have departmentalized paint/roto aside from compositing.  Why have a bazooka like NukeX aimed at a molehill?  And perhaps Diet Nuke/Nuke Lite/Nuke Dime/Nuke Nuked (I could go on…) is a good way to boost the amount of firepower you can throw at a shot, and give the powers that be one less excuse not to pony up some extra NukeX coin.

Offering Assist has immediate value for the company pocketbook when it comes to frame by frame type work, but from the artist standpoint there’s not much to know or get excited about here.  Assist is highly crippled and quickly deteriorates for higher level tasks, and as is, there will probably be a juggling act associated with using it in production.  SplineWarp was not included in the toolset, nor were any 3D tools for geometry assisted paint work, which is to be expected – but that’s the bread and butter area of most higher level artists.  In fact, not even the Grade node was included – which, as you can imagine, makes it hard to grab a clone source from another frame or do any sort of relighting to your paint work.  I can’t think of the last paint shot I had that didn’t have a grade node.  Assist can open any Nuke script, and unsupported nodes will render but be outlined in red and their controls grayed out.  Write nodes are disabled in Assist.

For this to have real value outside of a press release, The Foundry might want to rethink the scope of what its definition of paint, in particular, includes – but it’s worth noting that this wasn’t beta tested widely and should be considered a v1.0 release.  The Foundry may decide to change what’s offered in the toolset based on initial reaction.  In my opinion, they also have a couple of line items out of whack as far as what’s offered in NukeX vs. regular Nuke, like GPU accelerated rendering.  But hopefully these things will iron out given more time to digest.  Ahhh, whatever… whaddya gonna do… it’s “free.”

For more info, catch the press release here.

Nodes included in this initial Assist toolset:

Did someone say Assist?  Dame can help with that.

Image: Checkerboard, ColorBars, ColorWheel, Constant, Read, Viewer

Draw: Radial, Ramp, Rectangle, Roto, RotoPaint

Time: FrameBlend, FrameHold, FrameRange, TimeEcho, TimeOffset

Channel: Add, Copy, ChannelMerge, Remove, Shuffle, ShuffleCopy

Color: Invert, OCIO CDLTransform, OCIO Colorspace, OCIO Display, OCIO FileTransform, OCIO LogConvert

Keyer: Keyer

Merge: AddMix, Dissolve, KeyMix, Merge, Premult, Switch, Unpremult

Transform: Crop, CornerPin, PlanarTracker, Reformat, Tracker, Transform, TransformMasked

Views: JoinViews, OneView, ShuffleView, Split and Join, Stereo Anaglyph, Stereo MixViews, Stereo ReConverge, Stereo SideBySide

Metadata: AddTimeCode, CompareMetadata, CopyMetadata, ModifyMetadata, ViewMetadata

Other: Backdrop, Dot, Group, Input, Output, PostageStamp, StickyNote

An accidental late night sexy text to DPX


So everyone gets what VFX is.  And PDX is a no brainer for anyone who’s ever gotten tired of writing out P-O-R-T-L-A-N-D and is fine enough with an airport acronym, and a cool one at that (what with its iconic X).  But DPX?  Who the what now?  Have I gone and scrambled my brains at the Driftwood Room again prior to posting?

DPX is my favorite image file format.  And oh how I missed it.

You’ll have to excuse me for a sec while I geek out.  But we’re never gonna survive unless we get a little geeky.

Pulling a DPX into Nuke used to mean you were working on a film.  Hell yeah.  This is something I took for granted; that those tasty dpx’s would always be sitting there, waiting for me every morning.

I’m not quite sure when they went away, but at some point along the way, along came EXRs, and things got all complicated.

Now at my brand new gig, I’m the “end of the line.”  Has a nice ring to it!  And guess what?  DPX is back in vogue.  Allow me to sing its praises.

For live action footage, there is no better.  Nuke absolutely eats a DPX for breakfast.

And I’m not talking Portland brunch…   where you cruise into Tin Shed on Alberta, put your name down, grab yourself a cuppa the “free” Joe only it’s not really free because you’ll be billed for it in the event that you wait long enough to finally sit down and your server sees that you have the cup and marks you down for 1 stumptown special, but let’s not even go there yet because you’re still sitting there waiting but you stick it out because it’s like you brought your dog and Tin Shed is like the only place that really allows dogs because they have the outdoor seating and all that but it’s like sunny and so it’s crowded but hey it is SUNday after all so like whatever I’ll just sit here for what time is it? is that even morning anymore?  and wait until they call my name and there’s not really much else within biking distance except hey maybe Juniors is my backup plan but I’d have to tie the dog up out front and the food is good and all but it was kind of dirty in there last time I was there or at least my cup had spots on it and my water was kind of, I don’t know, dusty?  so like I should probably just wait it out but there’s like 3 other 2-tops ahead of me on the list, like when did this place turn into Gravy anyway I thought Tin Shed was like normal but whatever it is like the best thing Alberta has so I better just suck it up.

Ok, that diatribe you just went through?  That was an EXR.  DPX would have you out the door and smiling already while EXR is still loading in scanlines.  If you want to talk in brunch terms, this is like Gravy vs. Equinox – am I right?  Ok, maybe I should explain that ref for any non P-towners who aren’t familiar with our obsessive brunch scene.  Gravy over on Mississippi Ave is hands down one of the best brunches in town.  The best french toast in the city – maybe the galaxy.  But you wait for it.  It’s a mob scene even on a weekday.  In other words, it’s great but it’s the SLOW boat.  However, right around the corner from Gravy is Equinox.  Which is a damn good brunch – up on par with Gravy for sure.  The difference being it’s tucked around the corner, just enough off the beaten path where you can walk right in and sit down immediately.  You’re done and moving on with the rest of your day while the Gravy folks are still standing outside, tapping feet, shooting glances to watches, pondering how much longer before they just…  bail.

Don’t be afraid to bail on EXR when the circumstances permit.  Proof?  You want proof?  Get this – for a plain Jane little 3 node comp render of 50-some-odd frames in Nuke (in > blur > out) with footage shot originally on a RED Epic @ 4k, here are the stats:

EXR (zip scanline) – 2 minutes 57 seconds

DPX – 10 seconds.

No @$#% kidding.  10 seconds.  I mean, I had a feeling DPX would be faster, just from working with them time and time again…   but I’ll admit that had my head spinning.  I’m not even sure how to explain this because it doesn’t add up – there is nothing apparently different about this machine I’m on…  there’s no DPX equivalent of a Red Rocket card or anything, and no GPU accelerated nodes in my script.  I’m scratching my head – how is this possible?  This box does have an SSD but I was sure to clear the caches before rendering, so it should be apples to apples.  I ran it twice and got similar render times.  The sys admin wizards at Weta Digital might have some sort of caching enabled on Linux here that is somehow locally stashing the DPXs in RAM, but not the EXRs.  Hmmm.  That gives me more questions than answers!  Mental note:  never ever leave Linux.
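If you want to run a back-of-the-napkin test like this yourself, here’s roughly what an “in > blur > out” comparison looks like driven from Nuke’s Python API – the paths, frame range, and blur size below are hypothetical:

import time
import nuke

read = nuke.nodes.Read(file='/plates/test_plate.%04d.dpx', first=1, last=50)
blur = nuke.nodes.Blur(inputs=[read], size=10)
write = nuke.nodes.Write(inputs=[blur])
write['file'].setValue('/renders/test_out.%04d.dpx')
write['file_type'].setValue('dpx')

start = time.time()
nuke.execute(write, 1, 50)                                  # render frames 1-50
print('Rendered in %.1f seconds' % (time.time() - start))

# Point the Read/Write at EXRs instead to compare formats, and clear any disk
# caches between runs so the comparison stays apples to apples.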

Anyway, a 2nd opinion is in order, good doctor.  Let’s try something a bit more scientific and I’ll shoot this to the renderfarm.  I’ve switched shots and have a 2k shot that originated on film, not that it really matters once you’ve made a DPX and an EXR.  Kicking these off now and will wait for that juicy stats email…  looking for total time elapsed and I’ll get you a per frame CPU time (minus batch load time).

EXR (piz):  6m 53s total – 7 secs per frame

EXR (scanline):  6m 38s total – 6 secs per frame

DPX:  4m 10s total – 4 secs per frame

This was done on a Sunday – as low traffic and open of a farm as Weta Digital gets.  Different versions were rendered within minutes of each other, and those Nuke renders should have had the blades all to themselves.  The bottom line:  33% faster rendering.

You can talk all you want about how Nuke uses EXR as its internal intermediate format and is floating point across the board.  Bla bla bla.  Look at the results.  That adds up to a heck of a lot of saved time over the course of the day, not to mention the interactive boost while working.  As long as your capture format is less than 16 bit – and if they are shooting digital (or even film), it is most definitely less than 16 bit – you simply cannot beat what I’ll call “integer love.”  The CPU just chews through it.  For grained live action footage shot with most cameras, DPX.  There is no substitute.

The RED Epic, with its claimed 11 stops of RAW dynamic range?  No problem.  The Canon 5D and its 14 bit sensor?  Easy money.  10bit log DPX handles as much dynamic range as 16bit linear because, kind of like an mp3 still sounds damn good (or maybe FLAC is a better comparison), log space will put all of the image goodness where it counts.  Your highlights?  Don’t worry, they’re unscathed and intact.
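If you want to see why 10 bits of log can stretch that far, here’s a quick sketch of the classic Kodak/Cineon transfer curve – treat the constants as the textbook defaults (reference black at code 95, reference white at 685), not as gospel for any particular camera:

# Classic Cineon 10-bit log-to-linear curve with the textbook constants.
def cineon_to_linear(code):
    black = 10 ** ((95 - 685) * 0.002 / 0.6)   # ~0.0108, the coded black offset
    return (10 ** ((code - 685) * 0.002 / 0.6) - black) / (1 - black)

print(cineon_to_linear(95))     # ~0.0  - reference black
print(cineon_to_linear(685))    # ~1.0  - reference white
print(cineon_to_linear(1023))   # ~13.5 - over 3.5 stops of headroom above white

All that highlight headroom lives in the top few hundred code values, which is how a 10-bit log file keeps up with 16-bit linear for camera material.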

File size you say?  Network traffic?  Also not a problem here.  10bit log DPX was 13 MB per beautiful grainy frame in this example (2k live action plate).  EXR/Piz could only come up with 14 MB.  That varies, but the time spent unpacking them doesn’t, so a small filesize advantage for Piz can often be ignored in favor of performance.

Some people will scoff at a raw DPX and its log encoding, saying it looks washed out and unviewable.  First of all, who are these people and why are you letting them anywhere near a vfx pipe decision?  Amateurs!  Ha!  The truth is, an EXR in linear colorspace is just as strange to view in its dark raw state.  You’re never going to view either of them raw and are gonna have to chuck a LUT on it either way, so this is a non-issue.  Worry not, Photoshop can still open it correctly.

Don’t get me wrong, DPX is not for everything.  There’s the whole alpha channel thing.  The DPX spec can handle it (as well as storing up to 16 bit linear), but a lot of software – After Effects comes to mind – still seems to be stuck on the old Cineon spec.  Cineon was what DPX was before it was DPX – .cin – and Cineon had no alpha channel support and was locked to 10 bit log.

CG renders especially will use all of the extra range goodness that EXR can offer up.  Over in EXR land, Piz compressed is the way to go for live action plates over slow networks, or large feature film projects that need oodles of storage.  If you can afford larger filesizes, Nuke likes a good scanline zipped EXR much better.  Single scanlines for live action, and often shops will write tiled for CG out of the renderer and then convert to ZipS as a post process.
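If you’re curious what that tiled-to-ZipS conversion pass can look like, here’s a hedged sketch using a Read/Write pair in Nuke – the paths are hypothetical, and the exact compression label may differ slightly between Nuke versions:

import nuke

src = nuke.nodes.Read(file='/renders/cg/beauty_tiled.%04d.exr', first=1001, last=1100)
dst = nuke.nodes.Write(inputs=[src])
dst['file'].setValue('/renders/cg/beauty_zips.%04d.exr')
dst['file_type'].setValue('exr')
dst['compression'].setValue('Zip (1 scanline)')   # single-scanline Zip; check the label in your build
nuke.execute(dst, 1001, 1100)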

Really, the big revelation with EXR was embedding multiple passes into one file, and knocking things down to “half-float” for all of the calculation and storage advantages.  Finally, floating point was worth the price of admission.  Historically, TIFF could always do float precision, but no one used it because the extra usable range came at such a rendering and disk space cost that it was overkill 99.997% of the time.  In fact, at Weta prior to the EXR revolution, we were using liff (essentially a log IFF) as our primary format for CG renders, for many of the same reasons that DPX works so well.

When EXR 2.0 is finally released, it will up the ante, and you’re going to start seeing more standardization of deep image formats for CG – without the holdout and layering problems that those near-useless per-pixel Z passes always had.

But for live action plates (in most cases*), viva la DPX!