Foundry Live 2021 | Empowering Artists with Nuke 13.0


- [Joyce] Hello, everyone. Welcome and thank you for joining Foundry Live Empowering Artists with Nuke 13.0. My name is Joyce, industry marketing manager here at Foundry and I hope you're doing well and staying safe wherever you are in this world. If you're new to our virtual events series, welcome, and I hope you enjoy today's session.

I want to run through a few updates and some housekeeping items before we dive straight into the session. The session will be recorded and available on demand straight after the webinar, and the session will end with a Q&A. So if you have any questions, please add them to the questions tab on the right-hand side of your screen. I'd like to take this moment to thank our sponsors, Lenovo and AMD. We are working with Lenovo, AMD and our other partners as part of our new program to test selected third-party workstations across all of our products.

So thanks to our sponsors, we are excited to give away a ThinkStation P620, a Lenovo and AMD Threadripper system which includes an Nvidia Quadro RTX 5000, to one lucky attendee. To be eligible for this giveaway, you need to opt in upon registration. If you didn't opt in, no problem at all; you can sign up to one of the three upcoming Foundry Live events and you'll automatically be entered into the competition.

So you still have time. We started Foundry Live last summer as a way to connect with you virtually and share product updates and releases. This year we're excited to come back for part two, and our schedule includes the Modo 15.0 release,

which is due next week, and Foundry's research team will be talking a little bit more about key forces and technologies driving change in the VFX industry on Thursday. And we will wrap up Foundry Live with an education summit at the end of next week. For more information, and to register, please visit our Livestorm page or the Foundry events page. This year we are also taking part in several industry virtual events. We recently participated in the HPA Tech Retreat and Spark FX, but be on the lookout for Foundry at the virtual FMX, GTC, Real Time Conference and SIGGRAPH. I'd like to take this opportunity to mention that Foundry is actively involved with the Academy Software Foundation, as we use many of their open source projects that are widely used in production.

We have representation on the board and the outreach committee, and are actively involved in the technical advisory council and various working groups. To be successful, the Academy Software Foundation needs community members like you to contribute code, get involved and help spread the word about the foundation. So make sure to visit their page for more information on how to get involved.

Foundry has many learning resources available on our learn page, including tutorials, developer documentation and release notes for all of our products. As part of our ongoing initiative to provide valuable learning resources, Foundry is excited to launch the Color Management Fundamentals and ACES Workflow training, created by Victor Perez and Netflix. For more information on the training material, join our webinar on March 31st. Stay connected with Foundry by subscribing to our YouTube channel, check out our insights hub where we host a lot of our case studies, articles and industry trends, or follow us on social media. I'd like to give a massive shout out to Daniel Smith, who has recently launched his book the Nuke Codex, and we're giving away three books today for free.

So no action is required from any of you. We will pick winners from the audience and send you an email. So stay tuned and keep an eye on your emails.

So with that, thank you all for taking the time to join today's webinar. And I hope that you enjoy it. And if you have any feedback, please feel free to reach out to virtual.events@foundry.com. And now I'd like to pass it over to our speakers. Welcome guys.

- Thanks Joyce. And thank you all for joining us today for this preview of Nuke 13.0, our next major version. So I'm Christy, I'm the senior director of product at Foundry, looking after product management and product design for the Nuke family and Flix.

- Hi, I am Ben, I'm the research engineering manager and the lead on the AI research team. - Hi everyone. I'm Chris Wetherly. I'm an associate product manager working on some different features, a lot of viewing ones just at the moment. - I'm Juan Salazar, I'm the senior product manager and I oversee the timeline products: Nuke Studio, Hiero, Hiero Player and Flix.

And with that, I think we're ready to start. So let's go for it. [Juan] Welcome to Nuke 13.0, everyone. This release is the next step in our efforts to empower you, the artist, providing new tools to help you create the amazing work you do.

Our main focus has been on helping you keep your eyes on the images, so you can do what you do best and create amazing visual storytelling. To do this, the first thing we wanted to introduce today is directly related to how you look at your work. In this release, we have extended the monitor out features and unified the system in Nuke and Nuke Studio. This means artists in Nuke can more easily view their work on a secondary monitor, even if they don't have access to a monitor out card. Now you have a new floating window and its independent output transform controls, so that you can view your images accurately on a separate display.

This means you can have a working viewer and a display viewer set at the same time. For artists working with Nuke Studio, you can now seamlessly switch between the timeline and the compositing workspaces with no disruption to your monitor output. We have also included new controls and preferences to give you greater options over what you want to display and how.

An example is the ability to adjust the gamma and gain of your viewer, which you can toggle to be applied or not to your secondary screen, or the ability to have the input process applied or not. As well as all the above enhancements for Nuke users, stability, reliability and usability have all been greatly improved in the Nuke Studio workflow. And we have taken our first steps with HDR workflows directly on the main viewer, with the ability to display HDR images on XDR and EDR enabled monitors on macOS. Now, we also make a lot of creative decisions in 3D space, so we are happy to introduce the first step in our work to improve the 3D system: we have added Hydra, using hdStorm, as the new 3D viewer.

Now you can display a higher fidelity representation of your 3D scenes. This means artists can make artistic decisions inside of the 3D system, rather than always having to switch back and forth with the ScanlineRender. Hydra supports nearly all existing workflows you are familiar with, such as moving geometry, particle systems and projection setups. But the real power of Nuke's new Hydra viewport comes when working with lights and materials in the 3D system. Nuke is now able to display lights, shadows, materials and textures much more accurately than before. Here's an example of the current 3D viewer and the new Hydra viewer, and how much more accurate it is to the final ScanlineRender output.

As I mentioned, Nuke's Hydra viewer is the first in a series of projects we have planned to update the 3D system, addressing both performance and user experience, and we know that there is a lot to be done. We believe that this is a good first step that will allow artists to be more creative and more efficient when working with the 3D system. But what better way to use Hydra than improving our USD support? In Nuke 12.2, we introduced USD geometry reading in our efforts to reduce pipeline friction and keep artists focused on the images.

We are introducing the import of USD cameras, lights and axis data into Nuke native nodes. Like the Hydra work, this is only the start in making Nuke a more collaborative space, and we are excited to continue the development of USD to allow for greater sharing between DCCs using non-disruptive workflows. We have also updated our USD version to 20.08 and kept this in line with all the products to make collaboration on pipeline workflows easier. For those waiting to get more out of USD in Nuke right now, all the extensions to the USD nodes are being open-sourced so that pipelines can further extend and customize these nodes for their unique USD setups.

And finally, on this first section about looking at your images, we also look at images with our teams. With Nuke 13.0 we are happy to release Sync Review. We actually introduced Sync Review in Nuke 12.2

as a beta feature, as we wanted to support artists who suddenly found themselves working remotely and in need of tools that allow for collaboration at a distance. Now we are releasing it with more features needed to enable teams to collaborate and continue working together towards a shared vision of the final image. In the initial release in Nuke 12.2, users had access to the playback controls, annotations, version systems and soft effect parameters.

And now in Nuke 13.0, we have expanded the ability to sync all the actions needed in a review session. This means users can now make changes in the timeline, import footage, and create and modify new soft effects. This means you can have prep or ad hoc sessions, either to show a simple color change using soft effects or to work on the timing of an animation using the real-time tools in the timeline. These sessions can be just between a couple of members, or with an unlimited number of users seeing the same thing at the same time. We have also enabled annotations in Hiero Player to make sure everyone can communicate together.

Sync Review, like many of our features, is about listening to our users' needs and trying to build tools that allow artists to work better and more efficiently. The next topic is something that I think will change how we do a lot of work, but I'm going to let the person who has led this work present it to you. So next, Ben, it's all yours. - [Ben] Thanks Juan. So I'd like to tell you all about the exciting AI Research, or AIR, plugins, which form our new machine learning toolset available in Nuke and NukeX.

The headline tool is called CopyCat, and we hope it's going to revolutionize the way you work. CopyCat, a NukeX plugin, allows an artist to train their own neural network to create a customized effect, specific to the sequence or set of sequences they're working on. So how does that work? Well, the artist feeds CopyCat just a small set of before and after images of their effect, and CopyCat trains a neural network to replicate the transformation to get from one to the other. It then exports the trained network as a .cat file

which can be loaded into our next tool, the Inference plugin, which can apply the effect to the rest of the sequence. This way, CopyCat allows the artist to work at scale much quicker. You produce just a few frames the old-fashioned way, and then you can use the trained network on multiple similar sequences, potentially even across whole shows or series.
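For those curious how this chains together in a script, here is a minimal sketch of the CopyCat-to-Inference workflow using Nuke's Python API. The node class names come from the session; the knob names used below are illustrative assumptions, not the documented interface.

```python
# A minimal sketch of the CopyCat -> Inference workflow from Nuke's Python API.
# nuke.createNode and knob access are standard; the knob names below
# ("dataDirectory", "modelFile") are assumptions for illustration only.
import nuke

# CopyCat takes before/after training pairs from its image inputs and writes
# its checkpoints and the final trained network out to a directory.
copycat = nuke.createNode('CopyCat')
copycat['dataDirectory'].setValue('/jobs/show/shot010/copycat')  # assumed knob name

# Once training has produced a .cat file, Inference applies that network
# to the rest of the sequence (or to other, similar sequences).
inference = nuke.createNode('Inference')
inference['modelFile'].setValue('/jobs/show/shot010/copycat/cleanup.cat')  # assumed knob name
```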

It's important for us to mention that we're not trying to replace artists with machine learning, and the results are only going to be as good as the data you feed in. We want CopyCat to empower and assist you to do the things you love, and not have to spend time on repetitive, boring tasks. I'm sure you're wondering by now what effects CopyCat can do. Well, we've seen it work with beauty work and cleanup, garbage matting, marker removal and fixing focus issues, but really it can do any image-to-image task, and we can't wait to see what it can do once it's in artists' hands. So let me show you some examples. First, I'm going to show you how you might use it for garbage matting.

Let's say we wanted to mask out this man in the desert. The whole shot is hundreds of frames, but we're going to roto out just six manually, picking frames where he's in different positions so as to capture the variation in the sequence. We plug these before and after images into the CopyCat node and hit start training.

Then CopyCat will go off and train the network. Now this might take a bit of time depending on your hardware. So you might want to set up training before going to lunch or going home for the evening. Alternatively, you can just run a new instance of Nuke while it's processing.

And here's the result. Hundreds of frames of pretty good quality garbage mattes, with just six frames manually rotoed and a half hour or so of GPU time. Next up, we have a beard removal example with Foundry's own version of Henry Cavill. We've painted out his beard in 11 frames.

And as you can see, it's done a pretty good job of applying the cleanup to the entire sequence, which is again hundreds of frames long. So on top of CopyCat and Inference, we're also shipping a couple of pre-trained tools. The first of these is the Upscale node, which uses machine learning to upscale footage by a factor of two. And here we've compared it to the TVIScale node.

And as you can see, Upscale has done a much better job of bringing back high-frequency details and edges. We also have the Deblur node, which removes motion blur from footage. In this sequence, we've stabilized a wobbly shot.

And as we zoom in, you can see that, like magic, the bookshelves have been deblurred. As a final note, both of these pre-trained tools are available with a standard Nuke license. Thank you all for listening. And now I'm going to pass you over to Chris.

- [Chris] Thanks, Ben. Okay. Yeah. So something that I want to start on is how an already much loved tool is coming to Nuke 13.0. And I'm talking about Cryptomatte. So in Nuke 13.0, Cryptomatte is now a native node inside of Nuke

and you can find the Cryptomatte node inside of the keyers menu. This implementation includes an updated UI with a new vertical matte list to make viewing your selected mattes much easier, and the ability to select the Cryptomatte manifest whether it's embedded in the input image or contained in a separate sidecar manifest file.

We also have the introduction of wildcard functionality. So you can perform your selections as usual, but if you use an asterisk, then you can make more sophisticated matte selections inside of the Cryptomatte node, giving you real flexibility and power in what you're selecting. We've also integrated Python 3 support, so it can work with your ongoing productions. And having Cryptomatte inside of Nuke really means you no longer have to download it as a third-party gizmo. It also means that going forward we can develop this tool further, so that artists can keep getting more from the workflows they utilize most. So one we're really happy to see inside of Nuke 13.0.
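As a rough illustration of the wildcard idea, here is how a matte selection might be set from Python. The "matteList" knob name follows the original third-party Cryptomatte gizmo's convention and is an assumption for the native node.

```python
# Illustrative sketch: wildcard matte selection on the Cryptomatte node.
# The 'matteList' knob name is assumed from the original gizmo's convention.
import nuke

crypto = nuke.createNode('Cryptomatte')
# An asterisk matches every manifest entry with that prefix, so one expression
# can select all the trees and bushes instead of picking each matte by hand.
crypto['matteList'].setValue('forest/tree_*, forest/bush_*')
```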

Okay. So, performance. Performance is something that has been at the heart of Nuke development for a number of releases now, and with Nuke 13.0 we really want to continue on this theme. So, on average when rendering scripts, if we just look back, Nuke 12.1 was 18% faster than 11.3.

12.2 was 15% faster than 12.1. And Nuke 13.0 is looking to be on average 10% faster than Nuke 12.2. And all of this is with improved thread scaling. So what does this mean for you as an artist? Well, it means that you can see your scripts loading faster; when working with things like transforms on complex node graphs, you can see those results faster; and also playback.

Playback inside of Nuke, because you're dealing with large image data input, is very difficult to make real time, but even having it perform a bit better gives you quicker feedback and a more instantaneous ability to make your creative choices. So as I said, performance with 13.0 is something we want to continue building upon and bring to other areas inside of Nuke as well.
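Those release-over-release figures compound; a quick back-of-the-envelope check, assuming "N% faster" means throughput multiplied by 1 + N/100:

```python
# Compounding the quoted release-over-release rendering gains,
# assuming "N% faster" means throughput scaled by (1 + N/100).
gains = [0.18, 0.15, 0.10]  # 11.3 -> 12.1 -> 12.2 -> 13.0

speedup = 1.0
for g in gains:
    speedup *= 1.0 + g

print(f"Cumulative speedup vs Nuke 11.3: {speedup:.2f}x")  # ~1.49x
```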

Now, keeping Nuke as a dependable cornerstone of your VFX pipeline is something we take great pride in, and we know this can only happen if we support the library and SDK updates you need. As such, with Nuke 13.0 we're continuing our support of the VFX Reference Platform, and for the 2020 platform this includes quite broad and significant upgrades that see Nuke using some of the later versions of technologies, things like OpenEXR 2.4.2 and Python 3, and we know how big an update that is. We're also including file format and SDK updates for Avid and ARRI, and we've extracted these into separate plugin packages so that you can access these file formats for the release you're working on, rather than only when you upgrade.

So hopefully you've seen some really interesting and exciting features in Nuke 13.0 already, but I'm going to hand it back to Christy now, who's going to show you some other things that we've been working on. - [Christy] Thanks Chris, Ben, and Juan for walking through what's new in Nuke 13.0. Nuke 13.0 kicks off a new series of releases, and development is already underway for the next Nuke 13 release, as well as looking ahead to Nuke 14.0.

Of course, we will be iterating on the machine learning toolset, Hydra support and the other features introduced in Nuke 13.0. However, today I wanted to give you a glimpse into some other projects we have lined up for the next release cycles that we haven't mentioned yet today. The first of those is color. So color management, and understanding color, is increasingly important for pipelines and artists.

Most studios and DCCs, including Foundry products, now use OpenColorIO as their main color management system. The recently released OpenColorIO version 2 offers many potential improvements for color management in Nuke. We're most excited about the improved consistency between processing on the CPU and the GPU.

We hope we'll be able to use this to enable a more consistent experience between viewing images in Nuke Studio and Hiero, and viewing them within Nuke. We've already been working on integrating OpenColorIO v2 into Nuke, and we hope to have an alpha of Nuke 13 with OCIO v2 support available for initial testing very soon. Improving the experience of working in Nuke, especially in the 3D system and the Node Graph, as well as the Nuke Studio and Hiero timeline, is an area of focus for us for the next releases. So here I have a quick preview of some early designs and prototypes to show the kind of ideas that we're thinking about.

This first one is going to be familiar to anyone that's been in compositing for a while, and that is the ability to shake to disconnect nodes in the Node Graph. So here we have disconnecting a single node, disconnecting multiple nodes in a stack together, then reconnecting them. You can also select nodes at different locations; here we're disconnecting these two transforms. And when you use shake to disconnect, it maintains the structure of your graph.

It's pretty cool. The bit in the middle there, where it's reconnecting a stack of multiple nodes, is something that we've actually added in a recent maintenance release of Nuke 12.2, so it's something you can go play with now.

This next one here shows resizing a backdrop node from all four corners. So on the right, we have an example that was built for Mari. We're looking to do the same interaction in Nuke, with of course the Nuke-style backdrop node. Last but definitely not least, we have the ability to visualize the internals of a group, live group or gizmo within a script. The aim here is for artists to be able to quickly understand what's happening inside the group and where it exists in a broader hierarchy.

You know, especially as live groups are being used more, it's becoming increasingly common to have groups in groups in groups; group inception, essentially. So here in this prototype we have the ability to toggle on and off expanding and hiding the contents of the group.

Here you can see that within this visualization you can move the nodes around into a layout that makes sense to you and that fits within your script. We do know that everyone structures their scripts differently, so there is some control over where that visualization is displayed. So here we're going to show toggling between it being on the left side or the right side of the node. And then here we have a multiple hierarchy of groups.

So we have a live group with a group inside it, and then another group inside that one. And here you can see how quickly and easily it visualizes what's happening at those different tiers of groups. All right, so these ideas are the tip of the iceberg of what we could do for user experience improvements inside Nuke. We have been referencing feature requests submitted over the years, but we want to keep hearing from you on what you need now. So please keep sending in those feature requests via support; we do look at them, and we hope to use them to influence these projects over the next development cycle. Today, we've been talking a lot about working within Nuke.

And to finish up, I want to talk a little bit about how we're improving the way that Nuke works with other packages. We see Nuke being used throughout pipelines, sometimes as a secondary tool for tasks, matte painting being one example, as well as a tool for reviewing work with simple slap comps. So we're working to make it easier for artists across disciplines to work within the context of the final comp, so they can work more efficiently and collaboratively. It's also worth saying this is an effort across the products at Foundry, and not something that's unique to the Nuke team. The first place we're starting is with the interaction between lighting and comp. These are sister disciplines; you often have artists using Katana and Nuke side by side.

This video shows a proof of concept created by the Katana team. They also showed this yesterday in the lighting and lookdev session, so if it looks familiar, that's where it came from. So here in this video, we have a live render from inside Katana using the Foresight rendering system introduced in Katana 4. Pixels, as they're rendered, are being streamed into Nuke, where we have a relatively simple Nuke comp with some color correction and a defocus.

When that comp is processed, pixels from Nuke are streamed back into Katana, where they create a new catalog entry and can be seen in the monitor. All of this is happening live while you're doing your live rendering within Katana. So as the lighter, you're working more directly in the context of the Nuke comp from within Katana. Again, this is a very early proof of concept, and there's a lot more that the team has planned for making the workflow between Katana and Nuke more seamless and scalable. So watch this space over the next few months.

Game engines, and in particular Unreal, are being used more in animation and VFX for previs and virtual production, as well as for final-pixel rendering in some cases. So we're looking at how we can streamline bringing data from Unreal into Nuke to accelerate comp, and to enable faster creative iteration between Nuke and Unreal. So this video here shows a proof of concept created by Foundry's research team, bringing data from the Weta Meerkat project in Unreal into Nuke. So here, we're going to use the new server plugin to enable a connection between Unreal and Nuke, and then use the UnrealReader node to choose which map and sequence to view from the Unreal Engine project.

You can choose either an individual shot sequence or a master sequence. So here you can update multiple actors in Unreal and quickly view the changes in the Nuke comp. Within Nuke, we can select the render passes to bring from Unreal to use in your comp. This gives you the ability to bring in what you need for your work, as you need it. There are also some advanced settings which allow you to optimize the Movie Render Queue render output for your needs. And lastly, you can create a linked camera to allow you to comp new elements into the scene, visualize the scene through point clouds, and composite using Nuke's 3D system.
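As a speculative sketch only, the workflow just described might look something like this from Nuke's Python API. The UnrealReader node is named in the session, but it was shown as a proof of concept, and every knob name below is a placeholder assumption rather than a documented interface.

```python
# Speculative sketch of the UnrealReader workflow described above.
# The node class name comes from the session; both knob names below
# ('sequence', 'renderPasses') are placeholder assumptions.
import nuke

unreal = nuke.createNode('UnrealReader')
unreal['sequence'].setValue('MasterSequence')     # assumed: pick a shot or master sequence
unreal['renderPasses'].setValue('beauty, depth')  # assumed: choose which passes to stream
```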

If you want to learn more about the Nuke and Unreal bridge, as well as what's next in machine learning across Foundry, I definitely suggest that you join the research team session on Thursday at the same time; you can sign up in the same place you registered for this one. I definitely recommend that you check it out to hear more about these projects and what's next. With that, let's turn it over to questions. All right, there we are: Nuke 13.0 and some bonus features for you all. Great.

Shall we do some questions? - Yeah, we have quite a lot, and all the messages have been great. So it's good to hear that everyone has got excited about so much stuff. I'm going to start with this one, which was the first question at the top of the whole thing: does Hydra have real-time ray tracing and RTX support? - Yes, this one's me.

Okay. So I think the key thing about Hydra, and what we've added in Nuke 13.0, is Hydra with hdStorm as a viewport renderer. It's one step towards updating the 3D system in Nuke, and we'll see more improvements over the next releases, but it is important to note with this one that essentially it improves the experience of working in the viewport and the image that you're seeing.

Hydra with hdStorm is GPU accelerated, but it's not a real-time renderer. And this also doesn't change the processing in Nuke's Node Graph itself; it's really limited to what you see in the viewport. Anything else you'd add to that, Juan or Chris? - Nope. - I think that summed it up perfectly. - Juan, seeing as you were willing to volunteer Christy for a question, I'm going to volunteer you one.

Would you mind clarifying how Sync works? - Yeah, so with Sync Review you can connect through a port and IP. So you need to be able to see the machines, either on the same network or, if you're on different networks, you need to be able to actually see them through VPNs. And then the footage has to be accessible for everyone in the session, so either through a centralized network, or everyone having it on their own machines in their own locations. And these sessions can be with, again, as many users as you want, or as many as your network allows, and it's multi-OS, so not everyone has to be on the same OS.

If you are on different OSs, you need to make sure that the path mappings are set correctly on each side, but that's about it. - Cool, thanks Juan.
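On that last point, here is one hedged way the cross-OS path mapping could be scripted, using Nuke's filename filter callback from an init.py (Nuke also exposes path remapping in its Preferences; the share paths below are invented for the example):

```python
# A sketch of cross-OS path mapping via Nuke's filename filter callback,
# e.g. for a Sync Review session spanning Windows and Linux machines.
# The share prefixes here are made up for illustration.
import nuke
import platform

def remap_for_this_os(filename):
    # Rewrite the shared-storage prefix to whatever the local platform mounts.
    if platform.system() == 'Windows':
        return filename.replace('/mnt/projects', 'P:/projects')
    return filename.replace('P:/projects', '/mnt/projects')

nuke.addFilenameFilter(remap_for_this_os)
```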

There was also a question about OCIO 2.0 being included in Nuke 13.0. I can't push it to the screen, because Nigel already answered it in chat, but just for those who are interested: it will be in an alpha of Nuke 13.1,

but not in 13.0 straightaway. Okay. We have a question from Louis asking: will all these new features come to Nuke Indie? I'm not able to push this to the screen at the moment.

Maybe someone else can. - I think, Chris, you have to cancel the one that is on the screen. - All right, I clicked minimize rather than don't answer it.

I apologize. Yeah. Will all these features come to Nuke Indie? - The answer on this one is yes. The one caveat is that when you use CopyCat to train models in Nuke Indie, those models will be encrypted so that they can only be read in Nuke Indie. So you won't be able to use Nuke Indie for training that's destined for Nuke, but you should be able to read in commercial Nuke-generated cat files, and generate your own for use in Nuke Indie.

- Great. Okay. Juan's already answering this one in the messages, but I actually think it's worth pushing to the screen: does CopyCat have the ability to use distributed calculation amongst multiple machines, or are you tied to one machine? Ben, I think you're best for that. - Yeah, currently you are tied to the machine you're running it on. It's something we've thought a lot about and looked at internally.

It's actually a very difficult thing with a lot of the GPU machines, and there are other bottlenecks that come in, like the pipeline architecture, which we're looking at for the future. (indistinct) We'll keep updating that.

- There was another question I saw come in on the chat. I don't know if it made it to the questions, but it's related. And that was: do you need access to a cloud-based network, or an always-on kind of setup, when running CopyCat? - No, because it's on the machine that you're running on. - Yes. So one of the strengths of CopyCat is it's all contained within Nuke on your machine.

You don't need any kind of outside network to use it. - I had another question which was in that space: do you have plans to allow supplying custom models to Inference? - We do.

Yeah, it's definitely on our roadmap. There will be some caveats to it; it probably won't be all models (indistinct). - Yeah. So a further one, an important one actually, I think, to note: do the AIR nodes work with Ampere graphics cards? - They do work with Ampere graphics cards. There's a slight caveat in that the very first time you run it, it will take about 20 minutes to half an hour, just once.

And then it will work. - Is anyone finding any other questions they want to jump on now? - Since we're on the roto topic, we should probably clarify this, right? CopyCat doesn't output splines, it outputs the image, the output image... - Yeah, so CopyCat isn't a direct replacement for AI rotoscoping, and we are only positioning it as outputting garbage mattes,

not hero-quality roto. That said, roto is something we're very interested in, and we do have a research project, SmartROTO, which is looking exactly into this, but it's a really hard problem. So I wouldn't... just, yeah.

This one's a big one with a lot of votes: is Python 2.7 still somehow supported? Migration would be fun otherwise.

- So in Nuke 13.0 Python 2 is no longer supported. It's a full move to Python 3. We do know it's quite a big change and that it may take a little while for studios to move. We are planning to support Nuke 12.2,

including that file IO SDK package, which lets you update SDKs outside of a major release. That's going to be supported all through this year to help ease that transition. So yeah, we know it's a big one. - And we have added Sync Review and the annotation exports into 12.2 as well. - Yeah, good point.

- Sorry, the questions list is just full of really good ones. It's hard seeing which one should come next. So Juan, if you have one, feel free to flag it.

- Sure. One second. I'm having some technical issues for a second. Shall we do this one, because it's the most basic, direct one? When is Nuke 13.0 going to be released? And the answer to this one, as you might expect, is very, very soon. And in this case, I think we mean very, very soon.

So keep an eye on your inbox and your social media for this release very shortly. - Yeah. We also have one that follows up on some of the questions Ben was answering: are the models from the AI training shareable? So CopyCat will output a cat file, and you can then share that with your colleagues. So you load that model in the Inference node and, yeah, you can share cat files around. You can even start your training from a cat file.

So you can learn on top of what someone else has learned, to speed up your training and build bigger and smarter models. - And then we had a question regarding how big the cat file was for that example, but also just in general: how big do the cat files get? - I can't remember exactly off the top of my head, but they're pretty small. They're in the low numbers of megabytes, I think around 10 megabytes.

- Fantastic. So shifting onto some of the other features, we had a question regarding the interop with game engines: will Unreal work with other render engines streaming into Nuke? - That is a great question. So that proof of concept uses Unreal, I think because Unreal is used quite a lot right now,

for us to bring essentially AOVs and passes into Nuke. That's where we're starting. I think our view on this is really that it's all about getting data into Nuke, and whether that comes from Unreal, or it comes from another game engine or any real-time renderer, ideally our long-term vision is to enable anything that you would want to attach to be able to be attached. - Great. We also had a question regarding the expandable groups: can we work with nodes in the main graph inside of the group? So obviously these are all early concepts. Okay. We also had a question regarding

Apple M1 chip support: is this planned, and if so, what's the potential timeline? - Yeah. So support for Nuke on Mac is definitely important to us. I can see in the chat there are a lot of Mac fans in this group. We definitely want to keep supporting Mac, and we are looking at Apple M1 support.

I mean, there's not a plan for it coming into 13.0, but hopefully it's something we can enable in an upcoming release. - And along the theme of Mac releases: will you use Metal like other software? - It's a very specific question. So, I mean, of course our goal is to get Nuke to run performantly on Mac, and the Apple M1 is a whole new architecture and essentially operating system, so it's a lot of work. And while we're likely to use some bits of Metal from the very start, we're likely to take a path that starts with emulation, then moving to native support later.

It's a very specific answer to that one. - Great. This one's got quite a number of votes: can the CopyCat node be sent to a render farm? I don't know if we touched on this already. - So no, you can't send it. It will be on the machine.

- I think it's important to note that CopyCat, as in the training, is not on the render farm, but if you apply an Inference node and you're rendering your script, it's just a normal script in that sense. - Yeah. That's good. - Oh, this one's interesting. Sorry, Juan, did you want to continue answering? - No, go on.

- All right. This one was interesting: is the connection between Nuke and Unreal two-way? - So that is a great question too. Again, this is just a super early proof of concept. You should all come on Thursday to get the full rundown on this project.

Ideally, we would be able to push things back from Nuke, to be able to manipulate that scene in Unreal and then trigger the renders back out to Nuke, and get that really tight loop without jumping between them. That doesn't exist in that proof of concept today, but it is something the team is thinking about for the future, or at least the idea of it. - If anyone has any others they want to flag? - Yeah, since we're on that topic, I'll toss this one in: anything on virtual production? - Well, actually, I think if you look at the interop with Unreal, and all the work on viewing Nuke output on different devices, there's definitely a thread of being responsive to that. So it's definitely an area that we're interested in. If you're working on a lot of virtual production projects, we would love to talk with you about what you need and what we could be doing in Nuke and in other Foundry tools to help you. So it's definitely an area we're looking at.

It's nothing that we can show you today, but we do want to get in touch with you and hear what you need. - I don't know if we have the right people on the call to ask this one, but it was an interesting one that got flagged: will Cryptomatte data be available for 16-bit EXRs in the future? - Got it. - Don't know this one.

- That is one that I will flag to Anna, who's the product owner on this. So I won't be able to answer that one for you now, but I will look into it for you, Chris. - Great. What else have we got? - Well, that one's going back to... could I take that? No. - Yeah. It's the hotkey editor. - Yeah, no, that's a different one.

- To answer that: the answer is maybe. It is something that we've looked at. There's a challenge in Nuke, with how many hotkeys and how many contexts there are, in trying to let you edit hotkeys. But definitely that fits into that category of all those user experience updates, right, that are on our long list of things that we could do to improve working in Nuke.

So if this one's important to you, well, it's great to have it raised here with all those upvotes. That's awesome. But if there are things you want to do with this, or to make sure it's top of mind, don't hesitate to send in that feature request to support, because we are actively tracking all of those right now. - Oh, I'm just going to flag this one because it's a quick one, but someone missed the answer about the release date. It's good to make sure that one hits.

- Very, very soon. Very, very, very, very soon. Not like today soon, but very, very soon. - This one's an interesting one from Victor: what about AI grain, both denoise and regrain? - So that's not one of the current pre-trained tools; those were Upscale and Deblur.

It is something you could potentially train CopyCat to do. And it's something we've thought about and looked at doing as another pre-trained tool; it possibly might make the roadmap going forwards. - It seems we're back on CopyCat: will CopyCat and Inference need a NukeX license? - No, it will not.

You will need a NukeX license to change the cat file, but if the cat file is already set up in the script, then... - Yeah. So you need NukeX for CopyCat itself, for training and for setting up the Inference node, but then from there it can run in a Nuke script. And Deblur and Upscale are available with a Nuke license, so Nuke artists can use them without NukeX.

- We also have one which was: is it possible to use the machine learning for keying? - Again, there's not a pre-trained tool for that, but you could absolutely use CopyCat, working similarly to the garbage matting example that we showed, feeding it before and after frames. And I'm sure it would learn to pick out the key color. - Oh, I love the enthusiasm from everyone that wants it to release, like, now. (group laughs) - It's so difficult to pick between the chat and the questions; there's just so much text. I'm going to stick on the... well, this one skipped ahead because of the voting.

So now it's going to be answered: is there a plan to have the ZDefocus node working with deep images, like Bokeh? - It is something that we have investigated in different ways. So if you're interested in this one, really do talk to support, send in the feature request and get it higher up on the list.

But it's part of what we've been looking at as part of the new UX work and the new features. - Yeah. And the question I was going to raise is along the CopyCat theme: could it be used to remove CG render noise? There's obviously an interesting difference between live-action noise and render noise. We haven't actually specifically tried that, but it's the sort of thing it could definitely potentially do, if you render some frames at higher quality settings with less noise alongside the ones with the noise. I'd love to see someone try it.

- We have a question here; there have been a few questions about tracking, so one like this: any updates on the tracking features in Nuke? - We don't have anything specific on the roadmap at the moment. We have been doing different bits and pieces.

I think with CopyCat we'll be interested in how people use it on certain bits and pieces, and what you can try with that. But if you have any features specific to tracking that you're looking for, again, it would be great to hear from you and get those sent to us. - Yeah. Thanks Juan. - And I think we can put this one up.

So it's about color management and ACES. So yes, we know; we are actually doing a webinar quite soon about color management in Nuke, and we are putting some training out on this.

And as we have mentioned as well, we are working on the OCIO v2 integration, for which we're looking to have an alpha out very soon too. So there are a lot of plans around color management; we understand the importance of it and are really working on it. If you are interested, again, come to our webinar... though I now forget the date for the webinar, which I'm on.

- I think it's the 31st of March. - And it will be with Victor Perez who is also around on the chat. So it'll be very interesting.

And we can talk all about color. - Fantastic. - Someone's asking if they saw a flip/flop toggle built into the Nuke 13.0 viewer. I think this one might need a little bit of clarifying as to what the monitor out stuff does. - So this is on the monitor out.

So using either the floating window or an actual monitor card, such as a Blackmagic card, one of the options that you have from the viewer is to do a flip and flop, so you can review that on your actual review monitor. That's the one you saw. - We have an off-topic one, but: is that a Yamaha digital grand piano, Ben? (group laughing) - It is a Yamaha. It's not a grand, unfortunately. (indistinct) (group laughs) - We'll have to have a concert portion in a future webinar.

- Oh, this one's an interesting one. Will graph state variables come to Nuke? - That is a very exciting one. Such a tricky one to answer. So this is something that we are looking at.

Like, we do have, I mean, that groups-in-groups visualization prototype; that came alongside a bit of experimentation looking at multi-shot workflows and that kind of advanced live group setup using something like graph state variables. So I can't promise when that might come into Nuke and what that might look like, but it is something that we're looking at, because yes, it would be really powerful, right, to have that kind of context switching within Nuke. There's a good question here on future Linux support.

Can we do that one? This one here, Linux: what's going to happen next with Linux? So Nuke 13.0 is going to ship supporting CentOS 7, same as Nuke 12, and aligned with the reference platform. That's kind of where we are. Now, we do know that with CentOS support changing in the future, I think we, along with every other vendor and studio out there, are looking at the future of Linux and what distribution to support next.

If there are things that you all are looking at that you think are quite compelling, and where your studios are thinking of moving, then yeah, definitely, again, it's important to let us know, because we're investigating this now as well. - We have... no, that one's been answered. No worries.

We have a question regarding CopyCat and GPUs. So it works on Nvidia; will it be usable on AMD and macOS down the line? - So you're correct that it only works on Nvidia GPUs, and also CPU, at the moment. We're very keen to be supported on other GPUs and platforms.

There are a lot of third-party libraries involved, so we are hoping for some third-party help along the way to get things running on those. It's something we're very conscious of. - Yes. - There was a question regarding suggested hardware specifications.

Those will be available on the web page as soon as the release is live, so we recommend referring to the website for that information. - What about the... should we do the RotoPaint one? - Yeah. - Yeah, I can pop it up. - So yeah, I think this is one of those that has been asked quite a bit. It'll be good to get... again, we're working on a lot of the UX, kind of different bits and pieces like that.

So this is another one, but if you can send it to us, that'd be great. The team is looking at all of this, so yeah. - Yeah. Custom brushes is something that we've talked about

with the UX projects. I mean, yeah, in this release there's not an update to the RotoPaint node itself, and I think in our short-term plan there's likely not one, but definitely in terms of user experience it's something that we can look at improving. All right.

There are so many questions, I feel like, and they move up and down in the list. - Yeah. Apologies to those asking questions; we're trying to scroll through to find ones that will be beneficial to as many people as possible in the chat. Well, there's one here: any chance performance feedback could be made a little bit more accessible while you're working on a script, to help tune larger scripts?

If you're not already testing or using the Profile node, that typically gives a good indication of the performance of your script. But yeah, there are others; you can set the -P flag in your terminal, for example, if you're not already using those. So those are some things that I find useful when trying to profile my scripts.
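To make that concrete, here is a rough sketch of the scriptable side of Nuke's performance timers; it assumes the timers can be driven from Python as well as via the -P launch flag, and the exact keys returned by performanceInfo() are from memory rather than quoted from the docs.

```python
# Rough sketch: ranking nodes by wall-clock time using Nuke's performance
# timers. Assumes a Write node named 'Write1' exists in the script; the
# performanceInfo() key used ('timeTakenWall') is an assumption from memory.
import nuke

nuke.resetPerformanceTimers()
nuke.startPerformanceTimers()
nuke.execute('Write1', 1001, 1010)  # render a small frame range to gather timings
nuke.stopPerformanceTimers()

timings = []
for node in nuke.allNodes():
    info = node.performanceInfo()
    timings.append((info.get('timeTakenWall', 0), node.name()))

# Print the ten most expensive nodes first.
for wall_time, name in sorted(timings, reverse=True)[:10]:
    print(f'{name}: {wall_time}')
```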

- I feel I can answer this one, about getting Cryptomatte through the ScanlineRender: not at the moment, but it is something that we're looking into. And now that Cryptomatte is native, we're looking at all the ways that we can use Cryptomatte inside scripts and inside Nuke. So it's part of the things that we've been looking into. If anyone has anything else... I think that's it.

- We only have a couple of minutes left just as a heads-up. So if you see anything really important that you think should be answered, flag it. - Yeah. Well, I can address the pricing one here

really quickly, I think. Just to say, there are not any pricing changes with this release. I think, as always, we want to make Nuke accessible; we've been trying to do that with things like Nuke Indie. We're always trying to listen, to improve how we can change our packaging and how we deliver Nuke to you.

But there aren't any changes that we can talk about right now. - Oh, this one's a nice one from Victor that I'm just going to flag because I want it too: when are you going to be able to doodle on the script instead of making sticky notes? That's a fantastic feature suggestion. Those are the kinds of suggestions we're looking for from users; please contact us and send these through, as they're what's helping to inform things like shake to disconnect and the group expansion, all that artist-focused work.

So really, really nice suggestion. Okay. With that, it's 7:00. So I'm sorry.

I don't think we could get to everyone's questions. Is there anything anyone wants to answer before we head offline? - (collectively) No. - I think I just wanted to give one more plug for the innovation session on Thursday. That's the place to go and ask all those questions about the Nuke and Unreal interop and where that's headed, as well as what's next in machine learning in Nuke and efforts across Foundry. So definitely sign up for Thursday's session. - Yeah. And I think just a big thank you to everyone

who turned up for this and for all your feedback. This last year has obviously been challenging for everyone, so we really appreciate you being part of the Nuke family and coming and sharing this release with us.

- Yes. Thank you everyone. Really excited to have this one out there.
