Filmic is on EVERYTHING in the compositor (not just render layers)

https://developer.blender.org/D12481

edit: also this one https://developer.blender.org/D12520

will help with some issues, but not all

3 Likes

Oh! That was you! :wink:

Personally, I don’t care because I’m in a full ACES workflow, but I’m doing all this for my viewers, who don’t necessarily have an IO team that will convert, set up and pre-chew everything for them before they even open Blender or Nuke. And even if you convert your plate to EXR, that won’t fix the problem in Blender.

We did! Talk to me on my Discord server. :slight_smile:

2 Likes

Hello guys,

I hope you’re doing well! Thanks for the video, Bob, it was cool to see your question like that in “real time”.

Sorry to be a pain in the ass, but in your example, you set the IDT to “Output - sRGB”, not “sRGB”. It means that you use the Output Transform as an Input Device Transform. I would say this is a workaround more than a solution. But this is exactly what came to my mind when I saw your video.

So how can we do that with Filmic? Once again, by modifying/editing the OCIO config. Here is a little explanation about it.

As you certainly know, you can save a png/jpg from Nuke with “Filmic” embedded/burnt-in directly in the file, like this:

  • Read node in “scene_linear”
  • OCIODisplay set to “Apple P3 + Filmic High Contrast”
  • Write node set to data (Non-Colour Data), so you don’t get “Filmic” applied twice.
    (I had a beautiful screenshot to illustrate this but new users can only upload one media per post).

But what if I just want to set the Write node to the colorspace I want, which in this case is “Apple P3 + Filmic High Contrast”? Well, it does not exist by default, so let’s create it in the OCIO config. I will just open the config in Notepad and duplicate this block of text:

  - !<ColorSpace>
    name: AppleP3 Filmic Log Encoding
    family:
    equalitygroup:
    bitdepth: 32f
    description: |
      Log based filmic shaper with 16.5 stops of latitude, and 25 stops of dynamic range with Apple P3 primaries.
    isdata: false
    allocation: lg2
    allocationvars: [-12.473931188, 12.526068812]
    from_reference: !<GroupTransform>
        children:
          - !<ColorSpaceTransform> {src: Linear, dst: Filmic Log Encoding}
          - !<ExponentTransform> {value: [2.2, 2.2, 2.2, 1.0]}
          - !<ColorSpaceTransform> {src: Linear, dst: Apple DCI-P3 D65}
          - !<ExponentTransform> {value: [2.2, 2.2, 2.2, 1.0], direction: inverse}
    to_reference: !<GroupTransform>
        children:
          - !<ExponentTransform> {value: [2.2, 2.2, 2.2, 1.0]}
          - !<ColorSpaceTransform> {src: Apple DCI-P3 D65, dst: Linear}
          - !<ExponentTransform> {value: [2.2, 2.2, 2.2, 1.0], direction: inverse}
          - !<AllocationTransform> {allocation: lg2, vars: [-12.473931188, 4.026068812], direction: inverse}
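(A quick aside before editing: the allocationvars in this block look cryptic, but they are just log2 bounds, and they are exactly where the “16.5 stops of latitude, and 25 stops of dynamic range” in the description comes from. A little arithmetic sketch in plain Python, assuming middle grey at 0.18:)

```python
import math

# log2 bounds copied from the config block above
alloc_lo, alloc_hi = -12.473931188, 12.526068812   # allocationvars
lat_lo, lat_hi = -12.473931188, 4.026068812        # AllocationTransform vars in to_reference

dynamic_range = alloc_hi - alloc_lo   # 25.0 stops, matching the description
latitude = lat_hi - lat_lo            # 16.5 stops of latitude

# Both ranges are anchored on middle grey (0.18): log2(0.18) ~ -2.474,
# which sits 10 stops above the floor and 6.5 stops below the latitude ceiling.
stops_under_grey = math.log2(0.18) - lat_lo   # ~10.0
stops_over_grey = lat_hi - math.log2(0.18)    # ~6.5
```

So Filmic’s latitude is the familiar −10/+6.5 stops around middle grey; the numbers in the config and in the description agree.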

And modify it this way:

  - !<ColorSpace>
    name: AppleP3 Filmic Log Encoding High Contrast
    family:
    equalitygroup:
    bitdepth: 32f
    description: |
      Log based filmic shaper with 16.5 stops of latitude, and 25 stops of dynamic range with Apple P3 primaries and High Contrast Look
    isdata: false
    allocation: lg2
    allocationvars: [-12.473931188, 12.526068812]
    from_reference: !<GroupTransform>
        children:
          - !<ColorSpaceTransform> {src: Linear, dst: Filmic Log Encoding}
          - !<FileTransform> {src: Filmic_to_0.99_1-0075.spi1d, interpolation: linear}
          - !<ExponentTransform> {value: [2.2, 2.2, 2.2, 1.0]}
          - !<ColorSpaceTransform> {src: Linear, dst: Apple DCI-P3 D65}
          - !<ExponentTransform> {value: [2.2, 2.2, 2.2, 1.0], direction: inverse}
    to_reference: !<GroupTransform>
        children:
          - !<ExponentTransform> {value: [2.2, 2.2, 2.2, 1.0]}
          - !<ColorSpaceTransform> {src: Apple DCI-P3 D65, dst: Linear}
          - !<ExponentTransform> {value: [2.2, 2.2, 2.2, 1.0], direction: inverse}
          - !<FileTransform> {src: Filmic_to_0.99_1-0075.spi1d, direction: inverse, interpolation: linear}
          - !<AllocationTransform> {allocation: lg2, vars: [-12.473931188, 4.026068812], direction: inverse}

So what exactly are the changes here? Here they are:

  • Change of the name
  • Change of the description
  • I added the “High Contrast” spi1d look in both “from_reference” and “to_reference” (with direction: inverse in the latter).
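The reason the to_reference copy needs direction: inverse can be sketched with a toy composition. This is not the real chain (the actual Filmic_to_0.99_1-0075.spi1d is a 1D LUT, stood in for here by a hypothetical power curve, purely for illustration); it only shows that whatever you insert into the forward chain must be undone, in mirrored order, in the backward chain:

```python
def gamma(v, g=2.2):
    """Stand-in for !<ExponentTransform> {value: [2.2, 2.2, 2.2, 1.0]}."""
    return v ** g

def gamma_inv(v, g=2.2):
    """The same transform with direction: inverse."""
    return v ** (1.0 / g)

def look(v):
    """Hypothetical stand-in for the High Contrast spi1d LUT."""
    return v ** 0.9

def look_inv(v):
    """The LUT applied with direction: inverse."""
    return v ** (1.0 / 0.9)

def from_reference(v):
    # simplified forward chain: the look first, then the display encoding
    return gamma(look(v))

def to_reference(v):
    # backward chain: undo the encoding, then undo the look (mirrored order)
    return look_inv(gamma_inv(v))
```

Drop the look_inv from to_reference and the roundtrip no longer returns the value you started from, which is exactly the double-applied/missing-look error the inverse direction avoids.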

Now, let’s go back to Nuke. I can write the png directly with the Write node to embed/burn-in “Filmic”:

No need for an “OCIODisplay” node this time. The new colorspace appears directly in the “Write” node. And this finally leads us to our solution… You can now load the png file with this colorspace, just like the ACES workaround you are familiar with.
(I also had another beautiful screenshot to illustrate this… bummer!)

I have checked and it matches pretty well I would say. Please bear in mind that this roundtrip is not lossless and it is just a workaround, not an actual solution.

My only doubt with my mini-tutorial is this line:
- !<FileTransform> {src: Filmic_to_0.99_1-0075.spi1d, direction: inverse}
Should it be this ?
- !<FileTransform> {src: Filmic_to_0.99_1-0075.spi1d, direction: inverse, interpolation: linear}

I have tried both and both seemed to “work”. But to be completely honest, I am not 100% sure which makes more sense.

Thanks, but I haven’t found anything. I am just the guy who listens and takes notes. The “clip” is actually in the CTL code:

    // Handle out-of-gamut values
    // Clip values < 0 or > 1 (i.e. projecting outside the display primaries)
    linearCV = clamp_f3( linearCV, 0., 1.);

By the way, did you guys know that “Filmic” was used on this short film ? Pretty impressive !

Hope it helps !
Chris

5 Likes

But it’s working :slight_smile:

That’s also a workaround :wink:

Apple P3? Is it the same as DCI-P3? That’s not as wide as Rec.2020 or ACES. Why P3?

Nice

That’s cool

Thanks!

1 Like

Hello again,

I would not say that editing the OCIO config is a workaround. You are basically just writing the colorspace you need in one block. That is quite an important feature of OCIO, I would say.

If you wish to modify the OCIO config like I did, you need to choose two things first:

  • Which “display” colorspace are you targeting?
  • Which look will you use?

Only then can you write the colorspace you need in one single block.

I chose P3 because my monitor/display is P3 and “High Contrast” because I like how my renders look with it. Having said that, P3 is also a good colorspace for scene-linear rendering. This is what we used on the Lego movies.

From this link, Apple P3 looks similar to DCI-P3 but with a D65 white point. I don’t know much about it, to be honest. I don’t own any Apple product.

The Display P3 color space, created by Apple Inc. This color space uses the DCI P3 primaries, a D65 white point, and the same gamma curve as the sRGB color space.
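For reference, the “same gamma curve as the sRGB color space” part means the standard piecewise sRGB transfer function (IEC 61966-2-1), not a pure 2.2 power law. A minimal sketch:

```python
def srgb_encode(linear: float) -> float:
    """Standard sRGB OETF: scene-linear value -> display code value."""
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1.0 / 2.4) - 0.055

def srgb_decode(encoded: float) -> float:
    """Inverse of srgb_encode: display code value -> linear."""
    if encoded <= 0.04045:
        return encoded / 12.92
    return ((encoded + 0.055) / 1.055) ** 2.4
```

So Display P3 keeps the sRGB encoding and white point and only swaps the primaries for the wider DCI-P3 set.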

Cool?
Chris

1 Like

So explain why you would want to:

  1. Render to a working space that almost no one has?
  2. Render to a working space with imaginary primaries that don’t mean anything?

These are the sorts of basic questions that almost no one is asking. There is a massive problem here, but I am very curious as to the reasoning.

How would people feel about me putting together a post for Blender Nation with links to the filmicinBlender, and the Natron and Nuke addons as well, plus a quick writeup? I’ll post the writeup here first for comments, and credit everyone properly.

Or if someone else wants to do that that’s fine by me, but I think SOMEONE should do it.

I would also suggest that everyone post their addons to Gumroad or GitHub, as these are kinda permanent in a way Google Drive links aren’t always.

11 Likes

That would be public service.

1 Like

Rec.2020 was developed for UHD TVs. Everybody who has a 4K TV… that’s far from “no one has”. You can’t even buy an HD TV anymore; they are all 4K. If you work on a show for Netflix, Apple TV or Amazon, you will have to deliver in Rec.2020, as these guys only make 4K projects.

From Wikipedia: During the development of the Rec. 2020 color space it was decided that it would use real colors, instead of imaginary colors, so that it would be possible to show the Rec. 2020 color space on a display without the need for conversion circuitry.

By the way, I’m curious, is that you? https://www.imdb.com/name/nm0811888/?ref_=fn_al_nm_1

1 Like

Except that no monitor/display covers 100% of Rec.2020. None. The closest is the quantum dot display Samsung presented this week, which covers 90% of it.

The Rec.2020 deliveries you are referring to are only a “container”. Most (all?) of the time they are actually limited to P3 (for display). It was confirmed several times during the ACES VWG by Netflix people.

Troy is actually referring here to the non-physically realizable (imaginary) primaries of ACEScg. It has also been confirmed by ACES people themselves that the choice of AP1 primaries was “complex and controversial”.

And yes, the imdb link you provided is “him”. :wink:

Chris

3 Likes

Sorry, a mistake.

I think Christie achieved 100% of Rec.2020 with one of its laser projectors. RIT has one of them.

And I just received this email about my previous post:

Our automated spam filter, Akismet, has temporarily hidden your post in Filmic is on EVERYTHING in the compositor (not just render layers) for review.

A staff member will review your post soon, and it should appear shortly.

We apologize for the inconvenience.

I did not remove it…

Chris

Well, when we started to make shows for Netflix and they asked for 4K, we said “nobody has a 4K TV, what’s the point?”. So the fact that we can’t display the full range now doesn’t mean we won’t be able to later.

This is surprising. All his credits are as a grip (on big shows too) but not as a VFX artist. So I’m curious, @troy_s, have you worked in VFX studios too?

Netflix recommends P3 gamut for HDR content (https://partnerhelp.netflixstudios.com/hc/en-us/articles/360000591787-Color-Critical-Display-Calibration-Guidelines)
and even mentions that “Rec.2020 is not used for Netflix deliveries at this time” (https://partnerhelp.netflixstudios.com/hc/en-us/articles/360002088888-Color-Managed-Workflow-in-Resolve-ACES-)
I would be curious to see Apple TVs and Amazon guidelines.

4 Likes

Spam? We’re having a very interesting and polite discussion here. No spam here. I don’t get it.

Post is back. Cool, they were fast!

Yes, totally. Troy and I have discussed at length how many people/companies want to be future-proof in their color management. It is almost a philosophical question, but I think it would be fair to say that there are both pros and cons to choosing Rec.2020 as a working space.

Chris

1 Like

I just asked our IO guy: for The Morning Show, for Apple TV, we delivered EXRs. We did deliver a remastered version of the movie C.R.A.Z.Y with a 4K upscale in Rec.2020 for the 4K Blu-ray release. But I can ask, because we have all the specs for these companies.

2 Likes

In the end, it’s not your call, it’s the client’s. :slight_smile:

1 Like

Hmm, interesting… I would have thought that the client has control over the delivery space, but not the working space. You know what I mean?

For instance, you could lookdev/light/render in P3 and still deliver as BT.2020 ?

Chris

1 Like

Yeah, sure. The point was that if the client asks us to deliver Rec.2020, we’ll do it. If he wants EXRs, we’ll do it. If he wants animated GIFs, we’ll do it (though curiously that has never happened).

We are finishing a feature film right now and the client wanted 6K! That was totally insane. It was hell in compositing, it was hell on the network and the render farm. Then all the VFX shops complained about it (we’re not the only VFX provider) and they went for 4K. And we learned a week ago that the final delivery would be… guess what… HD!

The day the industry goes 8K, I will retire… I’ll just do my Blender Bob clips and live on the $50 a month I get from YouTube… :wink:

5 Likes

The small shop I started out in (two people, eight at most) has been doing 8K for a couple of years (IIRC, NHK is still the main client for that sort of resolution). They just upscale everything from 4K, and it seems to work nicely. Clients don’t need to know, and honestly the pictures are crisp… of course I haven’t done any scientific comparison, but they seemed flawless, at least on the 8K TV they use for grading. But yes, this requires storage and bandwidth at an obscene level.

2 Likes