video encoding issues

I have been having a problem with Blender’s video codecs. No matter what I do, all codecs except “AVI Raw” and “AVI JPEG” produce very degraded, washed-out video, and I haven’t been able to figure out why. I render first to JPEG and then compile to video, so I know the source images (and by extension the setup .blend file) don’t have that issue.

any help would be appreciated


No help? Did I ask in the wrong place? I’m still not getting anything to work apart from AVI Raw and AVI JPEG; all other formats produce ugly, washed-out video. What am I missing, or where should I ask?

The Blender video codec that works for me is MPEG-1, and the audio codec that I found that works is MP2.

These work on my Windows machines, e.g. Win7 and Win8.


Apparently the issue might be Blender’s new color management system. Oddly, I have been reading a lot of threads and can’t pin down where to fix color management for an entire scene. Blender materials are also appearing pale and washed out between files; I asked about that in another thread a while ago and have not gotten any response to date.

I’ve never noticed anything peculiar about the colors rendering with Blender.

I did some research and found that others have experienced it, but the discussions are kind of vague (to me). I still don’t know where to find the color management settings, or why the issue affects all movie formats except AVI Raw and AVI JPEG, as well as materials and textures.

Shame this thread went unanswered. I’ll do my best to shine a little light on the issues, and hopefully the wise minds can connect the dots to a context-specific solution. If anyone has further questions (as the topic is complex) please feel free to PM / email me. Apologies in advance if I offend anyone indirectly. This post is not intended to be condescending, but I cannot know where anyone’s knowledge begins or ends.

  • All file formats are not equal. A file format may have formal protocols or design aspects that change the data permanently and irreversibly.
  • Crossing libraries and file formats will not be magical. An artist needs to equip themselves with enough color management ability to test and assert their work is transformed correctly. Noticing something is wrong, as you have, is a key first step.
  • Trust nothing. No software can insulate you from these issues; check your work.

The issue you are bumping into is loosely threefold.

  • When going from an R’G’B’ format to a motion picture format, you are often (not always) transforming color models. In this instance, you are transforming from R’G’B’ to Y’Cb’Cr’. That shift has huge implications.
  • When transforming within a relative model to an alternate relative model, color space transforms must be checked. In this specific case, you are transforming from sRGB primaries to 709 primaries, which happen to be exactly the same. However, merely because your source and destination color primaries are identical does not mean that your software is “Doing the Right Thing.” In this instance, it likely isn’t.
  • When transforming from a relative R’G’B’ model to a relative Y’Cb’Cr’ model, there are additional transformations that take place regarding transfer / tone curves. In this specific instance, assuming it is HD Rec.709 delivery, you are facing a transform of transfer curve from sRGB’s two part formula to 709’s two part formula.

In the first point, your data changes from one channel per color to one channel of luma (Y’) and two channels of chroma (Cb’ Cr’). This transformation usually involves scaling down the resolution of the chroma channels to save data bandwidth. If you use a test pattern with chroma scaling tests on it, you can see the result: slightly fuzzier edges in colored forms where your source had crisp lines. Some still image formats, such as JPEG, perform this exact scaling technique.
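As a rough sketch of that first point (purely illustrative, not Blender’s or any encoder’s actual code), here is the Rec.709 luma/chroma split together with a toy horizontal chroma-averaging step:

```python
# Illustrative sketch of the R'G'B' -> Y'Cb'Cr' split using the Rec.709
# luma coefficients; inputs are nonlinear (primed) floats in [0, 1].

def rgb_to_ycbcr_709(r, g, b):
    """Full-range R'G'B' -> Y'Cb'Cr'; Cb'/Cr' come out centered on 0."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556  # 1.8556 = 2 * (1 - 0.0722)
    cr = (r - y) / 1.5748  # 1.5748 = 2 * (1 - 0.2126)
    return y, cb, cr

def subsample_chroma_pairs(chroma):
    """Toy 4:2:x-style averaging: one chroma sample per pixel pair.
    This averaging is what softens the edges of colored shapes."""
    return [(chroma[i] + chroma[i + 1]) / 2 for i in range(0, len(chroma), 2)]
```

Note that a neutral pixel (r == g == b) lands entirely in Y’, with both chroma channels at zero, which is why grayscale content survives chroma subsampling unscathed.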

In the second point, both R’G’B’ and Y’Cb’Cr’ models are relative models; the actual colors their data implies varies from file to file. This means that the colors the data values reference for red, green, and blue are different from image to image. Likewise, the Cb’ and Cr’ data values reference arbitrary colors. You need to ensure that the transformation of your red, green, and blue channels is perfectly matched to the Cb’ / Cr’ axis in your 709 delivery.
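To make the second point concrete, here is a small sketch (my own illustration, not any tool’s actual code) of what happens when video is encoded with the Rec.709 matrix but decoded with the Rec.601 one:

```python
# Generic Y'Cb'Cr' encode/decode, parameterized by the luma
# coefficients Kr and Kb (with Kg = 1 - Kr - Kb).

def encode(r, g, b, kr, kb):
    y = kr * r + (1 - kr - kb) * g + kb * b
    return y, (b - y) / (2 * (1 - kb)), (r - y) / (2 * (1 - kr))

def decode(y, cb, cr, kr, kb):
    r = y + 2 * (1 - kr) * cr
    b = y + 2 * (1 - kb) * cb
    g = (y - kr * r - kb * b) / (1 - kr - kb)
    return r, g, b

REC709 = (0.2126, 0.0722)
REC601 = (0.299, 0.114)

# Pure green, encoded as 709 but wrongly decoded as 601:
shifted = decode(*encode(0.0, 1.0, 0.0, *REC709), *REC601)
```

The round trip through the mismatched matrix turns (0, 1, 0) into roughly (0.08, 1.17, 0.03): green clips out of range while red and blue leak in, which is the classic 601/709 hue shift.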

In the third point, the sRGB transfer curve / tone response curve that is baked into the data (assuming your image was correctly encoded in the first place) is a two part algorithm. That algorithm would need to be undone and correctly transformed into the 709 two part algorithm (they both happen to be two part algorithms, with different results.) Complicating matters further, the 709 specification enforces a scaling of data values that compresses the data from the original range into a smaller range. If you are using a decent player (and gosh knows many aren’t) the data would be scaled back to proper black and white levels.
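The two curves and the range compression mentioned in the third point can be sketched like this (the formulas are the published sRGB and Rec.709 two-part curves; the helper names are mine):

```python
def srgb_to_linear(v):
    """sRGB two-part decode: encoded value -> linear light."""
    return v / 12.92 if v <= 0.04045 else ((v + 0.055) / 1.055) ** 2.4

def linear_to_rec709(l):
    """Rec.709 two-part encode (OETF): linear light -> encoded value."""
    return 4.5 * l if l < 0.018 else 1.099 * l ** 0.45 - 0.099

def to_legal_range(v):
    """Compress a full-range [0, 1] value into the 8-bit broadcast
    legal range: black at code 16, white at code 235 (the
    'smaller range' mentioned above)."""
    return round(16 + v * 219)
```

A correct pipeline decodes sRGB to linear, re-encodes with the 709 curve, and then scales into 16–235; a sloppy one skips the middle step or scales twice, producing exactly the washed-out look described in this thread.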

The best way to handle this complexity is to do as you have; begin with a stills format and assert that it is correct as a starting point.

From there, you will want to use a tool that correctly encodes to 709. FFMPEG / LibAv has a notorious history on this front, and many, many commercial tools do as well. In FFMPEG’s case, it would often re-encode sRGB primaries to Rec.601 primaries, resulting in a hue shift.

Best advice is to Google up on a few of these things and try FFMBC, which is a forked / patched version of FFMPEG and was designed for use in broadcast scenarios. You will want to check on the use of the colormatrix filter, and test your output. A simple method to test is to attach a decent test chart to the header or the tail and inspect it in reliable playback software[1].

Sorry to burden you with all of the seemingly teeny details, but you can imagine that no single application does this well, let alone correctly. There are too many variables and too many contexts for such an application.


[1] You can imagine that playback software can be a nightmare as well. Is it correctly transforming color primaries? Is it correctly scaling playback from the broadcast levels? Is the video card trying to accelerate the playback and mangling the data in some way? And yes, these issues plague mainstream commercial applications, “professional” applications, etc.