WebM codec; how to use 'VP8' instead of 'VP9'?

I’m using the WebM video format so that I can have good realtime alpha video. The Blender manual says that BOTH VP8 and VP9 are available, but I only seem to be able to use VP9.

Our current Unity project is incompatible with VP9, and I need to render straight out of Blender.

Or maybe there’s an exe I can hook up to Blender to encode this externally?

Anyone have a solution for me?

ffmpeg can encode vp8 with alpha. Here’s the documentation for that:
https://trac.ffmpeg.org/wiki/Encode/VP8

EDIT: unfortunately this solution still requires that you render to an intermediate format out of Blender.

The webm container supports both vp8 and vp9, and vp8 precedes vp9, so it’s odd that Blender doesn’t appear to support exporting video with the vp8 codec. But it’s also odd that Unity doesn’t support vp9. They should both fix this!

You don’t think it’s more odd that the already available API isn’t exposed, even though the Blender Foundation indicates it is in its official manual?

If you know what corporate environments are like, you know you cannot get all the upgrades you might want. We have our own special version of existing software, because simply tailoring and licensing commercial software to suit the needs of the company is a big task. It’s things like this that give people ammo to claim that Blender does not have big companies in mind and is therefore less viable for widescale appeal.

I’m campaigning for wider use of open source to ease financial and facility burdens. This is something that I think can really make a difference to the world. The trouble is that sometimes an easy win is left unnoticed. I’m simply suggesting we take every road of least resistance available to bring open source further into the corporate world. There needs to be that flexibility to allow retrospective development workflows. If it isn’t a big burden on developers, then these options should be available in order to achieve the larger goal of enticing more companies over to Blender for a bright future.

I honestly don’t expect Unity to retrospectively upgrade their old versions to have that compatibility, even for paying customers. They are less likely to see the value in making a good product than in making the richest clients happy on the latest Unity versions. There’s a hierarchy. No, I expect this instead from the open source community, because there is more honesty and compassion.

The open source community WANTS to see its software shape the world for the better.

It seems like vp8 would be easy for Blender devs to enable since it’s already fully supported by ffmpeg.

Until then, it may be possible to execute the correct ffmpeg command using Python. Using the handler “bpy.app.handlers.render_complete”, you can tell Blender to automatically run ffmpeg on the rendered frames once rendering is finished.
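If it helps to see the handler mechanism in isolation first, here is a minimal sketch (the function name notify_render_done is just an example, not part of any API); it only prints a message, so you can confirm that the handler actually fires when a render finishes:

import bpy

# Called automatically by Blender once rendering has finished
def notify_render_done(dummy):
    print("Render finished, post-processing could run here")

bpy.app.handlers.render_complete.append(notify_render_done)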

Which API are you looking for?

Are there any further tutorials/guides that you would recommend for handling after-render Python scripts? I’m a complete beginner to Python in Blender, but I’m very interested.

Which API are you looking for?

Sorry for the misunderstanding; when I say ‘API’ I’m referring to the parameters included in the FFmpeg encoding settings, specifically that VP8 option.

I’m going to raise this as an issue on Blender Devtalk.

Here’s a basic solution to get you started: figure out the ffmpeg command you would like to run immediately after rendering is finished; something that takes the rendered image sequence and converts it to a VP8 video, like:

ffmpeg -i /my/render/folder/%04d.png -c:v libvpx -pix_fmt yuva420p -metadata:s:v:0 alpha_mode="1" output.webm

Then put it in this script:

import bpy
import os

# Called automatically by Blender once rendering has finished
def run_ffmpeg(dummy):
    os.system("THE FFMPEG COMMAND YOU WANT TO RUN")

# Register the function so it runs after every completed render
bpy.app.handlers.render_complete.append(run_ffmpeg)

After running the script, Blender will be set to run the function immediately after rendering is finished. This is equivalent to waiting for the render to finish and then running the ffmpeg command yourself in a separate command line.
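One caveat: within a single Blender session, running the script twice appends the function twice, so the command would run twice after each render. Assuming the function is named run_ffmpeg as in the script above, a small snippet like this can be used to unregister it:

import bpy

# Remove the callback if it is currently registered
if run_ffmpeg in bpy.app.handlers.render_complete:
    bpy.app.handlers.render_complete.remove(run_ffmpeg)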

@Rocketman

I’ve tried to combine your code with what it says to do here:

https://sites.google.com/a/webmproject.org/wiki/howtos/convert-png-frames-to-webm-video

Here’s what it looks like:

import bpy
import os

def run_ffmpeg(dummy):
png2yuv -I p -f 60 -b 1 -n 2400 -j big_buck_bunny_%05d.png > my.yuv
vpxenc --good --cpu-used=0 --auto-alt-ref=1 --lag-in-frames=16 --end-usage=vbr --passes=2 --threads=5000 --target-bitrate=3000 -o my.webm my.yuv

bpy.app.handlers.render_complete.append(run_ffmpeg)a

Am I heading in the right direction, or do those functions need to be called elsewhere?

I might also check this guy out; he seems to have some stuff that might help me:

-S

That png2yuv command and the Python script are sort of two different things. You should verify that the commands work from your operating system’s command line before trying to execute them from Python. Have you tried this yet?

I can tell you that your Python script definitely won’t work as written; each command must be expressed as a string passed as an argument to the os.system function, one call per command, like so:

os.system("png2yuv -I p -f 60 -b 1 -n 2400 -j big_buck_bunny_%05d.png > my.yuv")
os.system("vpxenc --good --cpu-used=0 --auto-alt-ref=1 --lag-in-frames=16 --end-usage=vbr --passes=2 --threads=5000 --target-bitrate=3000 -o my.webm my.yuv")

Thanks Rocketman, I’m pretty rough at this and I appreciate your patience.

I’m getting this error when trying the script:

(file path censored for NDA reasons)

File “B:\Blah\Blendfile.blend\Python_WEBM-VP8-After-Render”, line 8
bpy.app.handlers.render_complete.append(run_ffmpeg)a
^
SyntaxError: invalid syntax

location: :-1

It looks like it needs a location? I’ll try to understand how to make the location correction.

A couple of things I think I should have mentioned earlier:

On Windows 10, using the latest build of Blender 2.9

bpy.app.handlers.render_complete.append(run_ffmpeg)a

Well, you definitely shouldn’t have an ‘a’ at the end of that line! What’s the ‘a’ doing there?
Also, keep in mind that if the script runs correctly, Blender will seem to freeze while the png2yuv command is running. It will then unfreeze when it is finished.

Definitely make sure that the png2yuv command works outside of Python; if that part of the script has any errors while you run it from Python, it will be very hard to know what happened.
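If you do want more visibility from inside Blender, one option (my suggestion, not something the approach above requires) is Python’s subprocess module instead of os.system, since it can capture the command’s output:

import subprocess

# Run the command and capture its output so errors can be printed from Blender
result = subprocess.run("THE COMMAND YOU WANT TO RUN",
                        shell=True, capture_output=True, text=True)
print(result.stdout)
print(result.stderr)

On Windows the printed output should show up in the system console (Window > Toggle System Console).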

Lol that makes sense.

I got it to run but there’s nothing new in the output folder. Where should I expect the resulting WEBM file?

Also, do I hit Run Script and then hit Render, or hit Render and then run the script? I think I need to find the right foundation of information; these are probably some pretty stupid questions from me XD

-S

GOT IT:

import bpy
import os

def run_ffmpeg():
    print(5)
    filepath = “E://Directory//Directory//Directory//Directory//Directory//Directory//output//”
    os.system("ffmpeg -i -f 60 "+ filepath + “IMAGESEQUENCE_%04d.png” + " -c:v libvpx -metadata:s:v:0 alpha_mode=“1” -y -auto-alt-ref 0 " + filepath + “OUTPUTMOVIE.webm”)
    print(filepath + “IMAGESEQUENCE_%04d.png”)

bpy.app.handlers.render_complete.append(run_ffmpeg)

#run_ffmpeg()

@Rocketman

The next main challenge for me to solve is how to set a variable bitrate and to increase that quality cap.

I’m finding Python really difficult, however. It’s strange how careful you have to be with spaces and tabbing. Every time I tried to add variable settings, it seemed to break my script.
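On the variable bitrate question: with libvpx this is usually handled by flags on the ffmpeg command itself rather than by anything in Python. A sketch under that assumption, with placeholder file names and numbers (not tested against your project):

import os

# -b:v sets the target/maximum bitrate, -crf the quality level (lower is better);
# together they use libvpx's constrained-quality rate control
os.system("ffmpeg -framerate 60 -i IMAGESEQUENCE_%04d.png "
          "-c:v libvpx -pix_fmt yuva420p -b:v 1500k -crf 10 "
          "-auto-alt-ref 0 -y OUTPUTMOVIE.webm")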

It may be best to skip the Python, since in this case it’s really just a wrapper that exists to run something else. Having the png2yuv command automatically run from Python might be convenient if you can make it work, but ultimately it may not be worth it.

Rocketman, are you talking about operating FFmpeg externally?

I have a new working version of my script, btw:

import bpy
import os

def run_ffmpeg():
    filepath = “E://Directory//Directory//Directory//Directory//Directory//Directory//output//”
    os.system("ffmpeg -i "+ filepath + “IMAGESEQUENCE_%04d.png” + " -c:v libvpx -b: 1500k -r 60 -metadata:s:v:0 alpha_mode=“1” -y -auto-alt-ref 0 " + filepath + “OUTPUTMOVIE.webm”)
    print(filepath + “IMAGESEQUENCE_%04d.png”)

#bpy.app.handlers.render_complete.append(run_ffmpeg)

run_ffmpeg()

The next trick for me is to find out how to make it point to the user-defined filepath in Blender’s output settings, and then get the image and movie names to match the user-defined names (a rough sketch of this follows below).
Maybe I’ll also include the task of checking which temp directory to render the PNG/EXR sequence to. The third step is creating some GUI in the output panel that calls the script, and adding the FFMPEG commands responsible for forcing VP8 (libvpx). After all, this is how I wish it was in the first place.

So I’d change ‘WebM/VP9’ to ‘WebM’ and simply have an exclusive radio-button option that says ‘VP8’ and one that says ‘VP9’, defaulting to ‘VP9’ unless the user switches it. When it switches, the options that only apply to the other codec disappear/reappear as needed.
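For the first of those steps, the output path and frame rate that the user sets in the Output panel are available from the bpy API. Here is a rough, untested sketch that assumes the output path points at a folder of numbered PNGs and uses placeholder file names:

import bpy
import os

scene = bpy.context.scene
# Output path and frame rate as configured in Blender's Output properties
out_dir = bpy.path.abspath(scene.render.filepath)
fps = scene.render.fps

# Placeholder names; adjust to match the actual naming in your project
images = os.path.join(out_dir, "IMAGESEQUENCE_%04d.png")
movie = os.path.join(out_dir, "OUTPUTMOVIE.webm")

os.system(f'ffmpeg -framerate {fps} -i "{images}" -c:v libvpx '
          f'-pix_fmt yuva420p -auto-alt-ref 0 -y "{movie}"')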

Can anyone else confirm that the above solution works for them? I’ve tried it in a few versions of Blender but they all fail. When I try it I get invalid syntax errors from these specific quotation marks “ and ” but not ". (If it’s hard to see the difference, try increasing the font size in your browser.) If I try to replace those two with the one that works, it still breaks the script, I’m assuming because the alpha_mode=“1” part is supposed to be contained within the full os.system string, but the order of what is inside quotation marks changes when using ".

Any help?
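For what it’s worth, those two characters are curly quotation marks that the forum substituted for straight ones, and even with straight quotes the alpha_mode="1" part breaks the string because of the unescaped inner quotes. Here is a sketch of the same idea with straight quotes only and no nested quoting (alpha_mode=1 works in ffmpeg without quotes); the path and file names are placeholders with no spaces, and this is untested against your setup:

import bpy
import os

# *args absorbs whatever arguments Blender passes to the handler
def run_ffmpeg(*args):
    # Placeholder path; replace with your real output folder (no spaces)
    filepath = "E://output//"
    # alpha_mode=1 needs no inner quotes, so nothing has to be escaped
    os.system("ffmpeg -framerate 60 -i " + filepath + "IMAGESEQUENCE_%04d.png"
              + " -c:v libvpx -pix_fmt yuva420p -b:v 1500k"
              + " -metadata:s:v:0 alpha_mode=1 -auto-alt-ref 0 -y "
              + filepath + "OUTPUTMOVIE.webm")

bpy.app.handlers.render_complete.append(run_ffmpeg)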