Spout for Blender

Hi everybody

I would like to announce a new addon I just published:

Spout for Blender.

For those not yet initiated:

Spout is a Windows technology that allows applications to share frames - full-frame-rate video or stills - with one another in realtime. This add-on now lets you painlessly stream EEVEE to all the other Spout-enabled applications out there.

More info on the website.

cheers

martin

Hi. Thanks for that. Do you see any big issues with receiving a Spout stream and mapping it onto a plane with UVs?

You mean receiving Spout inside Blender? I was unable to make that work.

Yes.
I was looking for something like a rendertotexture() function to make it possible.
No success so far.

Is it currently possible or feasible in the future to output with a transparent background?

You again? I just installed your OSC add-on! It’s awesome. It seems we’re interested in the same type of functionality. I like that it’s possible to see the viewport overlays as well. That said, is there a way to output a render? Even a Cycles render? I know Cycles is way too slow for realtime usage, but it could be nice anyway.

Also, like the person before me, I’d like to have transparency over Spout, but is that even possible? I guess it’s possible to key out a background color in the software where you receive the video feed, but that’s not as clean.

The plugin uses this function:

https://docs.blender.org/api/blender2.8/gpu.html#rendering-the-3d-view-into-a-texture

I guess if the color texture has an alpha channel (most likely), it should be passed on via Spout.
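As an illustration of what that docs page describes (a sketch under assumptions: it must run inside Blender 2.8x with a scene camera and a 3D-viewport context, and the function name `capture_view_rgba` is mine, not part of the add-on), the view can be rendered into an offscreen buffer and the RGBA pixels - alpha included - read back:

```python
# Sketch only: requires Blender 2.8x. Outside Blender the bpy/bgl/gpu
# modules do not exist, so we degrade gracefully instead of crashing.
try:
    import bpy
    import bgl
    import gpu
except ImportError:
    bpy = bgl = gpu = None


def capture_view_rgba(width=512, height=256):
    """Render the scene through the scene camera into an offscreen
    buffer and return the flat RGBA float pixels (alpha included)."""
    if bpy is None:
        raise RuntimeError("capture_view_rgba() must run inside Blender")

    context = bpy.context
    scene = context.scene
    offscreen = gpu.types.GPUOffScreen(width, height)

    view_matrix = scene.camera.matrix_world.inverted()
    projection_matrix = scene.camera.calc_matrix_camera(
        context.evaluated_depsgraph_get(), x=width, y=height)

    # draw_view3d() is the function from the docs link above;
    # space_data/region must belong to a 3D viewport (e.g. when this
    # is called from a draw handler registered on SpaceView3D).
    offscreen.draw_view3d(
        scene, context.view_layer,
        context.space_data, context.region,
        view_matrix, projection_matrix)

    # read the pixels back, including the alpha channel
    buf = bgl.Buffer(bgl.GL_FLOAT, width * height * 4)
    offscreen.bind()
    try:
        bgl.glReadPixels(0, 0, width, height,
                         bgl.GL_RGBA, bgl.GL_FLOAT, buf)
    finally:
        offscreen.unbind()
    offscreen.free()
    return buf
```

Whether the alpha actually carries useful transparency depends on the render settings (Render Properties ▸ Film ▸ Transparent), so it is worth checking on the receiving side.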

Hi @maybites

I see on https://spout.zeal.co/ quite some programs or platforms that are compatible with spout, but I am not very familiar with them. Can you give 1 or 2 use-cases of what we can do with Blender, Spout and another platform? How do you use it yourself?

You could use Blender for creating realtime visuals, or as a content creation tool for realtime motion graphics that can be mixed with other content inside one of the many apps that understand Spout (or Syphon on macOS).

Towards the end of my Blender Conference talk there are some more applications.

In conclusion: it makes Blender available as a realtime content creation tool that works in conjunction with other realtime content creation, mixing and video mapping tools.


Thanks. This is useful. So the plugin renders the viewport as a texture so it can send it over Spout. So in theory I could use a similar approach to apply the same view on a plane within Blender? I’m very new to Blender, but this is the kind of stuff that I’m trying to do. Thanks.

I’ve noticed an issue when using two different viewports in Blender (like if you want a second monitor for the camera view): two feeds with the same name are sent through Spout, which causes flickering. This is not a big deal, however.

Do you think it would be possible to continuously render the result of the compositing nodes and send this over Spout instead of the camera view from the viewport? I’m not even sure if it’s possible to cook these nodes in real time with EEVEE.

Thanks.

edit:

Very interesting. Thank you.

edit2:
I’ve done a similar project in TouchDesigner using the Vive Pro lighthouses and a Vive Tracker attached to a camera via the hot-shoe mount, and it worked really, really well. The tracking was perfect and it’s much cheaper than the tracking system you presented in your talk (it doesn’t cover the same range, however). At the end you mention a cheaper alternative too; I’m curious about it. Your work is awesome.

So the plugin renders the viewport as a texture so it can send it over Spout. So in theory I could use a similar approach to apply the same view on a plane within Blender? I’m very new to Blender, but this is the kind of stuff that I’m trying to do.

The link has scripts that actually do this, though it seems to be a bit trickier to turn the captured view back into a texture that can be used as an object material. If you figure this out, please post it here.
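One route that might work (unverified; all names here are mine, not from the add-on): copy the read-back pixels into an Image datablock, which an Image Texture node in the plane's material can then reference:

```python
# Sketch only: requires Blender. Degrades gracefully outside Blender.
try:
    import bpy
except ImportError:
    bpy = None


def pixels_to_image(pixels, width, height, name="ViewCapture"):
    """Copy a flat RGBA float sequence (e.g. read back from an
    offscreen render) into an Image datablock for use in materials."""
    if bpy is None:
        raise RuntimeError("pixels_to_image() must run inside Blender")

    img = bpy.data.images.get(name)
    if img is None or tuple(img.size) != (width, height):
        img = bpy.data.images.new(name, width, height,
                                  alpha=True, float_buffer=True)
    # direct slice assignment works in 2.8x; foreach_set() is the
    # faster variant available from 2.83 on
    img.pixels[:] = pixels
    return img
```

The idea would be to point an Image Texture node of the plane's material at this image, then call `pixels_to_image()` repeatedly (e.g. from a timer) to refresh it; I have not tested how well that performs.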

I’ve noticed an issue when using two different viewports in Blender (like if you want a second monitor for the camera view): two feeds with the same name are sent through Spout, which causes flickering. This is not a big deal, however.

This I didn’t test against. Not sure if I can find a remedy for it.

I’m not even sure if it’s possible to cook these nodes in real time with EEVEE.

My understanding is that there is currently no realtime compositing available for EEVEE.

I’ve done a similar project in TouchDesigner using the Vive Pro lighthouses and a Vive Tracker attached to a camera via the hot-shoe mount, and it worked really, really well.

I am aware of the Lighthouse tracking capabilities. But Lighthouse is not really a tracking system like OptiTrack, with dedicated software that offers some nice tools to solve transformation problems. And there is currently no direct interface available for Blender. But of course you could use TouchDesigner (or Max/MSP, etc.) to gather the data, do the needed transformations and then send them on to Blender via OSC…
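That last relay step can be sketched with nothing but the Python standard library. The OSC address `/camera/location` and port 9001 below are placeholders; a real setup would use whatever addresses the receiving OSC add-on actually listens for:

```python
import socket
import struct


def _pad(b: bytes) -> bytes:
    """OSC strings are null-terminated and padded to a 4-byte boundary."""
    b += b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b


def osc_message(address: str, *args: float) -> bytes:
    """Build a minimal OSC message carrying only float32 arguments."""
    msg = _pad(address.encode("ascii"))
    msg += _pad(("," + "f" * len(args)).encode("ascii"))
    for value in args:
        msg += struct.pack(">f", value)  # OSC floats are big-endian float32
    return msg


def send_transform(sock, host, port, location, rotation):
    """Send a tracked camera transform as two OSC messages over UDP."""
    sock.sendto(osc_message("/camera/location", *location), (host, port))
    sock.sendto(osc_message("/camera/rotation", *rotation), (host, port))


# usage (placeholder host/port):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_transform(sock, "127.0.0.1", 9001, (1.0, 2.0, 0.5), (0.0, 0.0, 1.57))
```

In practice a library like python-osc does the same encoding with less code; this just shows that the relay itself is a small amount of work.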

This is not a big deal as it’s a very niche case. I simply figured I’d report it.

That’s unfortunate. I got used to this with TouchDesigner, and I was hoping for similar potential. I can always Spout different cameras to TD and do additional compositing there, however.