Another week, another hope. Any news from this last week that I may have missed?
I would very much like to support the development of this project; it won’t be much at first, but it will get you your next meal. PM me for details.
Based on his Twitter feed, Lubos has been working on volumetric rendering, reflections, and preparing the documentation.
That’s about what I’ve been seeing as well. Can’t wait to get my hands on it to give it a test run.
Will this engine use the same workflow and logic as the BGE, or will it be different?
Still rolling forward! Quite some new stuff popped up since the last update.
@jovlem: Got in touch with Ton recently; I may be giving him a visit by the end of September, but nothing is set in stone yet. The talk would mainly be about the new ‘PBR’ viewport, as the priority for the Blender Foundation is to get the new viewport going, which is understandable. Maybe I can show off some cool Armory stuff to embrace the realtime content creation area a bit more and not let Blender fall short here. This would mean easier integration with any tools, not exclusive to Armory (like the mentioned ‘interactive mode’).
@kaufenpreis: Yes, text scripting is the preferred way. Haxe is recommended, Python will also be doable with some limitations currently, as mentioned below.
I was thinking about the Python situation; it is probably the number 1 ‘concern’ right now. This led me to some experiments, and basic Python scripting is born. For now it works on non-native targets only, e.g. in the browser (compiled to JS using the Transcrypt library to run as fast as possible) and the game player (Electron). In the future all targets may get supported, but not before the first release. Some of the reasons behind including Python:
- Python is very deeply embedded in Blender; it is hard to ignore this while making the engine feel ‘natural’
- There are not enough logic nodes yet to make them a viable way of building simple logic
- Fewer things to learn at once, so it is easier to get started for people familiar with Python
- Users can learn at their own pace and transition to Haxe when they feel confident, mix Python with Haxe, or compare the two side by side
- Simple scripts can be written directly in Blender text editor
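As a rough illustration of what such a script could look like (the class and method names here are hypothetical, not Armory’s actual API), here is a tiny behavior written in the kind of plain Python subset that Transcrypt transpiles cleanly, with no dynamic tricks like eval or metaclasses:

```python
# Hypothetical behavior script in a Transcrypt-friendly Python subset.
# Nothing here is Armory-specific; the Rotator/update names are made up
# purely for illustration.

class Rotator:
    """Spins an object around one axis at a fixed speed (degrees/second)."""

    def __init__(self, speed):
        self.speed = speed
        self.angle = 0.0

    def update(self, dt):
        # Would be called once per frame with the frame delta time in seconds.
        self.angle = (self.angle + self.speed * dt) % 360.0
        return self.angle


r = Rotator(90.0)        # 90 degrees per second
for _ in range(4):
    r.update(0.5)        # simulate four half-second frames
print(r.angle)           # 180.0 after 2 simulated seconds
```

The same class runs unchanged in CPython and in the browser after transpilation, which is the point of restricting yourself to the simple subset.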
As elmeunick9 says, the API is still different, though it is mostly just about syntax. With that said, I tried hard to match the general naming to Blender, so at least the structure should feel very familiar. If it’s called a Lamp in Blender, it is a Lamp in Armory too (instead of Light).
Another challenge that is now partially solved is keeping the game player in sync with the Blender scene. Up to now the player needed to be restarted for changes to take effect (duh!). There is now basic ‘live patching’ implemented: the scene gets rebuilt on the fly as it is being edited, and changes get reflected in the game player (after about a second of delay). This means that an object moved or edited in the 3D viewport will update in the game. The same goes for adjusting light settings, adding new objects, editing scripts, etc. It does not work properly for all material changes yet, due to the potential need for a shader recompile, and it can get slower for big scenes. But that is all relatively easy to solve. Hope to make a short video about that next.
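Conceptually, this kind of live patching boils down to diffing the edited scene against the last synced state and pushing only the changed objects to the player. A toy sketch of that diff step (names and snapshot format are assumptions, nothing Armory-specific):

```python
def diff_scene(old, new):
    """Return the objects that were added, removed, or modified
    between two scene snapshots (object name -> property dict)."""
    added    = {n: new[n] for n in new if n not in old}
    removed  = [n for n in old if n not in new]
    modified = {n: new[n] for n in new
                if n in old and old[n] != new[n]}
    return added, removed, modified


# Two snapshots: the Cube moved, the Lamp was deleted, a Camera appeared.
old = {"Cube": {"loc": (0, 0, 0)}, "Lamp": {"loc": (4, 1, 6)}}
new = {"Cube": {"loc": (2, 0, 0)}, "Camera": {"loc": (7, -7, 5)}}

added, removed, modified = diff_scene(old, new)
print(added)     # {'Camera': {'loc': (7, -7, 5)}}
print(removed)   # ['Lamp']
print(modified)  # {'Cube': {'loc': (2, 0, 0)}}
```

A real implementation would then serialize only `added`/`removed`/`modified` and send them to the player, which explains both the roughly one-second delay and why material edits are harder: a changed material may invalidate a compiled shader rather than just a few properties.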
One more issue hunted down is redirecting print() calls from the game player to Blender. If you use print() in a script, it will now show up in the upper left corner, in the info view. Pardon the small size of the UI elements in the screenshot below. :spin:
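The mechanism behind forwarding a child process's output is simple enough to sketch. This is only a generic illustration of the pattern, not how Armory actually does it; here the "player" is a stand-in subprocess and the "host" is just our own stdout, where a real integration would report lines into Blender's info log instead:

```python
import subprocess
import sys

# Launch a stand-in "player" process and forward its print() output
# line by line back to the host process.
player = subprocess.Popen(
    [sys.executable, "-c", "print('hello from the player')"],
    stdout=subprocess.PIPE,
    text=True,
)

for line in player.stdout:
    # Each line the child prints arrives here; tag and re-emit it.
    print("[player]", line.rstrip())

player.wait()
```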
I am still mainly working on docs and examples. Animation support is getting better and better; I hope to render fully animated Blender scenes in real time. A little example below, with the Ballie model courtesy of Tom Bambadil.
An example of rendering camera output to a texture, usable for real mirrors/reflections.
An example of the probe system for global illumination. The probes are pre-rendered offline using Cycles, so not as real-time as I would like, but it is still a relatively cheap way to achieve more accurate indirect lighting, and it works on all targets.
More material to showcase the renderer, using assets from the recently released Megascans by Quixel. The scene composition mostly sucks, as I am limited by my poor skills in this area, so there is loads of room for improvement there.
And with some volumetric lighting. :eyebrowlift2:
@kolas: Missed the mesh blending, thanks for the link! Looks handy; putting it on the list of stuff to be done after the first version is out.
I wonder if the BGE could raise money to also use your renderer and replace the Python bindings? (And not really mess with the game logic.)
Will this tolerate something like an Intel HD 4000 integrated GPU? I am definitely interested in trying it out, but I need to know, as my computer only has this (I know it’s a pretty poor graphics chip).
It’s more or less impossible for an application to properly support cutting-edge graphical effects such as these and ancient hardware at the same time.
The starting prices for GPUs that support OpenGL 4 and above aren’t that much, really; a standalone card is pretty much a requirement if you want to take 3D seriously.
Loving the idea of having Python as a stand-in in the engine, at least for testing and R&D, so Haxe could become an easier language to pick up later.
How is LOD and optimization in the engine? Are there any new features for it?
@Ace Dragon: I can’t really upgrade my Mac’s card; that was the reason I asked (thanks for the advice though :)).
BluePrintRandom: I think it would be ‘easier’ to improve upon what is already there in the BGE and iterate on that, slowly refactoring the needed parts, etc. The renderer here is written in Haxe to reach any platform easily and is designed to very closely mirror the engine internals, so as not to waste performance. For C/C++ there are likely quite a few solutions already available.
Jacob White: As Ace Dragon mentions, you will not be able to run it with all the effects pumped up. If you can tolerate that, you can just select the forward render path (which does not support some of those effects) and everything should still work fine. The forward path is mostly designed for mobile platforms, but it can be used anywhere performance is too tight for the deferred or hybrid paths. You may be able to use the deferred path too, but at a lower resolution, as it can get memory hungry. I will provide browser demos of the same scene with different settings so everyone can (approximately) see what to expect when running it on their machine.
Nicholas_A: No LOD support for the first release (already way too far behind), but it will come after that, for both meshes and materials.
I finally recorded a new video to show some of the features in action:
- Embedded player directly in Blender to reach the next level of integration (will need a separate Blender build, though)
- Asynchronous scene building (will also make it easy to stream big scenes and LOD in the future)
- Mirroring operators from Blender to Armory and from Armory to Blender
LOD and mesh blending seem like two of the best candidates for future features. I’m saying this from the perspective of someone who doesn’t know exactly what is in the engine yet, just from knowing how cool those features would be to have and that they are currently not there.
5 Minutes of Magic!
Great video, thanks for sharing your progress, lubos; it looks very exciting. Realtime feedback, both ways even, looks very useful.
And what about grass bending? You should add support for that if you haven’t already.
Have you tried implementing a Forward+ renderer?
I know you can edit your pipeline on the fly right?
Gotta love that lighting magic.