Most of this is hardware
(PS5 dev kit + fast PCIe SSD)
Basically it's using the SSD like a giant stick of RAM.
So they are streaming raw geometry into a buffer capable of transforming the chunks, and running a tessellator against the streaming buffer
(I think, going by the presentation).
It's not fully raw data: the demo uses materials (so baked textures and UVs); it's not a material per vertex.
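The streaming idea can be sketched in plain Python (a toy model only; the real demo's chunk format, sizes, and GPU upload path are not public, so everything here is an assumption):

```python
import io

# Toy model of "SSD as extra RAM": pull fixed-size geometry chunks from
# storage into one reusable staging buffer on demand, instead of keeping
# the whole mesh resident in memory.
CHUNK_SIZE = 4096  # bytes per chunk; an arbitrary choice for this sketch

def stream_chunk(storage, index, buffer):
    """Copy chunk `index` from storage into `buffer`; return bytes read."""
    storage.seek(index * CHUNK_SIZE)
    data = storage.read(CHUNK_SIZE)
    buffer[:len(data)] = data  # a real engine would hand this to the GPU here
    return len(data)
```

In the real pipeline the staging buffer would be GPU memory that the tessellator consumes directly; here it is just a `bytearray`.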
I'm pretty sure all games in the making for next-gen consoles are using normal maps, because they're the most efficient way to render micro detail, and traditional LODs, since progressive LOD without user control will only work with enough high detail on distant meshes to avoid visual glitches.
I don't think all companies will drop normal maps and LODs for raw data just to make 30 fps games at 2K resolution like the demo (although features like DLSS 2 could help a lot).
Yeah… freak Unreal and Nvidia… guess what's real is Blender. Can't wait for AMD to take over GPUs too, and Ubisoft to move onto Epic! If Unreal has its C++ source open on GitHub, then there is no crack needed to adapt it to Python!! Right, Ffffred?
I've seen some demos from this guy with some AI following a target. It's really cool stuff. It's a horror game. What use is it if we can't use those features, which some other game engines have by default!? He talks about an addon for terrain LOD, camera streaming, and such… I've never seen one. There might be tons of resources people make, but they forget to put "UPBGE" or "BGE" in the title, so searching for stuff doesn't take you straight to the target ;). What encourages the dudes around here is that if they know Python they can create any feature they want; for the rest it's a hardcore battle, which personally I take as a challenge (same as modeling, sculpting, animating, etc.)!
You're right, I also noticed that the few people taking BGE somewhat further, like Mark Telles or this guy, don't share their hard-earned features, or you must subscribe to their Patreon.
This really won't help UPBGE grow in popularity and users.
This is an issue: I had to search a lot about doing stuff in UPBGE and Python and figure out how it works; I would have spent ten times less time if there were enough tutorials, like for Unity.
Also, having no asset store to easily share and centralize all UPBGE-specific code and addons doesn't make things any easier.
I can't imagine someone coming from Unity to UPBGE, unless you want the hardcore battle.
You would use Unity, which provides every tutorial, every addon, and simple C#; you wouldn't be fighting a battle, you'd be making and completing your game instead. It's a choice.
Coding everything yourself is good for hardcore programmers; it's what they like.
But it's really bad for non-tech game devs whose goal is to complete their games with the engine's available tools and features; that's a really different audience.
What I think (GPL license aside) is that when UPBGE is released (not beta), to grow in popularity it will have to provide enough video tutorials and not leave people to do it the hardcore-battle way.
And to attract users, a current-gen-looking demo is needed.
Even if the demo uses only one character, like Unreal's, it should have top-notch character animations and a top-notch environment; buy all those models or get a 3D artist, but make a tech demo that people will find great enough.
(While we know that, apart from a few, most of them, like most Unity users, will never make groundbreaking games.)
(Because if you make a not-so-great demo, or an old-graphics low-poly Starfox demo, there will always be people who find it bad because it doesn't look like GTA or Battlefront 2.)
No, I mean the OpenGL context. Blender segfaults when I bind the shader because the context is invalid; that means it either changes or is destroyed at some point after the engine starts.
We could simply use the OpenGL context the BGE uses for drawing; that should be enough. Though having a separate context for GPU compute on its own thread might be more orderly, it could also be a bit trickier to get up and running.
EDIT: actually, scratch that last sentence. Using the drawing context would be better for skinning, since it avoids having to allocate an additional set of buffers for vertex attributes.
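For reference, one way to run custom GPU work on the engine's drawing context is a `pre_draw` callback (KX_Scene exposes `pre_draw` as a plain Python list of callables, and those run inside the render loop where the engine's OpenGL context is current). A minimal sketch, where `gpu_skinning_pass` is a hypothetical placeholder:

```python
# pre_draw callbacks run inside the render loop, with the engine's OpenGL
# context bound, so shader binds issued there won't hit an invalid context.

def register_pre_draw(scene, callback):
    """Register a callback on the scene's pre_draw list exactly once."""
    if callback not in scene.pre_draw:
        scene.pre_draw.append(callback)

def gpu_skinning_pass():
    # Hypothetical placeholder for the actual compute/skinning dispatch.
    pass

# Inside the game engine you would do something like:
# import bge
# register_pre_draw(bge.logic.getCurrentScene(), gpu_skinning_pass)
```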
Hello guys, I'm looking for help. I want to use a small cannon to fire balls with the left mouse button. In BGE 2.9 I had: on the cannon empty, a Mouse sensor with an Edit Object (Add Object) actuator; on the ball, on another layer, an Always sensor with a Motion actuator. All good. But that is not possible in UPBGE 0.3; that's where I got stuck.
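For what it's worth, in UPBGE 0.3 the add-object step is usually done from Python instead of an Edit Object actuator. A minimal sketch, assuming a ball object named "Ball" in an inactive collection and a Mouse sensor named "Mouse" wired to a Python controller on the cannon (all names are assumptions, not your scene's):

```python
def fire_ball(scene, cannon, speed=20.0):
    """Spawn a ball at the cannon and push it along the cannon's local +Y."""
    # addObject(name, reference, lifetime): "Ball" must live in an
    # inactive collection, like the hidden layer in old BGE setups.
    ball = scene.addObject("Ball", cannon, 200)
    # Column 1 of the world orientation matrix is the object's local Y axis.
    forward = cannon.worldOrientation.col[1]
    ball.setLinearVelocity([c * speed for c in forward], False)
    return ball

def main(cont):
    # cont is the Python controller; this only runs inside the game engine.
    import bge
    if cont.sensors["Mouse"].positive:  # Mouse sensor: Left button, Tap on
        fire_ball(bge.logic.getCurrentScene(), cont.owner)
```

Set the Python controller to Module mode pointing at `yourscript.main` so it receives the controller as an argument.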
I gave it a try to restore the old shadows, without really knowing how. It seems to me that the new "code ecosystem" is too different now compared to 2.80. I tried to restore a bunch of code to see if recovering a big part of the old "ecosystem" would be enough to get the old shadows working, but even after I restored the old material code and shaders, and the old lights code and shaders… more than 8000 lines of code, there were issues that I wouldn't be able to solve. Here is my failed attempt: https://github.com/UPBGE/upbge/pull/1211 . Maybe there are other approaches, but I don't really see how, as shadow code is everywhere in the codebase… So I give up, personally. Maybe once the interactive mode project begins, the render code done by the Blender developers will be more "realtime oriented".
Well, it would have been nice, but I remember the old shadows were changed because of light leaks, right? I think there's much more important stuff to do than shadows anyway.
How about making particles and modifiers work without scripts?
But most important for me is switching the rigid body joints to constraints… still waiting for that one, many months now T_T
Just wanted to ask what happened that in the last few months new uploads of UPBGE EEVEE have gotten rarer; it's like a week or more between them now T_T