Where to find resources to start with VR / OpenXR in UpBGE?

hello,

I own a Quest 2 and had fun with Blender 3's VR scene inspection; it works like a charm even on an NVIDIA 1060 laptop.
By the way, rendering is faster with SteamVR as the default OpenXR app + Virtual Desktop than with the Oculus app + Air Link.

So now I'd like to have fun with UpBGE 0.3, but I can't find any resources to get started like what I saw in some nice videos.

What I did: I started blender_oculus.cmd, which sets XR_RUNTIME_JSON to the path of the Oculus runtime JSON,
then started the VR session (is there a config to apply here?), and it works,
then pressed P: no more frames, it freezes.
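
(For reference, the .cmd boils down to something like this - a rough Python equivalent, where the Blender path is a placeholder and the Oculus runtime JSON path is an assumption from a default install; adjust both to yours:)

# rough equivalent of blender_oculus.cmd - both paths below are placeholders / assumptions
import os
import subprocess

os.environ["XR_RUNTIME_JSON"] = r"C:\Program Files\Oculus\Support\oculus-runtime\oculus_openxr_64.json"
subprocess.run([r"C:\path\to\upbge\blender.exe"])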

What did I miss? Is there a template script?
… and does the standalone runtime support VR?

Info I found so far:

BluePrintRandom https://blenderartists.org/u/blueprintrandom/summary
UpBGE doc https://upbge.org/#/documentation/docs/latest/manual/index.html (nope)
J.Merrill channel https://www.youtube.com/watch?v=Xk6258GADOE (start VR scene insp then P ? not working)
Muxed Reality https://www.youtube.com/c/MuxedReality/playlists (cool but nothing about GE)
testfiles https://github.com/UPBGE/testfiles/tree/master/Python (zilch)
upbge release notes https://github.com/UPBGE/upbge/wiki/Release-notes-versions-0.3.X (just kidding :slight_smile: )
Upbge vr controller documentation? (next step snippet)
VR Template in UPBGE (so jealous, I wanna do it too)


What are your goals?

xr_3b.blend (1.9 MB)

Here is my dart game prototype, but it's not really commented yet.

Thanks, with your file/scripts it's working - no freeze!
Controllers are responding, I can move and play with the darts,
but can you move in the scene by moving IRL?
(Aha, this is really cool, many thanks, I will investigate this.)

First goal is to "hello VR world" and throw some cubes for now :slight_smile:
Did you try it with the standalone player?

You should be able to use the standalone player, but you need to invoke

bpy.ops.wm.xr_session_toggle()

with ‘tap’ and wait a bit before polling XR using a delay sensor :smiley:
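
Something like this rough sketch (the sensor names "start_tap" and "xr_delay" are placeholders for an Always sensor in tap mode and a Delay sensor you wire up yourself):

import bpy

def start_xr(cont):
    # fired once by the tap Always sensor: toggle the VR session
    if cont.sensors["start_tap"].positive:
        bpy.ops.wm.xr_session_toggle()

def poll_xr(cont):
    # fired by the Delay sensor a few seconds later, once the session is up
    if cont.sensors["xr_delay"].positive:
        xr = bpy.context.window_manager.xr_session_state
        if xr is not None:
            print("XR running, viewer at", xr.viewer_pose_location[:])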


Hi again BluePrintRandom, hi all,
it's been a while :sweat_smile:

I'm reviving this thread because I've been doing some tests over the last few days, now that I've got a good GPU for VR, and I've scoured the web for a long time looking for solutions, which I still don't have.

It's about starting the VR session directly when blenderplayer is launched with the project file. I modified the file you shared above and tried to invoke bpy.ops.wm.xr_session_toggle() as you proposed, but unfortunately the player crashes with no explanation.

blenderplayer.exe xr_3b_runtime.blend

xr_3b_runtime.blend (1.9 MB)

start_vr.py works as expected from Blender with Alt-P, and sometimes doesn't crash Blender when I run the embedded player with P in the 3D view.
Just uncomment the loop.

One solution I was considering was to launch the VR session from the VR add-on through a startup script, but my bpy is a bit rusty…
and I guess it won't work either, for the same unknown reason.

thanks to anyone who could share some pointers about this “run VR directly” need.

I've also tried to find information about this topic, but there is almost none; the only information I found was in the Blender API documentation. I had to search and dive into the OpenXR specification in order to understand the mappings, since Blender/UPBGE doesn't expose all the buttons and triggers.

I've also tried to invoke the VR session from within UPBGE while blenderplayer runs, but it doesn't matter whether you're playing inside UPBGE or blenderplayer: even if you wait for everything to be running correctly and fluidly, when you try to start the VR session, it crashes.

The solution I've found for now is to open a blend file from the command line and run a script that does a pre-setup (enable rendered view, disable stuff to get better performance, hide tools and UPBGE windows, etc.), then run the script to toggle the VR session, wait a few seconds to allow the session to start, and finally start the game.
It's not a runtime, but it seems to work and allows me to jump from one blend to another.
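
Roughly, the pre-setup part looks like this (a minimal sketch of the idea only, with a placeholder script/blend name; the exact tweaks are up to you):

# pre_setup.py (placeholder name) - run with:  blender my_game.blend -P pre_setup.py
# switch the 3D viewport to rendered shading, hide overlays, then toggle the VR session
import bpy

for window in bpy.context.window_manager.windows:
    for area in window.screen.areas:
        if area.type == 'VIEW_3D':
            space = area.spaces.active
            space.shading.type = 'RENDERED'      # rendered view for the headset mirror
            space.overlay.show_overlays = False  # hide gizmos, grid, etc.
            with bpy.context.temp_override(window=window, area=area):
                bpy.ops.wm.xr_session_toggle()
            break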

Here’s a video showing the “Runtime” alternative I’m using:


If you want to support my work, I've set up a Patreon for this project:
patreon.com/upbge_vr

At the moment this is what is coded in the template:

  • You set the initial position and orientation by moving and rotating an object.
  • You can move using point-to-teleport, either with a black fade between points or as a fast dash movement.
  • You can move using locomotion by moving the controller stick, and jump.
  • If you walk in real life so that you end up standing in the air (like stepping off a platform), you will fall down.
  • You can jump.
  • Rotations and movements get centered and oriented to your current position instead of the center of the VR play area.
  • You can grab objects just by adding a property named "Grab" to them.
  • You can climb objects that have a "Climb" property on them.
  • You can throw objects and they will keep the velocity they had instead of just falling (if you've tried this already you will know this happens), and they also keep their angular velocity.
  • You can choose whether your objects can be grabbed in any orientation, as if you were inspecting them, or are always grabbed in the same orientation, which is useful for guns, swords, etc.
  • The movement doesn't get delayed: copying locations has a small delay, so when you move, the hands seem to be one frame behind; this is corrected using callbacks.
  • The rotation is also compensated for the same reason, but this didn't work using callbacks, so it is compensated by predicting the next position and rotation from the current speeds and projecting them one frame ahead, to give the illusion that it is moving correctly (see the sketch after this list).
  • You can make an object an obstacle by adding an "Obstacle" property to it, so it won't allow your hands or camera to pass through it, which keeps players from cheating on the levels.
  • You can enable the player to move, or keep it static (for a cinematic moment, for example), by changing a property to True or False.
  • The code runs in parallel in a separate thread.
  • The main code for VR is written only in Python. No logic bricks except those needed to call the script.
  • Rays and a pointer are implemented to make menus in VR easier: you add the property "Menu" to the object, and the ray will cast automatically to that object.
  • If you press the trigger, it spawns an object to collide with the selected option; you check for that collision to determine whether the button/option was selected.
  • You can also use the index finger to activate the menus.
  • You can detect hand, fist, or finger collision from each hand.
  • I've implemented a door setup so you can grab the handle and open/close it like you normally would.
  • This also includes the script that allows it to run like a standalone game and change between blend files.
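
For the curious, the one-frame-ahead compensation mentioned above is just extrapolation from the current velocities. This is a generic sketch of the idea only, not the exact code from the template, and the 1/72 dt assumes a Quest-like refresh rate:

import mathutils

def project_one_frame(position, rotation, lin_vel, ang_vel, dt=1 / 72):
    # position/rotation are a mathutils Vector and Quaternion,
    # lin_vel/ang_vel are the current linear and angular velocities
    next_pos = position + lin_vel * dt
    if ang_vel.length > 0.0:
        # apply the angular velocity as a small axis/angle rotation step
        step = mathutils.Quaternion(ang_vel.normalized(), ang_vel.length * dt)
        next_rot = step @ rotation
    else:
        next_rot = rotation.copy()
    return next_pos, next_rot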

Hi Opheroth, thanks for your detailed replies,

Yes, I read about your work and saw your videos; what you've done so far looks impressive!

There's a good chance I'll be joining your Patreon this year,
but at the moment I'm overwhelmed by my own project + my job and need to stay focused.

Actually I'm looking for a single tiny piece of information - something that should be documented somewhere in the API, but well.

I tried several things to get around the UpBGE crash, but without success.
Then I saw your video on YouTube where you use Blender to launch the session, and yes, it's a good idea*, so I gave it a try:

blender xr_3b.blend -P start_vr.py

start_vr.py:

import bpy
from bpy import context

# find a 3D viewport and toggle the VR session using its context
for window in context.window_manager.windows:
    screen = window.screen
    for area in screen.areas:
        print(area.type)
        if area.type == 'VIEW_3D':
            with context.temp_override(window=window, area=area):
                bpy.ops.wm.xr_session_toggle()
            break

OK nice, this runs UpBGE in my headset.
But after toggle() and some sleep(); xr != None tests, I haven't found any way yet to simply start the internal player from a script:
no obvious bpy path for it, nothing I could find in bpy, no BA threads; it's a bit frustrating since the need is a pretty common one. I really think this should be documented.

It would be very kind of you to share just the few lines you used to launch the internal player, nothing else,
or to tell me where this "start player at startup" checkbox is :slight_smile:

(* except for the Blender licence in some cases, I guess)

I'm using the following line to start the game. Hope it helps.

bpy.ops.view3d.game_start()
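
As a rough, untested sketch: appended at the end of start_vr.py, this would wait a few seconds after the toggle and then start the embedded player (the 5-second delay is an arbitrary guess, and game_start may still need a 3D viewport override):

import bpy

def start_game():
    # find a 3D viewport and start the embedded player from its context
    for window in bpy.context.window_manager.windows:
        for area in window.screen.areas:
            if area.type == 'VIEW_3D':
                with bpy.context.temp_override(window=window, area=area):
                    bpy.ops.view3d.game_start()
                return None  # do not repeat the timer
    return None

# after toggling the VR session (as in start_vr.py above), give it a few
# seconds to come up before starting the game
bpy.app.timers.register(start_game, first_interval=5.0)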

Thanks for your reply Opheroth,
I missed this game_start() :roll_eyes:

Unfortunately this crashes too, with Oculus or SteamVR, but there is some info about the crash.