Blender VR Support test build

Pimax is cheap because they're using high-persistence 60 Hz displays, which means that despite the higher resolution they still deliver an overall inferior experience due to significant ghosting. Combine the ghosting with the lack of positional tracking (and no neck model in the software, because 99% of PC-based VR is made for positionally tracked systems and so doesn't need a neck model the way mobile VR does) and motion sickness is almost guaranteed.

So, in a lot of your arguments you are missing one essential element: motion sickness differs enormously from person to person.
With my game studio we did a lot of research together with a university, testing a couple of extremes over the period of a year with multiple implementations.
We mostly found that in a fairly generic group it was about 50/50 whether people got motion sick in VR over a period of 10 minutes, testing on a DK2 with positional tracking and ideal FPS, and about 40/60 on a DK1 with occasional frame drops (though oversampling at 1.6×).
We also contributed to a second study a year later, focusing more on specific implementation techniques that can reduce motion sickness for people more prone to it; they started from similar baseline numbers. I can send it to you if you want, though it is in Dutch.

But on the other end, we also tested at a gamer event (a LAN with 1000+ people) where about 350 people tried our multiplayer third-person action game, which used a DK2 without positional tracking and a game controller.
In this group only 11 people got motion sick, and about 10 felt a bit weird during the controller-based locomotion.
This makes sense for a testing group like that, since gamers are more used to this type of experience and simulator sickness is less of a problem within these groups.
If you want the exact results, PM me and I can send you the write-up of the research.

Claiming "guaranteed motion sickness" is simply false, though there are elements that can increase the likelihood.

In any case, can we keep this on topic again?
Feedback on the builds please.

Can this branch be merged with the BlenderVR branch? (VR in the GE)

Also, I have a Google Pixel and Daydream; I can use RiftCat to pretend to be a Vive or Oculus.
It's supposed to be really fast over USB 3.0.

I can test for you if you would like to add official support for Daydream.

Hey, just found this thread and want to give my 2 cents on the implementation.

I built the latest version from source myself and have been testing on Linux with the Oculus CV1, and I must say it works pretty decently!
I personally don't miss positional tracking and found everything to be pretty responsive (no frame drops, good FPS in my game scenes).
Something like a virtual mouse would be nice, but reading the code review on the Blender developers page it seems like that is already planned.

I am definitely keeping this build around; it saves me a lot of time checking my scenes and resources in Blender.
My projects are mostly in UE4, and this makes it a lot easier to know my assets are good before importing them into UE.

That's right, but as mentioned in that thread, that DLL is replaceable with a custom one, so maybe that is not a problem :slight_smile:

Now, the platform support issue: while that's true, I don't understand why it should hold back a much faster integration of VR in Blender. Over time there have been features that didn't work on Mac, or that were less stable there. Maybe the Windows/Linux VR feature could be implemented with OpenVR for completeness (you can also install Windows on a Mac, so there's no real limit there) and the Mac part could keep the current implementation. If the day comes that the current implementation is better than OpenVR, everything can be migrated, or the other way around: if the day comes that OpenVR supports macOS, it could be migrated.

If there is no real license problem, it could be done that way and we would get the best of both. I know this is an idea that won't be welcome, I'm afraid XD, but I had to put it here, just in case :slight_smile:

Cheers!

Anyways, apart from what I said, does anyone know what I have to do to get it working with the Vive?

Cheers.

I tried the GraphicAll version with a Vive. Nothing is displayed in the headset, but on the computer screen I can see that the tracking works (rotation only, no positional movement); the left/right views rotate properly.

There was a larger commit on April 13th according to the developer site (https://developer.blender.org/D2133), does anyone have a newer build with this, to test?

I just updated the build to the latest in Git. Sorry for the delay, I was traveling for a bit.

I have Daydream / USB 3.0 and a Google Pixel.

I would like to be a poor man's Blender VR tester.

(get devs up and running on the cheap)

Oculus Thrift style. I like it. Tried that via Cardboard while USB-tethered. Didn't bother with head tracking, as previs was facilitated via Shift+F and using the viewer like a microscope when needed. I've seen examples using a Wiimote and GlovePIE for that, though.

Example vid attached. Software used: http://gaminganywhere.org/download.html

Attachments

SBS_Previs.zip (947 KB)

Oculus Thrift apparently works best when mixed with a PS Move controller.

Also just saw this
https://www.google.com/amp/s/venturebeat.com/2017/02/18/htc-vives-upcoming-tracker-hacked-to-work-with-google-daydream/amp/

Back at CES we met a small creative studio with some big technical plans for the HTC Vive’s upcoming Tracker peripheral.

The team’s name was Master of Shapes, and they were one of the groups that HTC had gathered to showcase some of the many applications for the new device. Specifically, they made a technical demonstration for makeshift local multiplayer VR, called Cover Me. The tracker was attached to a phone, which itself was attached to a gun peripheral, allowing someone to look into a VR user’s world and help them out by shooting enemies. The potential for multiplayer was huge, but it also showcased how the Tracker could be used to make smartphone-based headsets positionally tracked.

At the time, Master of Shapes said that was “fully doable”. Now? It’s been done.

I feel that a standalone app on the Oculus Store / Steam store would be the best option: the 3D view as the main 3D world, plus an addable/removable virtual monitor with the Blender UI, using the Oculus controllers' virtual laser pointer as a mouse pointer for the UI.
Assign the controller buttons as common and/or customisable shortcut keys. For example, on the right-hand controller: trigger to grab, A button to scale, B button to rotate, grip button as Shift or Alt. Pressing and holding the joystick button plus a joystick direction gives 4-8 shortcut keys; grip button plus pressed joystick button plus joystick directions gives another 4-8; grip button plus A, B, or trigger gives another set. That would give 14-22 shortcut keys per Oculus controller, times two controllers = 28-44 shortcut keys in total (a rough sketch of such a layered mapping is given below). Maybe pressing the joystick button should pop up a selection UI for choosing a shortcut, which would work best for the eight joystick-direction shortcuts; that UI should disappear once the joystick button is released. Pressing the grip button could pop up an alternative shortcut UI for the joystick, trigger, A, and B buttons. Using the grip button plus the joystick without pressing the joystick button could give another 4-8 joystick shortcuts, for 18-30 per controller, but people might get frustrated by accidentally triggering an unintended shortcut while moving the viewpoint.
Same thing with the buttons on the left-hand controller.
Moving a joystick without pressing its button moves the viewpoint around the 3D environment. Right joystick: move up/down, strafe left/right. Left joystick: rotate left/right, move backwards/forwards.
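As a very rough illustration of how such a layered mapping could be stored and looked up (plain Python; the input names, modifier flags and action strings are made-up placeholders, not actual Blender or Oculus SDK identifiers):

```python
# Hypothetical layered shortcut map for the right-hand controller.
# The key is (grip_held, stick_pressed, input); the value is the action.
RIGHT_CONTROLLER_MAP = {
    (False, False, "trigger"):     "grab",
    (False, False, "button_a"):    "scale",
    (False, False, "button_b"):    "rotate",
    (False, True,  "stick_up"):    "shortcut_1",
    (False, True,  "stick_down"):  "shortcut_2",
    (False, True,  "stick_left"):  "shortcut_3",
    (False, True,  "stick_right"): "shortcut_4",
    (True,  False, "trigger"):     "shortcut_5",
    (True,  False, "button_a"):    "shortcut_6",
    (True,  False, "button_b"):    "shortcut_7",
    (True,  True,  "stick_up"):    "shortcut_8",
    # ... and so on for the remaining combinations
}

def resolve_action(grip_held, stick_pressed, user_input):
    """Return the action bound to the current controller state, if any."""
    return RIGHT_CONTROLLER_MAP.get((grip_held, stick_pressed, user_input))

# Holding grip and pressing A would fire "shortcut_6":
print(resolve_action(True, False, "button_a"))
```

The same table, with a second dictionary for the left-hand controller, would cover the full 28-44 combinations described above.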

For the desktop screen I feel a switchable view would work best: one option shows the regular Blender UI, so I can use Blender on my desktop monitor and in the VR headset at the same time, and the other option mirrors on my monitor whatever I am viewing in my Oculus Rift headset.

I really feel the Blender Foundation should get a team on building this, and soon.

Plus, a variant of the application could be used for Microsoft HoloLens.

Lol, the accuracy and robustness of Oculus Touch / Vive wands can't even be compared to PS Move tracking. PSVR has the worst tracking of the top three VR systems. I would imagine you wouldn't want inferior, laggy and inaccurate tracking for motion controls and the HMD in Blender.

Is this HMD branch still being developed? This would be a great area to expand into, especially combined with the upcoming 2.8 Eevee realtime viewport! At least in the industry I work in there is a large shift towards realtime rendering with interactive viewing (e.g. with an HMD), as opposed to static renders.

Interesting read.

I think VR needs a radically new user interface design, none of this trying to squeeze the Blender UI into a virtual world.

Trying to convert keyboard+mouse (and fine precision) into HMD+hand controller is not going to be pretty. In my opinion, the only way to replace the keyboard and mouse for accessing the hundreds of commands these apps have is voice-activated commands, which remove 90% of the hand-waving.

e.g.
You say “Create”

and a UI list appears with all the voice-activated create commands on it. Simply moving one finger back and forth scrolls the active list; no need to aim at it either, it just knows that a finger swipe up means scroll up and a swipe down means scroll down.

This list is simple. What do you want to create?

New Scene
Model
Material
Particle system
etc

You can tap your finger or speak the item.

e.g.

You say “Create New scene” or “New scene” if you are on this particular list.
or you say “Create” and then tap your finger to select the first item in the list.

Showing the commands lets users learn what they can drill down into while they get to grips with voice commands.

As for manipulation in coordinate space, that is direct wand use, but it could also have voice commands like "move selection 10 100 200".
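As a very loose sketch of how such a voice-driven menu could be wired up (plain Python; `handle_phrase` and the command names are invented for illustration, and the actual speech recognition is left out entirely):

```python
# Minimal sketch of a voice-command tree: a spoken word either shows a
# drill-down list or runs an action directly.
COMMANDS = {
    "create": {
        "new scene":       lambda: print("creating new scene"),
        "model":           lambda: print("creating model"),
        "material":        lambda: print("creating material"),
        "particle system": lambda: print("creating particle system"),
    },
}

def handle_phrase(phrase):
    words = phrase.lower().split()
    if words[0] == "create":
        menu = COMMANDS["create"]
        rest = " ".join(words[1:])
        if rest in menu:
            menu[rest]()                       # "create new scene" runs it directly
        else:
            print("Create what?", list(menu))  # show the scrollable list
    elif words[:2] == ["move", "selection"]:
        x, y, z = (float(v) for v in words[2:5])
        print(f"moving selection by ({x}, {y}, {z})")
    else:
        print("unknown command:", phrase)

handle_phrase("create")                     # shows the create list
handle_phrase("create new scene")           # runs immediately
handle_phrase("move selection 10 100 200")  # coordinate-space command
```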

Just my 2p, as I think VR interfacing needs to go in a different direction for business apps if it's going to be intuitive.

Cheers

You can scale the input so a large movement of the wand becomes a small movement of the cursor, and a better dead-reckoning hybrid-tracking 3D mouse/wand is bound to hit the market soon.
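Something like the following, as a minimal sketch (plain Python; the 0.1 factor is just an example value, not anything from the branch):

```python
# Scale wand motion down so a large hand movement becomes a small,
# precise cursor movement: scale=0.1 means 10 cm of hand travel
# moves the 3D cursor roughly 1 cm.
def scaled_cursor_delta(prev_wand_pos, wand_pos, scale=0.1):
    return tuple((b - a) * scale for a, b in zip(prev_wand_pos, wand_pos))

print(scaled_cursor_delta((0.0, 0.0, 0.0), (0.10, 0.02, 0.0)))  # ~ (0.01, 0.002, 0.0)
```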

I thought it would just require increased polling, but from reading, it seems that drift is the main culprit holding back precision. Rift/Vive tracking has sub-millimetre accuracy, but drift creeps in and makes it inaccurate again; the base stations try to counter drift, which works great for games and general work but not for something like CAD. No doubt they will overcome it in time, as Sony, HTC and Oculus all have drift issues. Interesting read: https://www.reddit.com/r/oculus/comments/47nbq3/what_is_the_tracking_sampling_rate_for_the/

But, as you say, you could scale movement, though you will still get a jittery pointer when attempting any precision, and you are doing extra work with your body. Maybe there could be a way to gently zoom (scale) the world in when working at micro-movement levels, turning micro movements into slightly bigger ones (whatever the perfect threshold is), to overcome it?

Saying all this, could you hold your arms up and move them about in the air in front of you for a standard 8-hour work day? I doubt any 3D designer of today could. A couple of hours maybe, and you would be knackered. The VR people of the future will look like athletes, lol.

Edit: software needs to get smarter to compensate for the VR user. A mouse, for example, resists movement, so if your hands tend to shake a tiny bit, that shaking is not translated into the pointer as much. The same person holding an air device is subject to muscle fatigue, the shakes, etc., so the software has to compensate for these external influences. Predictive snapping could be one solution. You can test your own stillness with a laser pointer :slight_smile:
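One simple form of that compensation would be smoothing the tracked wand position before it drives the pointer, e.g. an exponential filter that damps high-frequency shake at the cost of a little lag (a generic sketch, not code from the HMD branch; the 0.2 factor is arbitrary):

```python
# Exponential smoothing of a tracked wand position: the pointer follows a
# weighted blend of the new sample and its previous position, filtering
# out small hand tremor at the cost of a little latency.
class SmoothedPointer:
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # lower alpha = heavier smoothing, more lag
        self.pos = None

    def update(self, raw_pos):
        if self.pos is None:
            self.pos = raw_pos
        else:
            self.pos = tuple(self.alpha * r + (1 - self.alpha) * p
                             for r, p in zip(raw_pos, self.pos))
        return self.pos

pointer = SmoothedPointer()
for sample in [(0.0, 0.0, 0.0), (0.002, -0.001, 0.0), (0.001, 0.002, 0.0)]:
    print(pointer.update(sample))   # jittery samples converge smoothly
```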

Hello, any news from that branch?
I actually get a buffer overflow when activating the viewport (the button in the side panel) with a local build of the HMD_viewport branch (dating from April) on Mint 18.2 (Ubuntu 16.04), so I can't even try it (DK2, Vive and OSVR HDK1 at hand…) :confused:

OpenHMD will probably get positional tracking quite soon (they're working on it: http://www.openhmd.net/index.php/2017/07/13/hackathon-2017-report/), so it would be nice to have Blender integrate this into a 2.79 release, as advertised in the past :slight_smile: