Anyway, apart from what I said, does anyone know what I have to do to get it working with the Vive?
I tried the GraphicAll version with a Vive. Nothing is displayed in the headset, but on the computer screen I can see that the tracking works (rotation only, no positional movement); the left/right views rotate properly.
There was a larger commit on April 13th according to the developer site (https://developer.blender.org/D2133); does anyone have a newer build with this to test?
I have a Daydream, USB 3.0, and a Google Pixel.
I would like to be a poor man's Blender VR tester.
(get devs up and running on the cheap)
Oculus Thrift style. I like it. I tried that via Cardboard while USB-tethered. I didn’t bother with head tracking, as previs was facilitated via Shift+F and using the viewer like a microscope when needed. I’ve seen examples using a Wiimote and GlovePIE for that, though.
Example vid attached. Software used: http://gaminganywhere.org/download.html
SBS_Previs.zip (947 KB)
Oculus Thrift works best when mixed with a PS Move controller, apparently.
Also just saw this
Back at CES we met a small creative studio with some big technical plans for the HTC Vive’s upcoming Tracker peripheral.
The team’s name was Master of Shapes, and they were one of the groups that HTC had gathered to showcase some of the many applications for the new device. Specifically, they made a technical demonstration for makeshift local multiplayer VR, called Cover Me. The tracker was attached to a phone, which itself was attached to a gun peripheral, allowing someone to look into a VR user’s world and help them out by shooting enemies. The potential for multiplayer was huge, but it also showcased how the Tracker could be used to make smartphone-based headsets positionally tracked.
At the time, Master of Shapes said that was “fully doable”. Now? It’s been done.
I feel that a standalone app on the Oculus Store / Steam store would be the best option: the 3D view as the main 3D world, plus a virtual, addable/removable monitor with the Blender UI, using the Oculus controllers’ virtual laser pointer as a mouse pointer for the UI.
Assign the controller buttons as common and/or customisable shortcut keys. Example for the right-hand controller:
- Trigger to grab, A button to scale, B button to rotate, grip button as Shift or Alt.
- Depress and hold the joystick button, then push a direction on the joystick, for 4-8 shortcut keys.
- Grip button plus depressed joystick button plus joystick directions for another 4-8 shortcut keys.
- Grip button plus A, B, or Trigger for another set of shortcut keys.
This would give 14-22 shortcut keys per Oculus controller; ×2 controllers = 28-44 total shortcut keys. Maybe depressing the joystick button pops up a selection UI to choose a shortcut key; this would work best for selecting among the 8 joystick-direction shortcuts, and that UI should disappear when the joystick button is released. Depressing the grip button could pop up an alternative shortcut-key UI for the joystick, Trigger, A, and B buttons. Using the grip button plus the joystick without depressing the joystick button could give another 4-8 joystick shortcut keys, for 18-30 shortcut keys per controller, but people might get frustrated by accidentally triggering an unintended shortcut when moving the viewpoint.
Same thing with the buttons on the left hand controller.
The joystick, without depressing the joystick button, moves the viewpoint around the 3D environment. Right joystick: move up and down, strafe left and right. Left joystick: rotate left and right, move backwards and forwards.
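As a rough sketch of how such a chorded button layout could be represented in software: a lookup table keyed by hand, modifier, and button. All the names below (the table, the operator strings, the function) are hypothetical illustrations of the proposal above, not an existing Blender or Oculus API.

```python
# Hypothetical chorded VR keymap: (hand, modifier, button) -> action name.
# A modifier of None means the button is pressed on its own.
VR_KEYMAP = {
    ("right", None, "trigger"): "transform.translate",    # grab
    ("right", None, "a"): "transform.resize",             # scale
    ("right", None, "b"): "transform.rotate",
    ("right", "grip", "trigger"): "object.duplicate_move",
    ("left", None, "trigger"): "view3d.select",
}

def resolve_action(hand, modifier, button):
    """Return the action bound to a button chord, or None if unbound."""
    return VR_KEYMAP.get((hand, modifier, button))
```

With the grip button acting like Shift/Alt, each physical button effectively doubles up, which is where the 14-22 keys per controller estimate comes from.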
For the desktop screen, I feel a switchable display would work best. One option shows the regular Blender UI, so I can use Blender on my desktop monitor and in the VR headset at the same time; the other mirrors on my monitor what I am viewing in the headset while using Blender with my Oculus Rift.
I really feel the Blender Foundation should get a build team on this, and soon.
Plus, a variant application could be used for the Microsoft HoloLens.
Lol, the accuracy and robustness of Oculus Touch / Vive wands can’t even be compared to PS Move tracking. PSVR has the worst tracking of the top three VR systems. I would imagine one wouldn’t want inferior, laggy, and inaccurate tracking for motion controls and the HMD in Blender.
Is this HMD branch still being developed? This would be a great area to expand into, especially combined with the upcoming 2.8 Eevee realtime viewport! At least in the industry I work in there is a large shift towards realtime rendering, with interactive-viewing (eg. with a HMD), as opposed to static renders.
I think VR must have a radically new user interface design; none of this trying to squeeze the Blender UI into a virtual world.
Trying to convert keyboard+mouse (and fine precision) into HMD+hand controller is not going to be pretty. In my opinion, the only solution for replacing the keyboard and mouse to access the hundreds of commands these apps have is voice-activated commands, which remove 90% of flapping your hands about.
You say “Create”
and a UI list appears with all the voice-activated create commands on it. Simply moving one finger back and forth scrolls the active list; no need to aim at it either, it just knows that a finger swipe up means scroll up and a finger swipe down means scroll down.
This list is simple. What do you want to create?
You can tap your finger or speak the item.
You say “Create New scene” or “New scene” if you are on this particular list.
Or you say “Create” and then tap your finger to select the first item in the list.
Showing the commands lets users see what they can drill down into while they get to grips with voice commands.
As for manipulation in coordinate space, that is direct wand use, but it could also have voice commands like “move selection 10 100 200”.
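A toy sketch of how a spoken command like “move selection 10 100 200” could be turned into an action plus a vector, assuming a speech recogniser has already produced plain text. Everything here is hypothetical illustration, not an existing Blender API.

```python
# Sketch: parse "move selection X Y Z" into ("move", (x, y, z)).
# Assumes speech-to-text has already produced the string.
def parse_voice_command(text):
    """Return ("move", (x, y, z)) for a recognised move command, else None."""
    words = text.lower().split()
    if len(words) == 5 and words[0] == "move" and words[1] == "selection":
        try:
            return ("move", tuple(float(w) for w in words[2:5]))
        except ValueError:
            return None  # numbers didn't parse
    return None  # not a recognised command
```

A real implementation would dispatch into a command tree (“Create” → list of create commands, as described above) rather than hard-coding one verb.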
Just my 2p, as I think VR interfacing needs to go in a different direction for business apps to be intuitive.
You can scale the input so a large movement of the wand is a small movement of the cursor, and a better dead-reckoning hybrid-tracking 3D mouse/wand is bound to hit the market soon.
I thought it would just require increased polling, but from reading, it seems that drift is the main culprit holding back precision. Rift/Vive tracking has sub-mm accuracy, but drift occurs, making it inaccurate again; the base stations try to stop drift, which works great for games and general work, but not for something like CAD. No doubt they will overcome it in time, as Sony, HTC, and Oculus all have drift issues. Interesting read: https://www.reddit.com/r/oculus/comments/47nbq3/what_is_the_tracking_sampling_rate_for_the/
But, as you say, you could scale movement; you will still get that jittery pointer when attempting any precision, though, and you are doing extra work with your body. Maybe there could be a way to zoom (scale) the world in gently when working at micro-movement levels, turning micro movements into slightly bigger ones (whatever the perfect threshold is), to overcome it?
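The scaling idea itself is trivial to express; a minimal sketch, where the scale factor (here a hypothetical `precision_scale`) would in practice be tied to the zoom level suggested above:

```python
# Sketch: map a large physical wand movement to a small cursor movement.
# precision_scale < 1 shrinks motion for fine work; it could be driven
# by the current world zoom level rather than being a constant.
def scale_wand_delta(delta_mm, precision_scale=0.25):
    """Scale a raw wand displacement (per-axis, in mm) for the cursor."""
    return tuple(d * precision_scale for d in delta_mm)
```

This shrinks jitter by the same factor as intentional motion, which is why it helps but does not fully solve hand tremor.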
Saying all this, could you raise and then move your arms about in the air in front of you for a standard 8-hour work day? I doubt any 3D designer of today could. A couple of hours maybe, and you would be knackered. The VR people of the future will look like athletes, lol.
Edit: software needs to get smarter to compensate for the VR user. A mouse, for example, resists movement, so if your hands tend to shake a tiny bit, that shaking is not translated into the pointer as much. The same person holding an air device is subject to muscle fatigue, the shakes, etc., so the software has to compensate for these external influences. Predictive snapping could be a solution for some of it. You can test your own stillness with a laser pointer.
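One common way software compensates for tremor is low-pass filtering the tracked position. A minimal sketch using an exponential moving average (real trackers use fancier filters that trade smoothing against lag, but the idea is the same); the class and its parameters are illustrative, not any shipping SDK:

```python
# Sketch: smooth a jittery tracked position with an exponential moving
# average. Smaller alpha = smoother but laggier pointer.
class PositionSmoother:
    def __init__(self, alpha=0.2):
        self.alpha = alpha   # weight of the newest sample, 0 < alpha <= 1
        self.state = None    # last smoothed position

    def update(self, pos):
        """Feed one raw (x, y, z) sample; return the smoothed position."""
        if self.state is None:
            self.state = tuple(pos)
        else:
            self.state = tuple(
                self.alpha * p + (1.0 - self.alpha) * s
                for p, s in zip(pos, self.state)
            )
        return self.state
```

The downside is exactly the fatigue trade-off described above: heavier smoothing makes tremor invisible but also makes deliberate fine motion feel sluggish.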
Hello, any news on that branch?
I actually get a buffer overflow when activating the viewport (the button in the side panel) with a local build of the HMD_viewport branch (dating from April) on Mint 18.2 (Ubuntu 16.04), so I can’t even try it (DK2, Vive, and OSVR HDK1 at hand…).
OpenHMD will probably get positional sensing quite soon (they’re working on it: http://www.openhmd.net/index.php/2017/07/13/hackathon-2017-report/), so it would be nice to have Blender integrate this into a 2.79 release, as advertised in the past.
The solution for drift and fatigue is a solid mechanical exoskeleton with force feedback as well as ‘weightlessness’.
No one is making them yet.
Is there any news on development? I recently started with Gravity Sketch and tried Oculus Medium, but I would really like to stick to my primary weapon of choice for designing.
I use Blender to design ships/yachts and would love to have the ability to do this in VR with my room-scale Rift setup. Looking at Verto Studio VR, it has to be possible to get Blender’s UI into VR. (I’m not a developer, so that’s probably an easy thing to say.)
I own a CV1, a DK2, and a DK1.
I would like to test the ability to use an HMD with Blender (CV1 preferred).
I currently have the official Blender 2.79 installed.
I am not able to see any of the options mentioned in this thread.
I am not able to download the Windows release mentioned in the first post: it returns a “network error” after a few MB downloaded.
Do you have any procedure for testing OpenHMD with Blender today?
- Do we need to compile special code?
- Do we need to download and run a special release (if yes, do you have a working URL)?
- Is it already included in 2.79 or 2.8? Is it a plugin to install?
GraphicAll seems to be having issues lately; it’s not my site, so I have no control over it.
Easiest is probably to just hope GraphicAll gets it together and download a build from there.
You could build it yourself if you want. That being said, we have done a bunch of lib upgrades since the last update to the openhmd branch, and I’m not sure if it will currently build without too much trouble. If you have not built Blender before, this is probably not a good option.
GraphicAll works now.
a) If it doesn’t come through at first, try again.
b) If it still doesn’t work, use a download manager (JDownloader2 or similar).
Note: add ‘/download’ at the end of the address (e.g. http://graphicall.org/1212/download).
c) mirror link (in case all above fail): 12258_blender-2.78.0-git.792f0aa-windows64.7z (@zippyshare)