3D Display for Blender! + Webcam

OK, I assume you have all seen this; if not, watch it now or you won't know what I'm talking about:
YoutubeVideo

I've also seen a similar thing done using a webcam for face tracking. I thought it would be cool if someone could make the 3D perspective view behave in a similar way; that way we could have a 3D view with real depth. I sadly know nothing about anything I've just said :frowning: but it's an idea anyway. LOL
LinktoFaceAPIVideo

I wish I could program just a little. I would have so much fun.

Well, then learn how to program. Where's the problem? If you really want to do it, I don't see what's holding you back. I'm learning how to model right now without an ounce of visual artist in me. I can't draw and don't have a feeling for shape or composition, but it's still great fun, I'm progressing quite well, I'm gaining a feel for forms and proportions, and my skill at using the program and producing interesting visuals grows by the day. So I can't see why the same shouldn't be possible with programming. Of course it's unrealistic to dive into Blender hacking right away, but I'm not trying to do Blizzard-style visuals as a starting project either. Start small, stick with it, and work your way up. Code an hour or two a day on small projects that can actually be finished and you'll get there.
Then again, programming comes naturally to me, so I might be wrong, but I'd really be surprised if you couldn't do it.

There's a free C/C++ editor/compiler for Windows at http://www.bloodshed.net. If you're on Linux, you're in programmer's heaven anyway (gcc is painlessly available and there's a plethora of open-source editors). On Mac I don't really know, but I'd guess it's similar to Linux since it's Unix-based.

The videos you were showing use FaceAPI, which is unfortunately closed source. However, something similar has been implemented using OpenCV, which is developed by Intel and is open source under the BSD license. Sadly, the homepage is currently down.
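Just to give an idea of how approachable the detection side is, here's a minimal sketch using OpenCV's Python bindings and the Haar cascade files that ship with it. The cascade paths and tuning parameters below are assumptions and would need adjusting for your install:

```python
# Minimal face/eye detection sketch with OpenCV's Python bindings.
# The cascade file paths are assumptions -- point them at wherever your
# OpenCV install keeps its bundled Haar cascades.
import cv2

face_cascade = cv2.CascadeClassifier("haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier("haarcascade_eye.xml")

cap = cv2.VideoCapture(0)  # first webcam on the system
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        roi = gray[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            cv2.rectangle(frame, (x + ex, y + ey),
                          (x + ex + ew, y + ey + eh), (255, 0, 0), 2)
    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```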

A nice video is here.

However, while this would be a nice toy, I don't really see the use of it. When I rotate my models I usually do so by a larger angle than would be achievable by moving my head within the viewing area of my webcam. Also, the quality of my display (and many other consumer TFT monitors) degrades considerably when viewed from the side, and I don't think people would be willing to wobble around in their chair in front of the monitor just to view the model from a slightly different angle.

Now making the Blender viewports stereoscopic - that would rock! However, I have no idea how to do this or if it would even be possible. I wouldn't have to stare at a flat image of my model anymore, constantly rotating it, wondering if the shapes are any good!

Maybe it's just eye candy, but I think it's amazing…

Lately I was wondering whether it would be possible to use camera-tracking/mocap technology together with a webcam to make input devices, like a pen for sculpting. For instance, holding a cube (or similar) object in one hand to capture the sculpture's rotations/movements, and a special pencil (easy for the mocap to pick up) to capture the position and direction of the sculpt tool. (I'll show the idea when I have the time.)

Other input devices could be made if this kind of thing is possible. And this "head tracking" thing would be a great complement.

That's indeed a fun idea and could very possibly be done with OpenCV. There are already digital sculpting pens out there that even provide haptic feedback, but they are expensive and bulky because they come with an articulated arm to read the pencil position and provide the feedback.
With OpenCV (or something similar) this could be done with a webcam and some really cheap home-made tools (a specially prepped "pen" and a cube or whatever).
Pity that my built-in webcam doesn't work with Linux, or I just might have given it a try… Anyway, such large features shouldn't be started before 2.5 is here, so who knows what happens till then. The internal refactoring of 2.5 should make things like this a lot easier to implement, though.
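For anyone whose webcam does work and who wants to experiment with the pen idea, the colour-tracking part could be prototyped in a few lines with OpenCV's Python bindings. This is only a rough sketch under the assumption that the pen tip is painted a single saturated colour; the HSV bounds are made up and would need tuning:

```python
# Rough sketch: track a brightly painted pen tip with a webcam by
# thresholding its colour in HSV space and taking the biggest blob.
import cv2
import numpy as np

LOWER = np.array([0, 120, 120])   # hypothetical HSV lower bound for the tip colour
UPPER = np.array([15, 255, 255])  # hypothetical HSV upper bound

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER, UPPER)
    # [-2] picks the contour list regardless of the OpenCV version's return signature
    contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
    if contours:
        tip = max(contours, key=cv2.contourArea)  # biggest blob = pen tip
        m = cv2.moments(tip)
        if m["m00"] > 0:
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            cv2.circle(frame, (cx, cy), 6, (0, 255, 0), -1)
            # (cx, cy) is the 2D tip position; blob size or a second camera
            # could give a rough depth estimate for the sculpt tool.
    cv2.imshow("pen", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```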

It's nice, but actually I think this technique sucks because you need a Wii Remote and have to mount the receiver on your head.

EDIT: I was not aware of OpenCV. ^^

But his video gave me an idea… if someone wants to code it:

There are certain points in the human face that are fixed, like the eyes and ears.
Face recognition is one of the basics of image processing.
So simply put a webcam on your computer and calibrate the system once while looking straight at the screen; the system calculates the distance between the eyes and some other facial markers. Then, based on the images recorded, you can calculate the rotation and position of the head relative to the camera… it's almost like augmented reality or face tracking, but the other way round.
Based on these rotation and position offsets you can adjust your desktop accordingly.
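The geometry behind that calibration step is pretty simple. Here's a back-of-the-envelope sketch in plain Python; the eye coordinates are assumed to come from whatever face tracker is used, and the field-of-view value is just a guess:

```python
# Back-of-the-envelope head-offset estimate from two tracked eye positions.
# Assumes a one-off calibration stored the inter-eye pixel distance d0 while
# sitting at a known distance z0 and looking straight at the screen.
import math

def head_offset(left_eye, right_eye, frame_w, frame_h, d0, z0, fov_h_deg=60.0):
    lx, ly = left_eye
    rx, ry = right_eye
    d = math.hypot(rx - lx, ry - ly)

    # Apparent eye spacing shrinks roughly linearly with distance (pinhole model).
    z = z0 * d0 / d

    # Offset of the eye midpoint from the image centre -> rough horizontal
    # and vertical angles of the head relative to the camera axis.
    mx, my = (lx + rx) / 2.0, (ly + ry) / 2.0
    px_per_deg = frame_w / fov_h_deg
    yaw_deg = (mx - frame_w / 2.0) / px_per_deg
    pitch_deg = (my - frame_h / 2.0) / px_per_deg

    # Tilting the head rolls the eye line away from horizontal.
    roll_deg = math.degrees(math.atan2(ry - ly, rx - lx))

    return z, yaw_deg, pitch_deg, roll_deg
```

Feed those offsets into the viewport camera every frame and you'd get the "looking through a window" effect from the Wii video.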

The disadvantage of my idea is that you have to create user profiles and store the biometric information of all users, while with the other method everyone can just put on a baseball cap.

Sounds easy, but it requires hardcore coding and the development of some decent algorithms. It would be worth a shot to implement it, though. The framework could be built first with the simple pattern recognition used for augmented reality, and the face recognition could be added afterwards.

I already played with the AR toolkit seen here:

The open-source toolkit calculates the position of the marker in real time, creates a corresponding coordinate system and puts a 3D model there, overlapping the real video footage. Real-time camera tracking, more or less. If you reverse this process, you can turn and move your desktop according to the marker.
Once this works, replace the marker recognition with face recognition.
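"Reversing the process" would basically mean applying the inverse of the marker's pose to the view instead of drawing a model at the marker. A tiny sketch of that, assuming the tracker hands you 4x4 transforms (the function and its inputs are hypothetical, just to show the idea):

```python
# Instead of drawing a model at the marker's pose, move the viewpoint by the
# inverse of how the marker (i.e. your head) has moved since calibration.
import numpy as np

def view_from_marker(marker_pose, calibration_pose):
    # marker_pose:      4x4 marker->camera transform reported this frame
    # calibration_pose: same transform captured once while looking straight on
    delta = marker_pose @ np.linalg.inv(calibration_pose)  # marker motion since calibration
    return np.linalg.inv(delta)  # move the view the opposite way so the scene appears fixed
```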

EDIT2: And someone already did exactly what I mean:


I guess no one looks at his video because he isn't the fancy Wii Remote modder… ^^

We even have the tools to model a "standard pen/rotator" and share it with the community to make a physical 3D print of it. Affordable and well made. Want your own grip? Just modify the .blend file.

(Sorry if this is a bit off-topic.)

Gah, that Wii tracking is so insanely awesome.

Someone should make an FPS for the Wii using that stuff; it would sell, no doubt.