DAVID-Laserscanner hand in hand with Blender?

Hi Blender world, greetings from the DAVID community!

I’m MagWeb, a member of the DAVID community (http://www.david-laserscanner.com), asking about common interests and aims.

DAVID is a “Definitely Affordable Vision Device”: laser-scanner freeware that uses a simple webcam (about $65) and a line laser (about $70). With this hardware you can get excellent scanning results like these: http://www.david-laserscanner.com/wiki/user_page/sunset_lion

I found some recent posts in this forum, but our community has grown a lot since then and our results improve every day.

I think it would be a nice idea to establish a direct link to Blender to make each other’s work easier… some gateway scripts that process DAVID meshes in Blender, and the other way around…
Please join in and have a look…


Hey Gunter.

The main site appears to be down.

I played with DAVID a bit at the start of the project and really liked it. The results look really promising. I’ve been wondering how good/simple the new stitching feature/program is… since I haven’t had the time to try it myself yet.

What sort of thing were you thinking of in relation to Blender? The DAVID software already exports .obj and Blender imports it…

Hello macouno

The stitching tool works… but it’s new and a work in progress. It has some problems, especially stitching the back and front sides of flat objects (like a coin), but there is always the option to align manually…

Sure, Blender handles .obj files; I did my renderings of the linked examples with Blender. I was thinking of some special import scripts, e.g.: DAVID calculates in X+/Z+ coordinates and places the scan in that quadrant of space. Result: you always have to rotate the object 45° to get a front view, and to fit it into a preset camera you have to move and scale the mesh up/down… all unnecessary work if there were a script to import files of DAVID origin.
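To make the idea concrete, here is a minimal plain-Python sketch of what such an import helper could do with the vertices parsed from a DAVID .obj (this is my own illustration, not an existing DAVID or Blender script; the choice of Y as the up axis, the −45° angle, and the function names are assumptions):

```python
import math

def rotate_y(v, angle_deg):
    """Rotate an (x, y, z) vertex around the Y axis by angle_deg degrees."""
    a = math.radians(angle_deg)
    x, y, z = v
    return (x * math.cos(a) + z * math.sin(a),
            y,
            -x * math.sin(a) + z * math.cos(a))

def recenter(verts):
    """Translate vertices so their bounding-box center sits at the origin."""
    xs, ys, zs = zip(*verts)
    cx = (min(xs) + max(xs)) / 2.0
    cy = (min(ys) + max(ys)) / 2.0
    cz = (min(zs) + max(zs)) / 2.0
    return [(x - cx, y - cy, z - cz) for x, y, z in verts]

def prepare_david_verts(verts, angle_deg=-45.0):
    """Rotate the mesh out of the X+/Z+ quadrant, then recenter it on
    the origin, so it faces a front-view camera after import."""
    return recenter([rotate_y(v, angle_deg) for v in verts])
```

Feeding the `v`-line coordinates of a DAVID-exported .obj through `prepare_david_verts` would spare the manual rotate/move steps described above; an actual Blender import script would apply the same transform to the imported mesh object.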

Users also ask for a simple way to publish their scans as movies of rotating objects. Sure, Blender can do this, but couldn’t that work be reduced?

But those are just some ideas from a guy who works with DAVID and merely uses (while being fascinated by) Blender.

I think there are some common interests. What are yours?


Well, I think adapting the .obj importer would be a bit much, but there are things that can help.

You can, for instance, use the panorama maker script (BGC) to get snapshots from all around your model. There are JavaScript applets and such that then allow web visitors to look at your model from all sides.

Personally I have a lot more experience with Blender than DAVID, so this sort of stuff is basic enough for me not to need scripts/help with it. I would think a ‘tutorial’ explaining to DAVID users how to make a simple rendering would be more useful in the long run than automating everything.

Thanks macouno,

for the link: I’m going to give it a try.

I wrote a tutorial in the DAVID wiki. But since I’m far from being a Blender expert:


Could you have a look at it? There might be room for improvement.

thanks again,


Looks good basically… though instead of putting the camera on a path, I would just put an “empty” in the middle of the scene and make the camera a child of that empty. If you then simply animate the empty and make it rotate, it’ll swing the camera around.
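Numerically, a camera parented to a spinning empty at the origin just traces a circle around it, frame by frame. A small plain-Python sketch of that turntable path (the radius, height, and frame count are made-up example values, and the function is mine, not part of Blender or DAVID):

```python
import math

def turntable_positions(radius, height, frames):
    """World-space camera positions for one full orbit around the
    vertical axis: what a camera parented to an empty at the origin
    traces out when the empty rotates 360 degrees over `frames` frames."""
    positions = []
    for f in range(frames):
        angle = 2.0 * math.pi * f / frames
        positions.append((radius * math.cos(angle),
                          radius * math.sin(angle),
                          height))
    return positions
```

In Blender itself you would not compute these positions by hand; parenting the camera to the empty and keyframing the empty’s rotation gives the same orbit for free, which is exactly why the empty trick is simpler than a path.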

OK, I’ve got you.

I visited your link and discovered your scans; the WWW is a really small village.
Just a note on your scan tutorial 1:
In the live camera image of DAVID-Laserscanner at “Proper scan setup” you can see that the calibration-point pattern is fixed unevenly (wavy) on the planes: the laser line is bowed, and that affects your result directly. I prefer to fix the pattern on separate panels that are removed after calibration. To do this you have to edit the calibpoints.dat file in DAVID’s folder (add the panels’ thickness to the X and Z values where they equal zero).

I think you scanned with version 1.3; the current 1.5 version has improved a lot.

If you try to fuse partial scans as shown on your clay-sculpture page, you will fail: Shapefusion’s automatic mode searches for matching surface areas, and your overlap areas would be too small with only 4 scans. You should take at least 6, better 8, scans for a 360° model.


Oh yeah, that was my first result… the second one was quite a bit better. I think it was with the first public version of DAVID… no idea about the version number. I’ll have to find some time to play with DAVID/Shapefusion… I’ll have to rebuild my ‘corner’ though, since I moved and I don’t have much time now… I hope more people start doing this stuff… it’s quite fun.