Blender + Robots + Real-Time Interactivity = Awesome!?

Hi everyone, I just posted a request for help on this project in the technical support section, but I thought I would start a discussion here about the project in general because I think a lot of people might be interested.

First off, I am a mechanical engineer who discovered Blender just a month ago, and I’m in love. This is a really impressive and powerful program, and I am totally blown away by its quality, integrated software packages, and nearly endless features - I still can’t believe it’s free and open source. I’ve been silently (and at times frustratedly) slaving away on tutorials and googling for information, but today I’m breaking the silence and reaching out to fellow Blenderers.

My goal is to use Blender as a simulation and visualization tool for professional engineering and hobbyist work, and for university/high-school-level mechatronics and computer science education - with an aim at interactive 3D physics simulations running simultaneously with real-world hardware.

I’ve used expensive and proprietary software tools before to do mechatronics/robotics projects, but I love the idea of anyone being able to pick up what I’m doing and learn from it (or teach me from it). It seems to me - and I’ve seen the Blender for Robotics people agree - that Blender is a fantastic framework for all kinds of 3D work, and has a great development group to boot.

I’m starting off by trying to recreate an updated version of Justin Dailey’s “Real-time robotic arm controlled by Blender” project. http://justindailey.blogspot.com/201...botic-arm.html

I haven’t seen anybody try this project in Blender 2.5 yet. Since the API has been reworked, I’ve had to start from scratch using Justin’s code as a guide. I’m at the point where my only major stumbling block is getting PySerial to work on a 64-bit Windows 7 machine - but that’s a topic for the troubleshooting forum.

I am also going to use Blender to control more than just RC servo motors: I have plans to do SLAM (Simultaneous Localization and Mapping) with some mobile platforms, to run and simulate a 5-axis Scorbot ER-1 robot arm with encoder-feedback DC motors, and a lot of other interesting things.

I know the Blender for Robotics project is working on improving Blender’s capabilities for more stable interactive IK and easier hardware interfacing (among other things), but I’ve had trouble finding up-to-date information on what’s going on. (I’m on the mailing list but haven’t heard any news yet.)

Some of the things that Blender could make very easy include dynamic modeling and refinement of complex control systems (things like Segways and walking robots), using physics simulations of an environment as a real-time tool for AI decision making in the real world, and building dynamic, real-time 3D models of an environment using limited hardware and sensors. Many of these things are being done right now, but they require very expensive and specialized software. Blender already has fantastic real-time geometry handling, and many of the difficult problems that have been solved in Blender by video game developers, professional artists and animators, and contributing physics/computer science researchers are extremely useful in the fields of engineering simulation and prototype development for tomorrow’s technologies.

Sorry for rambling on a bit, but I just wanted to get all these ideas out in the open and I thought this forum would be a good place for an up-to-date discussion :).

Any thoughts, suggestions, beratings?

Thanks,
Curt

Can’t offer you any help or pointers, but this sounds very interesting and I’ll be keeping a close eye on this!

This is Super Cool. More than just Awesome.

There is a project called MORSE (http://www.openrobots.org/morse/doc/latest/morse.html) which already has some really decent stuff working.
It is based on Blender and features connections to several middlewares, sensors, and actuators (with modifiers). It is fully open source and uses Blender to simulate a virtual world, so data is acquired just as it would be in the real world.

At the moment only Linux is officially supported, but it also runs on Mac OS X and BSD.

Maybe this could save you some time, although it is currently only at version 0.4.

Thanks for taking the time to read through my long rant! And thanks for the MORSE suggestion - I’ve looked at MORSE before, but I only have a Windows 7 system and have not used Linux before. I’m seriously considering it, though, given the number of incompatibility problems I run into on a daily basis. I may not be able to switch since I use a company computer.

Anyway, I found a workaround for my problem and learned how to use client/server sockets to route information back and forth from Blender/Python to a script running in Processing. From there I can easily use Processing to communicate with the Arduino. It looks like this might be the only workaround until I can get PySerial to run on 64-bit Windows and/or it starts supporting it (if it doesn’t in fact already and I’m just a moron lol).
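To make the data path concrete, here is a minimal sketch of the kind of relay the Processing script performs - written in Python with PySerial purely for illustration (which is exactly the piece I can’t get working inside Blender right now). The host, port, COM port, and baud rate are all assumptions; adjust them to your setup.

```python
# Hypothetical stand-in for the Processing relay: a tiny TCP server that accepts
# joint-angle strings from Blender and forwards them to the Arduino's serial port.
import socket
import serial  # PySerial - running outside of Blender

HOST, PORT = "127.0.0.1", 5005            # where Blender's client connects (assumed)
arduino = serial.Serial("COM3", 115200)   # assumed Arduino port and baud rate

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind((HOST, PORT))
server.listen(1)

while True:
    conn, _ = server.accept()
    data = conn.recv(1024)                # e.g. b"12.3,45.0,-7.8,0.0,90.1,3.2\n"
    if data:
        arduino.write(data)               # forward the comma-separated angles as-is
    conn.close()
```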

So far I have a 6-DOF Kuka robot arm (downloaded from Kuka’s website) rigged with an armature and an IK constraint targeted at an Empty that acts as the target coordinate system. I’m using the Blender Game Engine to control the position and rotation of that Empty with the Logic editor and keyboard commands (A, S, D, W, Up, Down, Left, Right, C, Spacebar). Every time the logic loop is processed, an “Always” sensor (pulsing True) triggers a Python controller that extracts the 6 joint angles I want, creates a TCP client, and connects to the Processing server, which can then output to the serial port and tell the Arduino what to do.
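For anyone curious, here is roughly what that per-tick controller script looks like - a sketch rather than the exact code from my .blend. The bone names, host, and port are placeholders, and which joint_rotation axis you read depends on how the rig is oriented.

```python
# Per-tick BGE controller: read joint angles from the armature and send them over TCP.
import socket
import bge

HOST, PORT = "127.0.0.1", 5005   # must match the Processing server (assumed values)
BONES = ["Bone1", "Bone2", "Bone3", "Bone4", "Bone5", "Bone6"]  # hypothetical bone names

def main():
    arm = bge.logic.getCurrentController().owner   # the armature running the IK
    angles = []
    for ch in arm.channels:
        if ch.name in BONES:
            # joint_rotation exposes the bone's rotation as joint angles (radians)
            angles.append(ch.joint_rotation[1])    # axis choice depends on the rig
    msg = ",".join("%.3f" % a for a in angles) + "\n"

    # This early version opens a fresh connection every logic tick, as described above.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect((HOST, PORT))
    s.send(msg.encode())
    s.close()

main()
```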

Later I’ll create an open-client script that is called when the game starts and a close-client script that gets called when the game is terminated, so that I don’t waste time connecting to and disconnecting from the server every tick. Even as slow as it is now, I’m still getting an output of all 6 joint angles as floats (way more than enough digits) about 60 times a second.

I’ll probably test the code with some RC servos on Monday and let you know how it goes. Once I get a little better at scripting I’ll post the .blend file too so you guys can play with it.

One other problem is that the IK (inverse kinematics) solver is unstable in a non-keyframed solution. I read somewhere that this is because, outside of a keyframed animation, it acts as if it is initializing the IK solution every time it is called and doesn’t take the previous position into account. This causes the armature to “jump” discontinuously whenever you move the arm through, or anywhere near, a singularity (a position that has multiple simultaneously possible solutions). I’m going to try adding more joint constraints (limited rotation travel) and see if that helps. I’m also going to try freezing one of the degrees of freedom, since my real robot arm only has 5 anyway, and that should increase stability somewhat.
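For reference, here is a hedged sketch of the kind of joint constraints I have in mind, set from the bpy side on the pose bones in the IK chain. The armature and bone names and the angle ranges are assumptions - they would need to match whatever rig you’re using.

```python
# Add IK rotation limits to tame the jumps near singularities,
# and lock unused axes to reduce the effective degrees of freedom.
import math
import bpy

arm = bpy.data.objects["Armature"]     # hypothetical armature object name
elbow = arm.pose.bones["Bone3"]        # hypothetical bone name for the elbow joint

# Limit the elbow to bend on one axis only, within a restricted range
elbow.use_ik_limit_x = True
elbow.ik_min_x = math.radians(5)       # keep it away from the straight-arm singularity
elbow.ik_max_x = math.radians(150)

# Freeze the other rotational DOFs entirely (also how I plan to drop from 6 DOF to 5)
elbow.lock_ik_y = True
elbow.lock_ik_z = True
```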

Have a good weekend!
Curt

Welcome aboard, Curt. Great to have you here. I’m always happy to see engineers falling in love with Blender. I’m not particularly technical (English major) but my dad was an engineer, so I’ve always liked the type.
Can’t wait till you start tinkering with Cycles…
I’ve played with the game engine in the past. I’m interested in doing space animation, so I created a flight sim to control the little space fighters and starships in my movie, to give them convincing zero-g movement. However, since there are a lot of them, and since the BGE records a lot of key data to the graph editor, it all starts getting really sluggish pretty quickly. This may not be the best way to make a sci-fi movie… :smiley:
Anyhow, you’re not the only one who rambles in posts. I’m happy to see someone from the technical side of things do it too. I was beginning to feel guilty…
But this is a very cool project you have going here, and I wish you the best of luck. Hang with Blender; as you said, it’s an incredible package, and Ton is a saint for giving all this power, for free, to people who never would have been able to afford to pursue CGI filmmaking, art, game creation, and engineering and physics simulation otherwise.
And having experienced technical people, especially mechanical engineers, involving themselves in Blender deeply is a huge benefit to all of us. I hope you’ll stick around and add your abilities to the community in the future.
Again, welcome aboard. Glad you’re here.

MORSE has never been tested on Windows (AFAIK), but all of its components are platform-independent, so it should run. You may have to change some default paths, but that shouldn’t be a big problem.

Thanks again, everyone, for the feedback and warm welcome. I’ll be using a much larger and harder-to-interface robot arm soon, but for now I’ve uploaded a video of my proof-of-concept RC mockup running:

Problems Solved:

I changed the IK mode from “Animation” to “Simulation” and that made a world of difference in stability (I didn’t realise there was that option). Arm movement is now really nice and smooth.

Wrote the Python script as a collection of methods with some initialization up top. Now I only connect when the game engine starts, and the data throughput is awesome (way beyond 60/sec). A rough sketch of the structure is below, after this list.

Found a great set of servo-control programs for Processing and Arduino to run these RC servos over the serial port - the results are great, check it out for yourself!
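Here is roughly how the restructured script is organized - a sketch under the same assumptions as before (host/port values, bone layout), using a Python controller in Module mode so the module-level connection code only runs once per game session:

```python
# send_angles.py - run via a Python controller in "Module" mode (e.g. "send_angles.main"),
# so the module-level setup below executes only once, when the game engine first calls it.
import socket
import bge

HOST, PORT = "127.0.0.1", 5005      # must match the Processing server (assumed values)

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect((HOST, PORT))        # one connection for the whole game session

def main(cont):
    """Called every logic tick: read the joint angles and ship them out."""
    arm = cont.owner                # the armature object that owns this controller
    angles = [ch.joint_rotation[1] for ch in arm.channels]   # axis depends on the rig
    client.send((",".join("%.3f" % a for a in angles) + "\n").encode())

def close(cont):
    """Hook this to whatever ends the game so the socket is shut down cleanly."""
    client.close()
```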

Thanks for the nod to MORSE! - I took a closer look at their website and really liked what I saw. I’ll be testing their code on my machine in the very near future.

Awesome work! Any tutorials in the works? I am very interested in using this so keep us posted!

It’s possible that I may be doing some kinetic sculpture/robotic work for a project. If so, visualizing and even controlling the motors from Blender would be a huge asset and would make things much easier. I’d love to have your input down the road, so let me know if that’s something you’re interested in.

Hi Curt,
I am a graduate student and I have a grad project related to SLAM. I have absolutely no idea how to implement SLAM using an ER1 and other sensors. I read your post about SLAM and thought you could probably help me out. Can you suggest how I should proceed?
Ashish