Use BGE to make a robot?

I don't know if this has been thought of before, but I was daydreaming at work and had the idea of building a robot virtually using the BGE: use Bullet to simulate the physics, and the VideoTexture module for video input so it can see. A lot of the assets could then be sent to a 3D printer to bring it to life, so to speak. Has anything like this been proposed? I think it would make an interesting community project.

I… don't think Blender is advanced enough for that.
After all, it's a game engine; it makes calculations on virtual objects with given parameters/properties in a virtual space.
And besides the ability to import motion capture data, there is nothing there to actually translate the real world for the computer…
I'm not saying it's impossible… but it's going to be impossibly difficult :confused:
Sure is interesting though…

Well, I was thinking of using a standard PC as the brain and heart of the bot, so its power would really only be limited by that. I read somewhere about Blender being used for robotics some time ago, but I'm not having much luck with Google on that. I would probably start with a battle-bot style bot to begin with. The VideoTexture module can take webcam input to give it eyes. As far as the BGE is concerned, if all its resources are focused on one rig for the most part, I think it's achievable. But I could be missing something key.
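For the webcam part, the bge.texture module can stream a capture device onto a material. A minimal sketch, assuming a Python controller on the object that should "see"; the material name 'MAScreen' and the device path are invented for this example:

```python
# Stream a webcam onto a material via bge.texture; run every logic tic.
from bge import texture

def init_eye(cont):
    obj = cont.owner
    if 'eye' not in obj:
        mat_id = texture.materialID(obj, 'MAScreen')  # material to replace
        tex = texture.Texture(obj, mat_id)
        # Capture device 0; the first argument is platform-dependent.
        tex.source = texture.VideoFFmpeg('/dev/video0', 0)
        tex.source.play()
        obj['eye'] = tex         # keep the texture alive between frames
    obj['eye'].refresh(True)     # pull a new frame every logic tic
```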

http://www.wikihow.com/Run-Your-Desktop-off-DC-Power

Next you need RAYS = a laser range finder.

Scanning with rays while rotating slowly can map a room's "effective floor".

Next you need to connect these "ray cast points" together to make a mesh,

and making a new mesh in the BGE appears to be the biggest issue.
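A single sweeping ray is doable with KX_GameObject.rayCast(). A minimal sketch, assuming a Python controller wired to an Always sensor (true pulse) on the scanner object, with made-up property names:

```python
import math

def scan(cont):
    own = cont.owner
    angle = own.get('angle', 0.0)

    # Cast a ray 10 units out along the current sweep angle.
    origin = own.worldPosition
    target = origin.copy()
    target.x += 10.0 * math.cos(angle)
    target.y += 10.0 * math.sin(angle)

    hit_obj, hit_point, hit_normal = own.rayCast(target, origin, 10.0)
    if hit_obj is not None:
        # Collect the "ray cast points" the mesh would later be built from.
        if 'points' not in own:
            own['points'] = []
        own['points'].append(tuple(hit_point))

    # Advance the sweep a couple of degrees per logic tic.
    own['angle'] = (angle + math.radians(2.0)) % (2.0 * math.pi)
```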

What about using the video input to displace the verts of a vertex-parented mesh, and using rays to work out the distance? Maybe add another camera to make it like human eyes to calculate the distance.

Maybe spawn triangles and move their vertices to the laser points? And then remove doubles in the game engine somehow and join it into one mesh?

OpenCV may prove very helpful here,

as well as any open-source laser mapping in Python.
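For a taste, a quick OpenCV sketch that grabs one webcam frame and runs a basic edge detector (the camera index and thresholds are arbitrary):

```python
import cv2

cap = cv2.VideoCapture(0)   # open the first webcam
ok, frame = cap.read()      # grab a single frame
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 100, 200)   # edge map as a starting point for mapping
    cv2.imwrite('edges.png', edges)
```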

Hmmm, what kinds of output can the BGE do?

OK, I had this same idea, and did a couple of tests on a similar thing.

The idea was to use the BGE as a sandbox environment for a robot. The output and the input are not the hard part. The easiest way is simply repeatedly saving and loading a text file. The harder way is sharing memory between programs, but Python can do it.
I used a fairly simple syntax to communicate, passing intended speed and intended direction from the external program into Blender. And as output, there were the integer values of the range-finders, and every frame rendered by the BGE (using the screenshot save function).
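For reference, a minimal sketch of that text-file exchange; the file name and command format here are invented for illustration, not the exact ones used:

```python
# BGE side, run every frame by a Python controller on the robot object.
# The external program simply writes e.g. "0.5 -0.2" (speed, turn) into
# commands.txt between frames.
from bge import render

def apply_commands(cont):
    own = cont.owner
    try:
        with open('commands.txt') as f:
            speed, turn = (float(v) for v in f.read().split())
    except (IOError, ValueError):
        return  # file missing, mid-write or malformed; skip this frame
    own.applyMovement((0.0, speed, 0.0), True)   # drive along local Y
    own.applyRotation((0.0, 0.0, turn), True)    # turn about local Z
    render.makeScreenshot('//frame#.png')        # '#' becomes the frame number
```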

That was all easy enough to do, and I had it done within a few days.

The hard part (that I never finished) was the software to control the virtual robot. Image processing is a little beyond me at the moment.

Had I done that, expanding it to drive a real robot would not have been hard. I used a PICAXE microcontroller that responded to serial commands sent by another Python script. The intention was to replace the video input from the BGE with the video from a camera, and redirect the output from the BGE into the script sending serial commands. Because Python is cross-platform, this could then be transferred to a Raspberry Pi as the controller, and the whole caboodle shunted into a frame.
I actually have a picture of the intended setup:

This was the test setup. Quite literally an old laptop mounted on wheels. It worked OK, though it was a little heavy for the motors driving it, and suffered from noise/interference over the serial link.


And here is the robot the Raspberry Pi was going to be seated in when the coding behind it was done. There's some more on this robot over here.

Talking about it here makes me kind of wish I had continued. Maybe in the holidays coming up I'll revive it.

That's awesome :slight_smile: That's just the kind of thing I was thinking of. I was not even thinking of making a real bot till way down the road lol. Just make a virtual robot that is as close to real as it can get, to build prototypes and test ideas before making the real thing.

If you’re wanting to experiment with mechanical design, get familiar with rigid body joints.
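For what it's worth, joints can also be created at runtime through bge.constraints. A minimal sketch, assuming rigid-body objects named 'Chassis' and 'Wheel' exist in the scene (names invented; check the docs for the exact meaning of the axis arguments):

```python
from bge import logic, constraints

def make_hinge(cont):
    scene = logic.getCurrentScene()
    chassis = scene.objects['Chassis']
    wheel = scene.objects['Wheel']
    # Constraint type 2 is a hinge (LINEHINGE_CONSTRAINT). The pivot is
    # given relative to the first object; the last three values orient
    # the hinge axis, in degrees.
    constraints.createConstraint(
        wheel.getPhysicsId(), chassis.getPhysicsId(),
        2,               # LINEHINGE_CONSTRAINT
        0.0, 0.0, 0.0,   # pivot point
        0.0, 90.0, 0.0)  # axis orientation
```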

What software did you use to control your bot? And yeah, rigid body joints are on my to-do list lol :slight_smile:

Ahem :smiley:

It would be nifty if you could use TorqueWalk :smiley:

Blender walking…

…is it blender yet?

I don't know, it just left :confused:

When the laptop was mounted on the bot, it was a Python script. This output speeds for each wheel over a serial link using a simple syntax: lt50 means set the left wheels to 50% (stationary), rt00 would set the right wheels to 0% (full backwards). There was also 'stop', which removed power from the servos driving the wheels. Not necessary on such a small vehicle, but still fun to implement.
These serial commands were received by a Picaxe20x2 microcontroller (~$5), where a script turned these numbers into standard servo pulses.
All code was handwritten by me, except for Python 3 itself. But as I said, I never got very far; it was never autonomous, but it could follow a preset sequence of movements.
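For anyone wanting to try this, the PC side of such a protocol is a few lines with pySerial. The port name and baud rate here are assumptions, and the commands follow the syntax described above (0 = full backwards, 50 = stop, 100 = full forwards):

```python
import time
import serial

ser = serial.Serial('/dev/ttyUSB0', 9600, timeout=1)

def set_wheels(left, right):
    """Send left/right wheel speeds as percentages (0-99)."""
    ser.write('lt{:02d}rt{:02d}'.format(left, right).encode())

set_wheels(75, 75)   # half speed forwards
time.sleep(2.0)
ser.write(b'stop')   # remove power from the servos
ser.close()
```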

When the laptop was not mounted on the bot, it was driven by a standard remote control, using the Picaxe20x2 to turn forward/left/right commands into the left/right wheel speeds. This was useful for testing the mechanical design of the bot.

This example clearly shows that it is better to have the "control" separated from the "view" (the presentation), because the robot does not need a game engine to run. A strong dependency on it is therefore something you really want to avoid.

But you can use the game engine to:
A) act as view (showing the robot’s states)
B) act as tester (= "virtual" environment and "virtual" hardware) to test the control (see the sketch below)
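As a minimal sketch of that separation (all names invented for illustration), the control logic imports nothing from the BGE, so the same class can drive the virtual robot in the tester and the real robot over the serial link:

```python
class Controller:
    """Decides wheel speeds from sensor readings; knows nothing about the BGE."""

    def update(self, distances):
        # Trivial obstacle avoidance: turn away from the nearer obstacle.
        if min(distances) < 0.5:
            return (30, 70) if distances[0] < distances[-1] else (70, 30)
        return (75, 75)  # nothing close: drive straight ahead

# The BGE "view"/tester feeds it virtual range-finder values each frame,
# while the real robot feeds it real sensor readings and sends the result
# over the serial link. The controller itself never changes.
```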

Btw. a robot does not necessarily need image processing. A robot needs sensors of some kind to "sense" its environment (e.g. ultrasound sensors as distance sensors). The special thing about a robot is that it should be able to deal with unexpected input.

@sdfgeoff: congratulations on your driving bot :). I gave up on my Lego robot some years ago. Nevertheless, it is quite an interesting topic.

So you made your own circuit board using the PICAXE chip and didn't use anything like an Arduino? I was thinking of using the functionality of the Arduino as a model for the output. In other words, design it to work with the Arduino. :slight_smile:

check this out :smiley:

http://docs.opencv.org/modules/gpu/doc/introduction.html

Very nice, that's definitely a step in the right direction :). Sadly I have an ATI card though :frowning: so I can't really use it.

Monster has it right: the control was separated from the input.

The Arduino would have been a better choice, because it has native USB support, so you can feed data more directly without having to go through the somewhat less reliable serial channel. I just used the PICAXE because it is what I had.

I suggest that if we want to discuss the hardware side of things, we move the discussion elsewhere.
But the BGE does have a place, as it provides a nice way to test the mechanical design, as well as to test visual processing in a simpler environment.