Generating meshes from given data for Blender

I have a set of vertices and a set of faces (an index array containing vertex indices). Is it possible to somehow generate a Blender mesh out of those? I need this because of vertex order: the mesh has to use the vertices in exactly the order I give them.

In Blender … sure. The Blender API allows you to create objects, meshes and materials with all the details. I suggest asking in the Python forum for the details.
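For reference, roughly how that looks on the bpy side (just a sketch against the 2.7x API; the names and example data are placeholders):

```python
import bpy

# Sketch: build a mesh from existing vertex/face lists (Blender 2.7x API).
verts = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]   # (x, y, z) per vertex
faces = [(0, 1, 2, 3)]                                  # vertex indices per face

mesh = bpy.data.meshes.new("GeneratedMesh")
mesh.from_pydata(verts, [], faces)   # vertex order is preserved as given
mesh.update()

obj = bpy.data.objects.new("GeneratedObject", mesh)
bpy.context.scene.objects.link(obj)  # 2.7x; newer versions link via collections
```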

In the BGE … no - no way.

No need to do it in real time, I’ll probably ask there. Thanks for the info! :slight_smile:

You can use bgl or pyOpenGL to create an object.
Look at my OpenGL or tessellation examples.
But this object will only exist as long as the BGE is running.
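A stripped-down sketch of the idea (not the full example files): register a post_draw callback on the scene and draw with bgl every frame.

```python
# Sketch: draw a single triangle every frame via a scene post_draw
# callback (BGE 2.7x, immediate mode).
import bgl
from bge import logic

def draw_triangle():
    bgl.glColor3f(1.0, 0.5, 0.0)
    bgl.glBegin(bgl.GL_TRIANGLES)
    bgl.glVertex3f(0.0, 0.0, 0.0)
    bgl.glVertex3f(1.0, 0.0, 0.0)
    bgl.glVertex3f(0.0, 1.0, 0.0)
    bgl.glEnd()

scene = logic.getCurrentScene()
if draw_triangle not in scene.post_draw:
    scene.post_draw.append(draw_triangle)
```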

HG1, so I can generate an object on the fly? I only need the shaded visual object. I don’t need it to be an actual object. Can I somehow do that?

How efficient is it?

So I can generate an object on the fly?

Yes.

I only need the shaded visual object.

If you draw your own object with bgl or pyOpenGL, you have to write your own GLSL shader for the lighting (e.g. Blender’s standard Lambert + CookTorr). You can find it in my OpenGL lighting examples.
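As a rough illustration, a Lambert-only shader pair kept as Python strings (the lighting examples cover the full Lambert + CookTorr setup; this sketch uses the legacy GLSL built-ins and light 0 only):

```python
# Sketch: minimal Lambert diffuse shader pair as Python strings, to be
# compiled/linked with bgl's shader functions as in the examples.
VERTEX_SHADER = """
varying vec3 normal;

void main()
{
    normal = normalize(gl_NormalMatrix * gl_Normal);
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}
"""

FRAGMENT_SHADER = """
varying vec3 normal;

void main()
{
    // Lambert diffuse from light 0 (a directional light is assumed here)
    vec3 L = normalize(gl_LightSource[0].position.xyz);
    float lambert = max(dot(normalize(normal), L), 0.0);
    gl_FragColor = vec4(vec3(lambert), 1.0);
}
"""
```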

I don’t need it to be an actual object.

This sentence doesn’t make much sense to me. You want to make an object, but you don’t need an actual object?
You need at least one triangle that can be shaded.
So what do you want to do?

How efficient is it?

It is as efficient as OpenGL is.

I think he means that he doesn’t need the physics of it, just the rendering of an object…

Hi @HG1: I tried to make a simple file: http://www.pasteall.org/blend/42308 with a custom object and a custom shader in OpenGL/GLSL, but it does weird things at exit. Do you know how to clean up the scene properly at exit? Thanks!

Yes, sort of, and no, sort of. I mean that the object shouldn’t be a KX_BlenderObject type (but it can also be one, it’s just not important). Basically what I need is to render an object from a set of vertices and faces (vertex arrays). Oh, and I will also have to make UVs.

Actually I wouldn’t mind the opposite workflow: generating a list of vertices (actually a list of vertex position vectors) in the proper order, a list of faces (using the vertex id in the list as the index value), a list of UVs and a list of edges (again, using vertex ids in the list as references). Maybe this is possible? Because this would be even better. And if it sorts the vertices in the same order as obj.meshes[0].getVertex(0, id) would return them, then it’s going to be all I need.
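Roughly, something like this is what I have in mind, reading the data back through the BGE mesh API (just a sketch; it assumes a single material slot at index 0):

```python
# Sketch: pull vertex positions, UVs and face index lists out of a
# KX_GameObject's first mesh (material slot 0 assumed).
from bge import logic

own = logic.getCurrentController().owner
mesh = own.meshes[0]

verts = []   # position vectors, indexed by vertex id
uvs = []     # matching UV per vertex id
for vert_id in range(mesh.getVertexArrayLength(0)):
    vertex = mesh.getVertex(0, vert_id)
    verts.append(vertex.XYZ.copy())
    uvs.append(vertex.UV.copy())

faces = []   # each face is a tuple of vertex ids into 'verts'
for poly_id in range(mesh.numPolygons):
    poly = mesh.getPolygon(poly_id)
    faces.append(tuple(poly.getVertexIndex(i) for i in range(poly.getNumVertex())))
```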

@youle
Oh, there are some faults.

  1. Getting the program ID (lines 7-9) is not necessary. Those lines are only needed if you want to grab the program number from an existing object, to overwrite or extend it. If you want to render directly in the viewport, you only have to generate your own program number like you have done in line 47.
  2. shaderStuff() should be called only once, because you only need to compile and link the shader once. The shader will then remain on the graphics card until you delete the program.
  3. You only have to call glUseProgram(programNumber) before you draw your object, and at the end call glUseProgram(0). glUseProgram(0) tells the graphics card not to use any shader; otherwise all following objects will use the same shader for drawing (your main problem). In your case, calling glUseProgram(programNumber), shaderStuff() and glUseProgram(0) at quit will also work, because you only use one program. A sketch of this pattern follows below.
    Also, you use immediate mode to draw (slow). For modern OpenGL look at my OpenGL examples.
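Something like this (shaderStuff() is your own compile/link routine, assumed here to return the program id):

```python
# Sketch of the pattern above (BGE 2.7x): compile once, bind per draw.
import bgl
from bge import logic

def init():
    # compile and link the shader only once, keep the id for later frames
    logic.globalDict['program'] = shaderStuff()  # defined elsewhere in your script

def draw():
    bgl.glUseProgram(logic.globalDict['program'])
    # ... draw your custom geometry here ...
    bgl.glUseProgram(0)  # stop using the shader so following objects are unaffected
```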

@adriansnetlis

Yes, sort of, and no, sort of. I mean that the object shouldn’t be a KX_BlenderObject type (but it can also be one, it’s just not important). Basically what I need is to render an object from a set of vertices and faces (vertex arrays). Oh, and I will also have to make UVs.

Yes, that is no problem. Basically there is no KX_BlenderObject if you draw with OpenGL, but you can use one as an object reference, or an invisible cone for the physics.
I think I have already done everything you need in my OpenGL examples.

@HG1: Thanks very much for the tips and fixes :slight_smile: . glUseProgram(0), OK! Yeah, the immediate mode is because I find it simple, but I have to lose this bad habit. Thanks for the other tips!

Finally I made 2 versions: http://www.pasteall.org/blend/42311 http://www.pasteall.org/blend/42312 but unfortunately I don’t think we can use vertex arrays (or VBOs) with bgl, so it is still immediate mode… However, the second version uses something in the bgl API that I didn’t know about… Correction: we can use vertex arrays.

OK! Thanks for the info. I’ll look at your BA posts. And, as far as I understand, it’s not slower than a KX_GameObject, right? :slight_smile:

Also, I take it that it needs a custom shader? If so, I’ve got something very good for this :)

Note: the vertices will move relative to each other. Does this mean that I need to recalculate the normals each frame? If so, how do I do it (note that I have a list of faces, a list of vertices and a list of edges; the edges are not used in rendering, only in my custom physics)?

OK! Thanks for the info. I’ll look at your BA posts. And, as far as I understand, it’s not slower than a KX_GameObject, right? :slight_smile:

I haven’t done a direct performance comparison, but my terrain tessellation (not released) was running pretty fast.

Also, I take it that it needs a custom shader?

Basically yes. You could also use the old fixed-function pipeline lighting, but I would recommend using a shader.

Note: the vertices will move relative to each other. Does this mean that I need to recalculate the normals each frame? If so, how do I do it (note that I have a list of faces, a list of vertices and a list of edges; the edges are not used in rendering, only in my custom physics)?

Actually, I can’t imagine what you want to do, but it sounds like (terrain? water?) you have to recalculate them. I would suggest recalculating the normals in a geometry shader. I think I have done it in one of my examples (the tessellation one, I think).
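If you recalculate on the CPU instead, a rough sketch from your vertex and face lists (assuming the positions are mathutils Vectors and the faces are triangles; pure Python will be slower than a geometry shader):

```python
# Sketch: recompute smooth vertex normals from a list of positions and a
# list of triangle faces (tuples of vertex ids).
from mathutils import Vector

def recalc_normals(verts, faces):
    normals = [Vector((0.0, 0.0, 0.0)) for _ in verts]
    for i0, i1, i2 in faces:
        # face normal from the cross product of two edge vectors
        face_normal = (verts[i1] - verts[i0]).cross(verts[i2] - verts[i0])
        normals[i0] += face_normal
        normals[i1] += face_normal
        normals[i2] += face_normal
    for n in normals:
        if n.length > 0.0:
            n.normalize()
    return normals
```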

It will work faster in a shader, right? I’d actually love to do it in a shader so that I can keep my main code cleaner :)

For shading I’ll probably use GGX for the specular. I’ll also add retro-reflection to the diffuse, probably some clearcoating and, of course, Fresnel reflections using Schlick’s approximation :)
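For reference, Schlick’s approximation is just F = F0 + (1 - F0)(1 - cosθ)^5; as a small GLSL helper kept in a Python string (a sketch, the names are mine):

```python
# Sketch: Schlick's Fresnel approximation as a GLSL helper.
# f0 = reflectance at normal incidence, cos_theta = dot(N, V) or dot(H, V).
SCHLICK_FRESNEL = """
float fresnel_schlick(float cos_theta, float f0)
{
    return f0 + (1.0 - f0) * pow(1.0 - cos_theta, 5.0);
}
"""
```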

Can you loop through the vertices of the added mesh, feed them to bpy over a network socket, then import the resulting .blend with LibLoad?

This could enable modeling inside the bge…

Now, how can we add objects that have physics but no object data (particles)?

Also for physics: dissolving a triangle mesh into many convex hulls would require the ability to add the objects (the hulls) without making them game objects, so they don’t hurt engine performance (by bloating scene.objects)…

Don’t worry about the rest; I’m making good progress on it. I just need to have something fancy rendered on screen rather than just a set of blank lines. :smiley:

About performance: in these test files: https://drive.google.com/file/d/0B3GouQIyoCmrX1VpeUJyT2E0c0E/view?usp=sharing
https://drive.google.com/file/d/0B3GouQIyoCmrVDlFSEFzQXpKdUE/view?usp=sharing I compared the performance of an object made with bgl (Blender OpenGL) against a normal object made in Blender. The object is a cube subdivided 7 times.

Of course my code could be improved, but I think objects made with bgl and a custom shader (just a simplified diffuse without specular in this case), and without physics, can’t be rendered as fast as objects made in Blender with physics and a more complex shader (with specular, etc.)…

I don’t know why there is such a performance difference (apart from Python vs. C++ for the Python part), but it is there.