So I've noticed graphics cards have been getting powerful, and I've been using Blender for 5 years now. Most of the techniques I've seen in Blender's rendering system have been used in other game engines. Now, I may not be a hardware professional or skilled at programming (as you can see, I can't even spell without a spellcheck), but what I do know is that 3D games take a lot of math, and the more math there is to do, the more computing power it takes. So, since we have good hardware and skilled people to program this app, why haven't we merged the rendering system with the game engine? I would love to talk about this more, since I'm intrigued by the idea of making Blender better.
I think that the main reason development SEEMS slow is that many of the developers are volunteers (for the game engine, mainly).
The Candy branch is sort of a preview of things to come for the Blender game engine, and there is some amazing progress there (SSAO, etc.). Overall, I think Blender's graphics are weak because the user is “weak”. While the game engine may not have the raw power of many commercial game engines (CryEngine, UDK), it is open source and vastly powerful when compared to other open-source engines. And then there is the fact that Blender has modeling and animation built into the same program, which really speeds up development.
To get games with better graphics, it may be better to take the graphics abilities the game engine already has and expose them in the GUI,
e.g. bloom filters and lens flares added to the “2D Filter” actuator.
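For what it's worth, a filter like bloom is conceptually simple. In the BGE a custom one would be a GLSL fragment shader wired to the 2D Filter actuator; the sketch below shows just the underlying idea in plain Python on a tiny grayscale "image" (concept only, not BGE code):

```python
# Bloom in three steps: extract pixels above a brightness threshold,
# blur them, and add the blurred highlights back onto the image.

def bloom(image, threshold=0.8, strength=0.5):
    h, w = len(image), len(image[0])
    # 1. Bright pass: keep only pixels above the threshold.
    bright = [[p if p > threshold else 0.0 for p in row] for row in image]
    # 2. Blur the bright pass with a 3x3 box filter.
    blurred = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += bright[ny][nx]
                        count += 1
            blurred[y][x] = total / count
    # 3. Add the highlights back, clamped to [0, 1].
    return [[min(1.0, image[y][x] + strength * blurred[y][x])
             for x in range(w)] for y in range(h)]

# One bright pixel surrounded by dim ones: the glow "bleeds" outward.
image = [[0.2, 0.2, 0.2],
         [0.2, 1.0, 0.2],
         [0.2, 0.2, 0.2]]
result = bloom(image)
```

A real 2D filter does the same math per fragment on the GPU, which is why these effects are cheap enough for realtime.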
I'm ranting now, so I must stop before I'm up all night.
I actually found in the release notes for 2.64 that they added a shadow-casting mechanic. I'm still trying to find it, though, because I can only get objects to receive shadows from the light; I can't get objects to cast shadows onto other objects.
You have to enable GLSL for your shading (if you have a GLSL-compatible graphics card). Then, use a spot or sun lamp with variance shadow selected as the shadow type (in the lamp settings).
Oh ok, I forgot about that feature. Quick question though. What does GLSL stand for?
OK, I tried what you said, but the results I got were weird.
Here's how I got there:
started a new project
made a plane under the default cube
activated the game engine and switched to GLSL (note: all checkboxes are checked)
switched the lamp from Point to Sun
switched the buffer type to Variance
the plane and cube share the same material
I have an NVIDIA GT 400- or 300-series card.
The result I got is that the cube is invisible and the plane is full-blown white. (OK, found out the cube is just pitch white like the plane.)
Sorry, this is turning into a help thread rather than a discussion thread. I would still love to talk about development of particles in the game engine.
It is in Textured mode, right? Perhaps you should share the blend file, as it sounds like you’ve done everything correctly.
GLSL stands for OpenGL Shading Language.
I don't think the rendering engine can be merged with the game engine rendering system (or vice versa). Someone more development-savvy could explain why, but there are major differences between rendering an image with Blender and updating and drawing a game screen with the BGE. One noticeable difference is that the BGE (and OpenGL) runs on the graphics card, while Blender has, until recently with Cycles, been rendering scenes entirely on the CPU. That means the rendering system most likely isn't made to take advantage of the graphics card (and so isn't optimized for it, or for drawing via OpenGL). Cycles, while using the graphics card, if I recall correctly, also doesn't use OpenGL to draw things, but rather does so via raytracing. I'm totally not sure about this, though, so I'll shut up about it until someone more knowledgeable comes along.
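To make that cost difference concrete, here is the basic operation a raytracer repeats for every single pixel sample, sketched in plain Python. This is a conceptual illustration only, not how Blender or Cycles is actually written:

```python
import math

# A raytracer's inner loop: intersect a ray with scene geometry.
# Run this pixels x samples x bounces times per frame and you see why
# offline renderers take seconds or minutes where a rasterizer takes
# milliseconds.

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance to the nearest hit, or None if the ray misses."""
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None          # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t >= 0.0 else None

# A camera at the origin looking down -Z at a unit sphere 5 units away:
hit = ray_hits_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)   # hits at t=4
miss = ray_hits_sphere((0, 0, 0), (0, 1, 0), (0, 0, -5), 1.0)   # looks up, misses
```

A rasterizer never solves this per-pixel geometry query; it projects triangles onto the screen and lets fixed-function GPU hardware fill them, which is the core reason the two pipelines don't just merge.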
Where do I go to send the blend file?
Now I know the difference, but the main thing I want to discuss is implementing advanced stuff like particle simulation, such as smoke and water physics. Many games to date have been able to pull this off, but it takes a lot of CPU power. That's why I started this thread in the first place.
For the blend file, you should be able to upload it here by going into ‘Advanced’ edit mode and attaching the blend file. Alternatively, box.net / MediaFire would work.
For the particles, I couldn't see doing water or smoke particles very accurately without a custom-made engine (like Hydrophobia had). If you wanted to just fake it, there should be ways to do it currently, as well as existing resources that could be adapted for the BGE.
You're correct. Non-realtime renderers use heavy calculations that can't be used for realtime rendering (like raytracing algorithms, for example, or path tracing in the case of Cycles).
Realtime renderers use dedicated hardware suited to the specific application of realtime rendering (like GPUs, including integrated GPUs). It depends on the algorithm, but it's very rare for a realtime algorithm to rely mostly on the CPU; most realtime algorithms exploit the GPU hardware.
Now, we are starting to see non-realtime renderers use general-purpose GPU computing, but this has nothing to do with rendering in realtime at 30-60 FPS. Cycles, for example, uses the GPU to speed up algorithms that are still pretty non-realtime in terms of computation.
Most of the time, I mean 90%+ of the time, you can't mix and match algorithms. Maybe you can share concepts, but not algorithms; concepts like voxels, to name one.
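Some rough back-of-envelope numbers make the point about GPU acceleration not equaling realtime. Every figure below is an illustrative assumption, not a benchmark of Cycles or any real GPU:

```python
# Why "uses the GPU" does not mean "realtime": count the rays a
# path-traced frame needs and compare against a 60 FPS frame budget.

width, height = 1920, 1080
samples_per_pixel = 500        # assumed count for a clean path-traced frame
bounces = 4                    # assumed rays per sample (camera ray + bounces)
rays_per_frame = width * height * samples_per_pixel * bounces

rays_per_second = 500_000_000  # assume the GPU traces 500M rays per second
seconds_per_frame = rays_per_frame / rays_per_second

frame_budget = 1.0 / 60.0      # what a 60 FPS game gets per frame
print(f"{rays_per_frame:,} rays -> {seconds_per_frame:.1f} s per frame")
print(f"over the 60 FPS budget by {seconds_per_frame / frame_budget:.0f}x")
```

Even with generous assumptions the frame comes out hundreds of times over budget, which is why realtime engines drop to one ray-free rasterization pass (or a handful of samples) instead of sharing the offline algorithm.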