The following is a list of public service announcements that should make things a little easier for some of you.
~~~~~ Releasing Games ~~~~~
Can you, under copyright and intellectual property (IP) law, distribute your games as purchasable goods?
Usually yes, but not always. You own your .blend files; they are your IP. However, you do not own the executable exported by the Blender application, and the end user is not bound by law NOT to redistribute that very executable.
The most common workaround is exporting an empty, dummy executable that loads .blends dynamically. The idea is that you wouldn't be selling the exe, you would be selling the game content files, and that end users would only be able to redistribute an empty application that loads a blank window.
So they can’t distribute anything?
No. But they most assuredly will try. This is why you would use something like the BPPlayer, or SolarLune's little hack that locks .blend files: they cannot be opened by Blender, nor can their content be appended from.
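The dummy-executable approach above can be sketched as a small loader script. This is a hedged sketch, not a definitive implementation: the `content` directory name is an assumption, and `bge.logic.LibLoad` (the BGE call that merges external .blend data into the running game) is only available inside the game engine itself, so the import is guarded here.

```python
# Sketch of the "dummy executable" approach: the shipped runtime contains
# only this loader; the actual game lives in external .blend files.
import glob
import os

try:
    import bge  # only importable inside the Blender Game Engine runtime
except ImportError:
    bge = None

def find_content_blends(directory):
    """Return the .blend content files in a directory, in a stable order."""
    pattern = os.path.join(directory, "*.blend")
    return sorted(glob.glob(pattern))

def load_content():
    if bge is None:
        return  # running outside the BGE; nothing to load into
    for blend in find_content_blends("content"):
        # Merge each content file's scenes into the running game.
        bge.logic.LibLoad(blend, "Scene")

load_content()
```

Attach a script like this to an Always sensor in the otherwise-empty .blend you export as your executable, and ship the real game as separate content files alongside it.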
~~~~~ Render Engines ~~~~~
Why does my game look different from the 3D viewer window?
The 3D viewport is designed as a simple, preliminary display of your work, to give a rough demonstration of what you're working on. Though both use OpenGL, the game engine makes far heavier use of GLSL:
The game engine runs full GLSL vertex and fragment shaders (fragment shaders are what Direct3D calls pixel shaders). The 3D viewport simply uses BGL and basic OpenGL functions that are designed as simple preview methods.
The background will change from the interface colour to the background colour set in the "world settings" tab of the Properties window.
Alpha textures behave differently because they aren't coded to be fully functional in the 3D viewport; they don't need to be. The game engine handles alpha textures far more elaborately and accurately.
Logic steps and physics steps start, by default, as soon as the game engine starts, i.e. as soon as you press 'P'. If anything in your logic, ANYTHING, dictates that something in your scene will change, then it will change when the logic specifies. Sometimes, if you're not careful (or even by intentional design), that change will occur in the first tick of the game engine runtime, and will not revert until, again, specified. If not specified, it will appear as if there is a discrepancy between the 3D viewport and your game. There is not; it's just your game playing out. If it seems like something you didn't want to happen, don't rule out pilot error.
Why does the Blender Internal render look different from your games? And Cycles as well.
The game engine uses OpenGL, through GLSL (I hear it can also use OpenGL directly, but I haven't confirmed this, nor witnessed it confirmed), which runs almost entirely on the GPU and the rest of the graphics card. The Blender Internal renderer and Cycles are both offline rendering engines: Blender Internal runs entirely on the CPU, and even though Cycles can make use of nVidia CUDA cores, neither is designed in any way to be used in real time; they're not efficient enough. Cycles and Blender Internal use complicated ray-tracing rendering for lighting. It involves sending rays from the light sources and/or the camera, and calculating the colour and intensity of light per pixel through the journey of those rays. These rays also bounce off surfaces, or are even scattered inside surfaces (subsurface scattering; SSS).
tl;dr: BI and Cycles use ray casting and light bouncing. OpenGL uses fragment and vertex shaders, and approximation techniques (for optimisation and practicality), for real-time, frame-by-frame rendering.
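To make the "approximation" side concrete, here is the classic example of what a real-time fragment shader computes instead of tracing rays: Lambert diffuse shading, where intensity is simply max(0, N · L). This is a pure-Python stand-in for the GLSL maths, just for illustration; a real shader would run this per fragment on the GPU.

```python
# Lambert diffuse: the cheap closed-form lighting term rasterizers use
# instead of tracing rays from lights or the camera.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def lambert_diffuse(normal, light_dir):
    """Diffuse intensity in [0, 1] from a surface normal and the direction to the light."""
    n = normalize(normal)
    l = normalize(light_dir)
    dot = sum(a * b for a, b in zip(n, l))
    return max(0.0, dot)  # clamp: surfaces facing away get no light

# A surface facing the light is fully lit; edge-on it is dark.
print(lambert_diffuse((0, 0, 1), (0, 0, 1)))  # 1.0
print(lambert_diffuse((0, 0, 1), (1, 0, 0)))  # 0.0
```

No bounces, no scattering, no shadows: one dot product per pixel. That is why it runs at 60 fps and why it looks different from BI or Cycles.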
~~~~~ Logic vs. Python ~~~~~
An age-old debate. The idea that logic bricks are easier but less flexible, and that Python is convoluted and difficult to learn, is nonsense.
I've found that logic bricks are far more convoluted and elaborate when used to make game features that could otherwise be a < 100 line Python script. But Python is most assuredly harder to learn, and harder to understand, than logic bricks.
If you're interested in getting the game done and dusted, then you could very well use logic bricks; they can be used to do just about anything. I believe I've even seen an RTS being made entirely with logic bricks, and it's magical. However, Python is far less painful, limiting, and difficult to use than logic bricks. You can even use logic bricks INSIDE Python scripts (really, they're coded in C++, but they interface with Python through a Python API, for accessibility in Python scripts; thanks to SDF Geoff for clearing that up). Not to mention that you can do far more in Python scripts than with logic bricks alone; there is only a limited set of logic bricks. For instance, you can alter and affect game engine materials in real time with a Python script (and with the addition of a little knowledge of GLSL, you can do some fancy sh*t).
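Here is what "logic bricks inside Python" looks like in practice: a Python controller brick that reads a sensor and drives an actuator through the BGE's Python API. The sensor and actuator names ("near", "motion") are assumptions for the sketch, and the `bge` import is guarded so the decision logic can be tested outside Blender.

```python
# A BGE Python controller mixing logic bricks and script: the sensor and
# actuator are bricks wired to this controller in the logic editor.
try:
    import bge  # only available inside the Blender Game Engine
except ImportError:
    bge = None

def should_move(sensor_positive):
    """Pure decision logic, kept separate so it runs (and tests) anywhere."""
    return bool(sensor_positive)

def main():
    if bge is None:
        return  # not running inside the BGE
    cont = bge.logic.getCurrentController()
    sensor = cont.sensors["near"]        # hypothetical sensor brick name
    actuator = cont.actuators["motion"]  # hypothetical actuator brick name
    if should_move(sensor.positive):
        cont.activate(actuator)
    else:
        cont.deactivate(actuator)

main()
```

Wire this to a Python controller brick (Script mode) and the bricks do the event plumbing while the script makes the decisions; that split is where most of the flexibility comes from.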
~~~~~ Sound files ~~~~~
When using sound files in games, it's best to use the smallest, most compressed files that lose the least amount of quality possible. Commonly, game makers use wave files (.WAV); these are HUGE, but hold the best, rawest quality. (.WAV is a Microsoft-originated format, though in practice it is well supported on Linux and Mac.)
What is the best sound file type?
The situation isn't one-size-fits-all.
MP3 files are small and globally compatible; there are even devices that play ONLY MP3 files. They are very small, with very good quality. HOWEVER, under certain encoders, MP3 files WILL have extra empty space added to the start: when exporting, the encoder pads the file with a brief stretch of silence. If this isn't accounted for in the development of your game, there WILL BE a delay between the event that triggers the sound and the sound itself.
Ogg Vorbis files (.OGG) have very similar size and quality to MP3s, but are slightly less compatible: they're supported on Linux, Windows, and Mac, and on some mobile devices, while other mobile devices do not recognise Ogg Vorbis at all. On the other hand, they do NOT add silence when encoding.
When it comes to deciding which file type to use for sound, MP3s are usually used for music, because when the music starts isn't as important as hearing things react in space, in real time. .WAV files are usually used for the smaller, shorter sound effects, like gunfire, collisions of small objects, short speech, and footsteps. I would personally recommend Ogg Vorbis files for EVERYTHING: .WAV files are much too big (the size of a game can very much influence someone's decision to download it at all), and .MP3 files are far too inconvenient with that God-awful silent introduction.
Something to note, though: raw formats tend to play back with less CPU cost. Though we can't confirm exactly why, it's a safe bet that it's because .WAV files are raw sound files that don't need to be decoded and decompressed, whereas MP3 files need decoding and thus eat into the logic processing time. (Unconfirmed, but the numbers support the idea that raw is better for performance, while compression is better for storage and overall file size.)
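The storage-versus-CPU tradeoff described above can be illustrated with any codec; here zlib stands in for an audio codec (an assumption for the sketch, not what audio formats actually use): the compressed copy is far smaller on disk, but getting the data back requires a decode step that raw PCM never pays.

```python
# Raw vs. compressed: smaller on disk, but an extra decode step at load time.
import zlib

raw = bytes(2048) + b"signal" * 512  # synthetic, highly compressible "audio"
packed = zlib.compress(raw)

print(len(raw), len(packed))           # storage: the compressed copy wins
assert zlib.decompress(packed) == raw  # playback: must decode first
```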
~~~~~ Texture files ~~~~~
You hear a lot about streamlining games with specific types of image files. For instance, Targa (.tga) is said to be the fastest because it compresses the best. Portable Network Graphics (.png) is said to be best for small textures with alpha channels, such as foliage, fire, smoke, and glowing effects. And JPEG (.jpg | .jpeg | .jpe) is claimed to be the most common and best compressed (no alpha channel, though).
What IS the best file type?
Much like sound files, image files are also highly dependent on the situation the textures are used in. As mentioned earlier, .png supports an alpha channel, as does .tga, and these are best used as smaller textures when alpha texturing is enabled in the material settings. They both compress very well, but no better than .jpeg.
With a colour image compression test, with no alpha channel:
The JPEG compressed the 512 x 512 image to 160 KB
The PNG compressed the 512 x 512 image to 161 KB
And the TGA compressed the 512 x 512 colour image to 855 KB
With an alpha compression test, PNG vs. TGA:
The PNG compressed the 1024 x 1024 image to 874 KB
The TGA compressed the 1024 x 1024 image to 3.56 MB
So overall, it would seem the TGA format is actually useless. However, compressing with PNG takes a lot of power and time: simply using maximum compression on a small 1024 x 1024 image took more than 5 seconds. That doesn't seem like a lot, but compared to the near-instant compression of TGA or JPG, it's roughly 30 times slower. So when compressing large terrain textures or height maps, use .jpg, and instead of white-and-alpha, use a black-and-white (greyscale) image.
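The size gap in the numbers above comes down to PNG's deflate (zlib) step collapsing the flat regions typical of game textures, while an uncompressed TGA stores every pixel verbatim. A rough sketch of that effect, using a synthetic flat-coloured image as a stand-in for a real texture (the exact ratio depends entirely on the image content):

```python
# Why PNG-style deflate beats raw storage on flat image regions.
import zlib

width = height = 512
flat_row = b"\x80\x40\x20" * width   # one row of identical RGB pixels
pixels = flat_row * height           # a mostly-flat 512 x 512 image

raw_size = len(pixels)                         # what an uncompressed TGA pays
deflated_size = len(zlib.compress(pixels, 9))  # roughly what PNG's deflate step pays

print(raw_size, deflated_size)
```

On noisy, detailed textures the gap shrinks dramatically, which is why JPEG's lossy approach wins for things like terrain.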
Alternatively: (As pointed out by BluePrint Random)
DirectDraw Surface (.dds) texture images are optimised for staying compressed in VRAM, to save storage and improve performance. They're accessible in GLSL through the ARB extensions, available since OpenGL 1.3.
I've not used them, so I can't really comment further, but I would highly recommend acquiring a GIMP plugin to export to the .dds texture format (or Photoshop, if that's your preferred image manipulation application).
If anyone would like to correct anything I claimed or remarked on, you absolutely should; it's better to be right than proud.