Is that possible?
I mean not in the driver, but by the blender-game-player itself…
Is that possible?
Sure, you could modify the source of the player to ask the driver to enable antialiasing for you…
but other than that, no
Can you turn off anti-aliasing totally for the GE render? Your game would run faster if it had the jagged-edge look, wouldn’t it?
Anti-aliasing takes computational power and yes it will slow down the game. But if you have a real good GPU it shouldn’t be much of a problem.
But what if you’re going for that Saturn/PSX look? I want the jagged edges.
I’d like to know just HOW you activate it. As some might know, I’m working on a walkthrough game, so it’s mandatory to have clean edges…
What graphics card do you have? Because I can force my Radeon 9250 to anti-alias.
On nVidia and Windows: download the latest drivers, right-click on the desktop, and select nVidia tools or something, then look around for the performance tab (I’m not on it at the moment, so I can’t remember exactly) and look for antialiasing. Turn it to around 2x or 4x, then go into Blender and move the default cube around. If it’s antialiased, then the game will be antialiased. Open your game in Blender, hit P, and voilà.
OK guys, I work on the source for the GE, and please trust me when I say you don’t want anti-aliasing for your Blender games. My advice would be to make speed improvements in as many places as possible in your projects and just aim for stable play at a high resolution like 1280*1024. The majority of the hardware currently in people’s homes is not able to handle AA at reasonable resolutions. And yes, you still have to have a high resolution to make AA work effectively. Another thing I should point out is that OpenGL’s anti-aliasing methods are fast, yes, but they work more like a blurring algorithm than an oversampling algorithm, and they just aren’t fast enough for complex scenes.
The AA capabilities in OpenGL are directed more towards games that use OpenGL’s 2D capabilities, and low-poly demonstrations, not gameplay.
Of course there is hardware out there that you can buy that can do some kick-ass AA in OpenGL no problem, even on billion-poly scenes. But the majority don’t have that hardware right now.
I could put in AA within the next hour or two, no problem. But so could the few other guys who work on the GE. The reasons above are why nobody has done it.
If it would only take a few hours, I’d REALLY appreciate having the choice…
I’m not bothered by blurring as much as by awful jagged lines, which, along with the lack of light blooming, really make a scene look flat and computer-drawn…
I’d prefer to make a simple scene that was tidy…
Dude, I guess you didn’t get it when I said it’s not for gameplay on standard hardware. It’s even slower if you try to use it with the current GE.
It’s not too much of a problem if you’re coding standard OpenGL in a small program and have anti-aliasing turned on.
But Blender’s GE goes through a very complex, and yet very necessary, pre-frame-rendering process just to get things on the screen. Throwing AA in there would cause an extremely severe slowdown.
And is there really a point??? Seriously, I don’t know of a single 3D PC game out there that runs with anti-aliasing as a default display setting. Console systems don’t even use it, with the exception of the 360 and the PS3.
And let’s do some math, shall we???
Game running at 1280 x 1024 on a 3.2Ghz P4:
Calculate and write 1,310,720 pixels to the ColorBuffer (average 23 floating point operations per pixel)
Calculate and write 1,310,720 pixels to the DepthBuffer (average 7 floating point operations per pixel)
Calculate and write 1,310,720 pixels to the AlphaBuffer (average 12 floating point operations per pixel)
This excludes the accumulation buffer along with custom buffers, because Blender doesn’t use them.
This totals to an average of 55,050,240 floating point calculations per frame drawn.
Under usual processor conditions a game at that resolution can hit 55 - 60 fps, no problem. Another thing to point out is that at that resolution, jaggies are hardly noticeable unless you are intentionally looking for them.
Let’s say we’re using OpenGL’s anti-aliasing method, which is a post-rendering blurring algorithm. This requires an additional 15 float ops per pixel per buffer.
So lets just take (1280 * 1024) * 3buffers * 15 = 58,982,400 additional operations.
58,982,400 + 55,050,240 = 114,032,640
Is it just me, or did the anti-aliasing pass require more than what it took to just render the scene normally? And look at the total! It’s doubled!
And oh god! It doesn’t even have a chance of reaching 30 fps!
Think 20 - 25 fps.
Not to mention how hard it is on your video card.
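The arithmetic above is easy to check for yourself. Here is the same calculation as a few lines of Python (the per-pixel op counts of 23, 7, 12, and 15 are the poster’s rough estimates, not measured values):

```python
# Sanity-check the per-frame operation counts from the post above.
pixels = 1280 * 1024                  # 1,310,720 pixels per frame

base_ops = pixels * (23 + 7 + 12)     # color + depth + alpha buffers
aa_ops = pixels * 3 * 15              # blur pass: 15 ops/pixel/buffer
total = base_ops + aa_ops

print(base_ops)   # 55050240
print(aa_ops)     # 58982400
print(total)      # 114032640
```

So the blur pass alone costs more ops than the whole normal render, which is exactly the point being made.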
Of course there are other anti-aliasing methods we could try out, but unfortunately they all require that we draw the frame multiple times, and as a result the number of calculations required multiplies with every extra pass.
Now do you see why it’s just flat out not a good idea???
Hell, my own engine doesn’t even use it, for these reasons.
And how the hell does light bloom get involved in this???
( Use of the accumulation buffer for a specular bloom is possible and won’t hinder performance too much, I just haven’t thought of a good way to integrate it yet. Of course a nice bloom would require that the user’s card have support for fragment shaders, though there is the ability to use a pure vertex method instead. )
OK, just to make you happy, I figured I would do it.
For some reason it only works with wireframe objects, so I will have to look into that.
As expected, the frame rate was cut in half.
Well, now that you all know it’s possible, I guess I have to do it now, don’t I?
I would like to see other features in the Game Blender before AA. Features like multiplayer, and the ability to edit and create new objects while the game is running. Sorry, kind of taking this off subject…
Multiplayer is already possible, it just requires some clever programming. Learn Python.
Yeah, I’ve already done some socket work in Python, but what I meant was built into the logic bricks so that there is a level of uniformity, security, and speed. Being able to attach an actuator to an object to send information to a server would be great! Being able to edit objects during runtime would allow mesh data to be downloaded and hence create worlds based on server information.
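For anyone who wants to try the socket work mentioned above, here is a minimal self-contained example: a loopback echo server and a client exchanging one message. The message format (`pos 1.0 2.0 3.0`) is invented for illustration; a real game server would obviously do more than echo:

```python
import socket
import threading

def echo_once(srv):
    """Accept one connection and echo one message back."""
    conn, _ = srv.accept()
    data = conn.recv(1024)
    conn.sendall(data)
    conn.close()

# server side: binding to port 0 lets the OS pick a free port
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
t = threading.Thread(target=echo_once, args=(srv,))
t.start()

# client side: send a (made-up) position update and read the reply
cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.connect(("127.0.0.1", port))
cli.sendall(b"pos 1.0 2.0 3.0")
reply = cli.recv(1024)
cli.close()
t.join()
srv.close()
print(reply)   # the server echoed our message back
```

Hooking something like the client side up to a Python controller is roughly what built-in networking logic bricks would standardize.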
Funny you mention mesh editing in realtime, because that’s a feature I’m working on for the GE.
Why I would need AA is a simple thing: to try and use it as a render engine. Yes, yes, I know it does not raytrace… I just need things like normal mapping, displacement mapping, the good fake stuff, dynamic lighting… Ah, that’s the stuff I need. GPU stuff is going to replace CPU stuff some day soon, and it can be done now… if such and such
OK dude, sorry to say this, but you are just not being very smart at all.
Why the hell do you want to use the GE as a rendering engine when Blender comes with two different engines already, which both support oversampling up to 16 samples? In case you didn’t know… oversampling is the same thing as anti-aliasing.
OpenGL anti-aliasing isn’t that good anyway. So why do you want to use it for rendering???
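To make the oversampling point concrete: supersampling just means rendering at a higher resolution and averaging blocks of samples down to the final pixels. Here is a minimal sketch of the downsampling half in plain Python, using a toy grayscale image as nested lists (the function name and sample image are made up for illustration):

```python
def downsample_2x(img):
    """Average each 2x2 block of samples into one output pixel.

    This is the filtering step of 2x supersampling (oversampling) AA:
    render at double resolution, then average down.
    """
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] +
              img[y + 1][x] + img[y + 1][x + 1]) / 4.0
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# a hard black/white edge rendered at double resolution...
hi_res = [
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 1.0, 1.0, 1.0],
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 0.0, 1.0],
]
print(downsample_2x(hi_res))   # ...averages into a softened edge
```

Note that the high-res input holds 4x the samples of the output, which is exactly why the earlier posts argue this is too expensive for realtime but fine for an offline renderer.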
And what gives you the idea that GPUs could ever pass up/replace CPUs??? Dude, I’m a graphics programmer, and I even use GPU drivers to write my own graphics engines.
Dude, your GPU is your freakin’ video card’s processor! What the hell makes you think it can pass up a CPU??? Hello! C’mon man, be practical.
Well, the game is, Xbox 360 … hell, even Shadow of Chaos. I can name more games that show off the beauty of graphics cards. True, the output is not fully realistic, but by getting the different passes out as numbered images with AA, they can then be recomposited for quick animations… demos for products, and so on.
It’s not a great feat to understand: it’s just a faster method to get animations out. Yes, it drops down to 15, 12, hell, even 2 fps, but that is STILL faster than a bare-bones scanline render at the same screen res.
Faster methods to see, test, and use animations… glows, HDRI, ergh, so many to test, but with no AA it is neat for games only.