Real-time test

Questions and answers… A new example of realtime vs. Blender :smiley:

This was created in another app that displays particles in realtime. That mov was played and created in realtime. Blender’s equivalent can’t perform such a thing; all I get in the view are dots, tiny pixel dots… That tells me nothing about the render it will produce. I haven’t tested the game engine for it yet, but that is a whole other beast.

Are there any intrepid coders that have thought of getting Blender’s OpenGL 1.2 up to 2.0?? It’s this kind of instant feedback that is sorely missing in Blender…

The problem being that OpenGL 2.0 is not supported on all cards, let alone all platforms.

GPU acceleration would be nice, and doable, but it would force everyone to upgrade. And there are still some who run Blender on 200 MHz machines, and it still runs like a charm.

OK, fine, I get that some people are still running on super old machines. I won’t even go into the reasons why…

But what I do see is that because people are still doing this, they seem to have this powerful hold on us, so we never really get these new tech features. Why are they the ones holding back development in the GPU realm??

And why can’t there be a CPU fallback?? As in, don’t even display at all. Or add another 3D view toggle so we could have Box, Wires, Shaded, Light, Texture, then OpenGL pixel shader?? That way nothing upsets the 200 MHz crowd…

I’m sorry, I just don’t get it. I see so many small test apps from geek kids playing with code, and they can get pixel shaders to display fine in their little apps. Why does Blender lag behind on this??

Well, one thing you have to keep in mind is how useful those nice halo effects are for real work situations: not at all.

But similarly, Lightwave uses a sprite-based HyperVoxels proxy system which lets you see roughly how your scene will look before rendering.
They started things like that with realtime lens flare effects some years ago.

On one point I have to agree: if the argument “others use low-end PCs” has an impact on development, then I can only shake my head.

Unfortunately I have no idea how well OpenGL 2 is supported by the cards you get today, but those effects in Lightwave were already done with older OpenGL versions.

I am curious whether using OpenGL for such previews could be turned on and off, to ensure everybody can use Blender on their PC.

Or another point may be that no one has time to write it. A realtime renderer takes a lot of work. We’re talking shader programming. And how on earth are you going to do raytracing? Or is raytracing disabled for GPU rendering?

But hey, if you want it, code it.

I’m running this stuff on the lowest-end pixel shader card there is, a 64 MB Nvidia 5200, and it works fine. That sparks test is nothing yet… I am trying to test smoke and other things as well… What’s weird is that UV textures already show up clean in the 3D view. Just adding that,

Come on, people, Blender is being used in production environments; let’s try and give it rapid prototyping view tools and visual feedback…

We don’t need raytracing, just pixel shaders… It’s still just a visual feedback tool, not a full-on render yet… It could be, with polish. But seeing how a spotlight lights a one-poly wall while still showing the light’s shape… shaders… mapping, spec, and so forth.

The old “We’re talking shader programming” should not be a huge problem for a smart coder. I mean, it is still just another language and tool. I think that’s a cop-out… Sorry, it just confuses me when I see smaller hobbyists creating stuff just fine with it.

Sounds good as long as it is an option, and not required. But, like I said, someone has to code it. If you want it, code it.

I’d like to point out that all the GLSL stuff being implemented in the game engine is already OpenGL 2.0, so in fact Blender has already gone the not-fully-compatible route.

Conversely, I have an Nvidia 4-series card that is less than 2.5 years old and was top of the range at the time, and it doesn’t support OpenGL 2.0. So obviously, unless 2.0 effects were optional, you would cut out a lot of Blender users.

I never wanted to cut anyone out. Why do people keep thinking this will cut Blender off from small computers?? It’s just another view option. But a HUGE one at that, the kind that takes a ten-hour day of testing and testing down to a one-hour testing phase and a nine-hour creative work day instead…

I understand now. In the past there were those who thought that Blender’s internal renderer should be 100% OpenGL. I think your idea fell victim to the backlash against those previous ideas.

That being said, it shouldn’t be that hard to just add another view option.