new mat. nodes...what the heck ?

There are some by default, but 3ds Max has made the greatest use of them so far; they have normal mapping while in edit mode…

AFAIK video cards do not raytrace, nor are their functions particularly useful for raytracing or GI. :frowning:

As for using realtime shaders - you know, it’s cool and spiffy and even useful (a rare combination), but I’m against using them in Blender (yet). Why? Because Blender’s stability and frugal system requirements are important strengths, and for me 50% of the reasons I love Blender. With RT shaders we would most probably lose one or both to a quite serious extent.

EDIT: hit return before completing post.

Someday, when Shader Model 3.0 is everywhere, it would be great. But for now you’d need to either dump support for all but the latest cards, or write countless lines of difficult-to-test, hardware-specific code.

Ok, I don’t get your point. Can you rephrase that?

I don’t remember the website, but I read about raytracing calculations done by the GPU and not the CPU. I don’t care that much about the pixel shaders, because at the same time I don’t know how flexible or useful they actually are.
I mean, they are made for games, not film. I don’t know if it is possible to make a spin-off from that, but if that made sense, Maya or others would have jumped on that train already.

Sure about that? They do tend to lag behind on stuff sometimes; case in point: soft modeling and whatnot…

The idea is that these nodes could call up something to give cleaner previews in the 3D view - not replace anything, just show us what something will look like on the full mesh. And have pixel-shaded lighting to light the scene faster; there is some vertex lighting already, it just needs to be amped up to the pixel level…

Just hoping, is all… That, and that the nodes start to spit out text that can be hand-edited for even finer, faster control of the shader.

Which one? About raytracing, or about GPU usage in general?

I’m really not sure. Theoretically, the main pipeline of a video card is useful for scanline rendering only. Shaders and things like occlusion queries can be useful for a raytracer, but I’m not a good enough graphics coder to say exactly how useful.

There are some demos using the GPU for raytracing, but demos are very different from real apps. There was also work on a raytracing GPU (I can’t remember the name or a link)…

BTW, here’s a good article series on realtime raytracing: http://www.devmaster.net/articles/raytracing_series/part1.php (haven’t read it all, but the beginning seems good).
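For anyone wondering what that article series actually covers: the core of any raytracer is just intersection tests between rays and geometry, repeated millions of times. A minimal ray-sphere intersection in Python (a toy sketch for illustration, not code from the article or from any GPU demo):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None.
    The ray direction is assumed to be normalized."""
    # Vector from the sphere center to the ray origin
    oc = [o - c for o, c in zip(origin, center)]
    # Coefficients of the quadratic |origin + t*direction - center|^2 = r^2
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearest of the two roots
    return t if t > 0.0 else None

# A ray from the origin along +Z toward a unit sphere at z = 5:
hit = ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)  # → 4.0
```

The whole “can the GPU raytrace?” debate above is essentially about whether this kind of branchy, recursive test maps well onto graphics hardware that was designed for the fixed scanline pipeline.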

hi Trident

Well, I am not a coder at all :wink:
But I think I understood your point. I just thought that GPUs support raytracing functions today, but I think I actually mixed that up with the specs I read for the new PlayStation.

youngbatcat,

I have never heard of “point soft modeling”; maybe I know it under a different term. What is it? I browsed Google but didn’t find anything useful.
Maya updates its software pretty often, considering what Maya stands for. I guess they focus more on what studios need and not on what people might love.
Work or play…

I wouldn’t say that the nodes will give you a better preview of the texture.
Node- or network-based shader systems are a bit more flexible without requiring coding experience from the user, like me. Well, I coded shaders for 3Delight while I used Pixels 3D Studio at one time, but it was a pain for me.

I also think that with nodes you are by far more able to create combinations you couldn’t with fixed material systems. I find it funny that C4D, which has been in the business for so long, still has this ugly shading approach, while Blender now comes with nodes.

The one thing I would love to see Blender be able to do is a better texture preview of procedural shaders in the 3D view. That is something which makes Maya very enjoyable, next to its interactive render option.

Gabio updated his Linux/Windows builds:

http://www.blender.org/forum/viewtopic.php?t=7706

this is getting complex :stuck_out_tongue:

I’ve made a quick test with the CVS version and I’m quite amazed :) Now we need Python textures (best if they are similar to RenderMan shaders).
Here is my screenshot:
http://grzybu.com/images/3d/node_editor_20060104.jpg
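Since no such API exists yet, here is only a guess at what a “Python texture” in the RenderMan spirit might look like: a plain function from a shading point to a color, which a texture node could call per sample. Every name here is hypothetical:

```python
import math

def wood_texture(x, y, z, rings=10.0):
    """Hypothetical procedural texture: concentric rings around the z axis,
    returning an (r, g, b) tuple in 0..1 - roughly what a RenderMan-style
    Python texture callback might compute for each shaded point."""
    d = math.sqrt(x * x + y * y)            # distance from the ring axis
    t = 0.5 + 0.5 * math.sin(d * rings)     # ring intensity in 0..1
    # Blend between a dark and a light wood color
    dark, light = (0.35, 0.2, 0.05), (0.8, 0.6, 0.3)
    return tuple(a + (b - a) * t for a, b in zip(dark, light))
```

The appeal of the RenderMan model is exactly this shape: the shader is just a function of position (plus normals, light info, etc.), so it would slot naturally into a node graph as one more node with inputs and outputs.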

New OS X build,
via http://www.graphicall.org/builds/builds/showbuild.php?action=show&id=68
Thank you, graphicall.org!

:smiley:

added new bugs and shading of noodles

This is awesome, but I have a hard time figuring out how exactly the geometry node affects normals. It’s totally out of control :stuck_out_tongue: It doesn’t change anything in the preview render, but the final result is totally different, so I’m a bit lost here.

The way you can rotate the normal node is really nice; I wish I could do that on output nodes :slight_smile:

And node groups are the best thing since sliced bread. Once they can be made single-user, I’m sure they will be able to slice bread themselves!

Great work ton :stuck_out_tongue: Now i need someone to stop me from playing with this all day long :frowning:

-efbie- As I said, I’m quite lost too with this new build :o but you can use extra “output” nodes to test what each geometry node output does; just link them to the color input %| Anyone who really knows what he’s doing should explain it for us :slight_smile:

you know…

I think this would be the time for me to ask those of you who’ve messed with this / know a little more about it:

Do you think we could get a quick tutorial/explanation of the way this system works? Even a list of shortcut keys and such… maybe it could turn into a tutorial for when it’s actually released.

Because I know I’m beginning to understand the concepts, but I still don’t quite grasp the whole usefulness (or even use) of these new nodes.

Please?

RenderMan support through this would be great, but then the code the node editor creates would need to be something the shader compilers could understand, and that’s a point where I don’t know how easy it will be, since Blender has its own renderer.

As a kid I played a lot with ShaderMaker Pro for Pixels 3D Studio,
and like Maya’s shader system, it had many, many nodes:
math functions, normal functions, diffuse and specular functions,
a node for silk, skin, clay, etc. surfaces.

I hope that the node system will get those as well, which raises the question of what they will do with the old shader/material system, and also whether they will soon add more light models.

Would SSS be possible this way?

Well, Pixels for example has an SSS shader node;
they use depth shadow maps, as far as I know.

This depends not on the node editor, but on the light model they would have to include in Blender, and on the render engine.

The Geometry node works (or can work) like the Glob, Object, UV, Orco, Win, Nor, etc. buttons in the “Map Input”-panel (button bar, material buttons).

Try to recreate this and play around a bit with the sockets:
http://images5.theimagehosting.com/geonode.png

In the latest additions to Orange CVS, there’s also a “Mapping” node that controls the same as the “sizeX”, “sizeY” and “sizeZ” options do in the “Map Input”-panel (and a lot more than that!!).
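Conceptually, what both the “Map Input” panel and the new Mapping node do is transform the texture coordinate before the texture is sampled. A toy illustration of the idea in Python (this is not Blender’s code; the function and parameter names are made up):

```python
def map_input(co, size=(1.0, 1.0, 1.0), offset=(0.0, 0.0, 0.0)):
    """Toy version of what Map Input / the Mapping node does:
    scale and offset a texture coordinate before the texture lookup.
    The sizeX/Y/Z buttons in the panel correspond to the size tuple here."""
    return tuple(c * s + o for c, s, o in zip(co, size, offset))

# Doubling sizeX tiles the texture twice as often along X:
map_input((0.25, 0.5, 0.0), size=(2.0, 1.0, 1.0))  # → (0.5, 0.5, 0.0)
```

The Geometry node then decides *which* coordinate (Global, Object, UV, Orco, Win, …) gets fed into a transform like this, which is why swapping its sockets changes the mapping so drastically.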

zupermonkey: that is a great example, with some nice explanation too! Anyone else care to contribute some examples that explain the different things one can do with this new node system?

I’ve been playing around some more with the nodes. I really love the grouping (so nifty!), but there should be a visual way of doing it (e.g. a “grouping” icon) and not just the shortcut.

What is really bothering me is adding textures. I am no expert with Blender materials, but you seem to need to create new textures by linking them to a material. This seems counter-intuitive - if you create a new texture node you expect it to automatically create and link to a new texture slot. There should be an “add new” button on the texture node and a drop-down list of available textures.

I am also confused by how materials can be affected by textures “implicitly” (i.e. not visible in the node system) using the old method, versus “explicitly” (is there a difference?) - i.e. create a material, add a new texture, create a texture node, UNLINK the texture from the material!!, link the texture to the texture node, and THEN relink the texture node output to the material node input!

This is rather convoluted and seems like mixing two systems. All I want to know is whether I am doing something wrong, or whether this is just the way it has to be done because it is still in early development.

I am also confused as to whether the majority of the settings will remain in the materials panel, or whether you will have access to them from the nodes (which would be much better). I think the node system should replace the old system completely - or be an exclusive-OR type of thing. Again, I am confused as to what is happening and what this really means.

Thank you.

Koba

P.S. The nodes are so much more flexible and intuitive than the old material system (which always confused me somewhat).

Found it: OpenRT
http://www.openrt.de/

It has GI! http://www.openrt.de/Applications/globillum.php

And? They clearly state that the main renderer is software (see http://www.openrt.de/Applications/games.php ), OR that a special raytracing processor (not a common GPU) may be used.