it's not the brush, it's the painter, but we are getting off topic.
"The crows seem to be calling my name." Thought Kaw.
Myrlea, "The Shepherd's Quest" formerly "Valiant" [project]
Motion Blur may require the previous frame's data to render; perhaps rendering an animation of more than one frame would work, instead of just a single frame?
Cheers, David.
No, rendering an animation is useless.
Motion Blur was the first post-process effect added, as a demo, back when nobody knew that Eevee renders could be made through the OpenGL render.
Obviously, the OpenGL render was not adapted to render it. It does render Bloom, DOF and AO.
Maybe it is because motion blur is currently limited to camera moves, and it will be handled once more object types are supported.
Or maybe it is an omission.
It would be good to report it.
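For context, camera-only motion blur of this kind is usually a post-process: each pixel's apparent screen-space velocity is derived from the current and previous view-projection matrices, and the image is blurred along that vector — which is why a single isolated frame has nothing to blur with. A minimal sketch of the velocity computation (plain NumPy with made-up matrices; not Eevee's actual code):

```python
import numpy as np

def screen_velocity(world_pos, curr_vp, prev_vp):
    """Screen-space velocity of a world-space point, given the current and
    previous view-projection matrices (camera motion blur only)."""
    def project(vp, p):
        clip = vp @ np.append(p, 1.0)   # to clip space (homogeneous)
        return clip[:2] / clip[3]       # perspective divide -> NDC xy
    return project(curr_vp, world_pos) - project(prev_vp, world_pos)

# A camera translated along x between frames makes a static point
# appear to move in the opposite direction on screen.
proj = np.eye(4)                        # identity "projection" for brevity
prev_view = np.eye(4)
curr_view = np.eye(4)
curr_view[0, 3] = -0.1                  # camera moved +0.1 along x
v = screen_velocity(np.array([0.0, 0.0, -5.0]),
                    proj @ curr_view, proj @ prev_view)
```

A post-process blur would then accumulate color samples along `v` per pixel; with only one frame rendered there is no previous matrix, so the vector degenerates to zero.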
Last edited by zeauro; 15-Jul-17 at 07:14.
Another little vid (not mine):
Last edited by Ko.; 17-Jul-17 at 09:34.
Seriously, I need to give this a try once I am done with what I am working on. Can Eevee be used in game mode to walk around a building interior in Blender? I don't think Blender's game mode works with VR yet. I really love the vids posted in this thread.
Also the name is cool, sort of. Eevee. I wonder what it stands for.
I posted a little test of an old scene in Eevee here:
https://blenderartists.org/forum/sho...19#post3218119
In some ways I prefer it to the old Cycles render. It was much more fun to light.
Glenn
I'm looking at the roadmap; what happened to soft shadows? Was it postponed or something?
Yes. It was postponed.
When I asked Clément, he replied that he wanted to do transparency first.
The dev meeting about Eevee happens on Monday. Maybe tomorrow's report will include an official answer.
Blender Developer Meeting Notes for Sunday 16th July 2017 https://lists.blender.org/pipermail/...ly/048521.html
Hi all,
Here are the notes from today's 14 UTC meeting in irc.freenode.net #blendercoders.
1) Blender 2.79 release
- Test build 2 ready!
https://download.blender.org/release/Blender2.79/
- The first 2.79 release candidate can then be made after next Sunday, just in time for Siggraph.
- Release log is shaping up. Thanks Brecht van Lommel and Vuk Gardašević (lijenstina):
https://wiki.blender.org/index.php/D...ase_Notes/2.79
- Tracker, tracker, tracker! Everyone help!
https://developer.blender.org/maniph...ug/query/open/
2) Blender 2.8
- Everyone's very happy with the attention for the 2.8 viewport project! Keep the demos coming.
- We expect more demos and feedback on the new workspaces, layers, depsgraph, grease pencil, asset manager and manipulators in the coming months.
- Freestyle for 2.8 - Ton Roosendaal will check with Tamito Kajiyama.
- Tomorrow the crew will send a detailed 2.8 planning/report, as usual.
3) Google Summer of Code
- Reminder: if you need test builds, any committer can use the buildbot to make a build with a patch:
https://wiki.blender.org/index.php/D...t_Experimental
- In a week, reviews for the 2nd evaluation start. Students should be able to show first working
prototypes or tests. Share screenshots or (YouTube) videos on your wiki!
Laters,
-Ton-
The volumetrics will be friggin' awesome!
Here's a video of my test scene
https://www.youtube.com/watch?v=a6p2g1xFJJQ
@Ko.
@grsaaynoel
Amazing demos!!!
Be patient, English is not my language.
Grease Pencil objects have modifiers now.
The Subdivide modifier is obviously useful.
But the Noise modifier is not great at all.
I would rather have a seed parameter instead of a random seed that changes every time Blender refreshes.
How can you do compositing, refining work, or retakes if you cannot be sure of rendering the same image at the same frame?
And basically, we can only modify one value for both the amplitude and the period of the noise.
IMO, it would be better to have different categories of modifiers, like the Freestyle ones, instead of one modifier with a few general settings applied to several unrelated stroke characteristics.
Edit: Tint and Thickness can be useful, allowing overrides across several layers.
But even with a step setting, the Noise modifier is still broken.
You render an animation using this modifier.
You close your file, reopen it, and render again -> the result is a completely different animation.
The noise seed for Grease Pencil should work like the Freestyle noise seed, not like the random seed used for Cycles sampling.
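The reproducibility complaint above comes down to seeding. A tiny sketch (plain Python with a hypothetical `jitter` offset; not actual Blender code) of why a fixed, user-visible seed matters for retakes:

```python
import random

def jitter(frame, seed):
    """Deterministic per-frame stroke offset: the same seed and frame
    always produce the same value, across file reloads and sessions."""
    rng = random.Random(seed * 100003 + frame)  # fixed, file-stable seed
    return rng.uniform(-1.0, 1.0)

# Re-rendering frame 10 after closing and reopening the file gives the
# identical offset, so compositing and retakes line up.
a = jitter(10, seed=42)
b = jitter(10, seed=42)
# A fresh random seed on every refresh (what the post complains about)
# would instead give an unrelated value each session.
```

With a seed stored in the file, the renderer is a pure function of (seed, frame); with a per-refresh random seed, it is not, and no two render passes can be guaranteed to match.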
Last edited by zeauro; 18-Jul-17 at 06:50.
The Subdivide modifier is going to be really cool for exporting nice smooth strokes!
Here's a small test of Eevee with the Judy model from Slatypi. Really good performance for the hair, but I had to disable the shadows, which create an ugly result:
EDIT: Actually, with properly updated drivers it's a solid 70 FPS:
Last edited by Yadoob; 18-Jul-17 at 09:33.
Isn't volumetrics supposed to be one of the last parts to be developed, according to the roadmap?
https://wiki.blender.org/index.php/D...MayAugust-2017
Maybe that's just this roadmap item:
Week 4:
Fog simple (1d)
I think Clément Foucault is working on some things ahead of schedule while others are held up.
At the end of this month we will have something very nice to show for Siggraph.
Take world volumetrics as an early gift.
Volumetric materials for objects are still missing. And I'm not sure whether realistic smoke simulation, for example, will later be possible in Eevee.
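For reference, the "Fog simple" roadmap item is essentially Beer-Lambert absorption: transmittance falls off exponentially with the distance traveled through the medium. A minimal sketch (plain Python; parameter names are illustrative, not Eevee's):

```python
import math

def apply_fog(color, fog_color, density, distance):
    """Blend a surface color toward the fog color using Beer-Lambert
    transmittance T = exp(-density * distance)."""
    t = math.exp(-density * distance)   # fraction of light surviving
    return tuple(t * c + (1.0 - t) * f for c, f in zip(color, fog_color))

# Nearby surfaces keep their color; distant ones fade to the fog color.
near = apply_fog((1.0, 0.2, 0.2), (0.6, 0.7, 0.8), density=0.1, distance=1.0)
far  = apply_fog((1.0, 0.2, 0.2), (0.6, 0.7, 0.8), density=0.1, distance=50.0)
```

Per-object volumetric materials are harder precisely because density then varies along the ray (e.g. a smoke voxel grid), so the single `exp` term becomes a ray march that accumulates transmittance step by step.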
@Yadoob that's amazing. Could you try translating the model? Move it around a bit. I found that changing the viewport was smooth, but moving the object was choppy.
About Eevee development:
DISCLAIMER: I won't copy and paste everything we [hypersomniac/Clément] talked about in emails; I don't think that would be fair. But this quote is probably OK.
So in one of my videos I talked about how psyched I was about refraction (because I also saw it in the planning), now that we have transparency; I wanted to do some glass shader renderings. That was premature, because I didn't know how intertwined refraction is with getting screen-space reflection done. Or rather, much of that code approach will be reused for refraction, as I understand it. Also keep in mind that plans may change and are not set in stone.
So currently I think he's working on implementing SSSR (Stochastic Screen Space Reflection), and his implementation will be something similar to this paper from DICE/EA.
I have read the papers a couple of times now and still don't understand all of it, but I get the gist, and it's quite a lot to develop.
I hope it can be done before SIGGRAPH, because it would be an extra feather in the hat for Ton when showcasing 2.8. Also, once we have a method to raytrace against the depth buffer for SSSR, that same approach can be used for refraction.
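The depth-buffer raytracing shared by SSR and screen-space refraction is a ray march: step a ray across the screen until its depth passes behind the stored depth, then reuse the color already rendered at that pixel. A minimal 1D sketch (plain Python; not the actual Eevee implementation):

```python
def trace_depth_buffer(depth, start, direction, max_steps=64):
    """March a screen-space ray (pixel index, depth) until the ray point
    falls behind the depth buffer; return the hit pixel index or None."""
    x, z = start            # pixel index and ray depth
    dx, dz = direction      # per-step increments
    for _ in range(max_steps):
        x += dx
        z += dz
        xi = int(round(x))
        if not (0 <= xi < len(depth)):
            return None             # ray left the screen: no data to reuse
        if z >= depth[xi]:          # ray passed behind the stored surface
            return xi               # hit: sample the color buffer here
    return None

# A flat "wall" at depth 5.0 occupying pixels 6..9:
buf = [10.0] * 6 + [5.0] * 4
hit = trace_depth_buffer(buf, start=(0.0, 0.0), direction=(1.0, 1.0))
```

The same march serves both effects: for reflection the ray direction is mirrored about the surface normal, for refraction it is bent by Snell's law; rays that miss (return `None`) must fall back to a probe or environment lookup, which is the classic screen-space limitation.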
Anyway, that's what I know; I'll share more when I learn more. Clément has seen some of my videos, answered lots of my questions, and corrected some misinterpretations I'd made when speculating about what's coming next for Eevee. So we're going to keep in email contact: when new features are implemented I can record a video and probably share it with him before uploading to YouTube, to keep errors to a minimum.
Exciting times!
I was looking at Yadoob's model, which looks amazing by the way, but I noticed the hair wasn't casting a shadow properly. Then I recalled a video I saw, which I will post below if I'm able. Does anyone know if this could be implemented, or if it will be?
I'm also trying to spread awareness in case the developers aren't aware of it.
@aermartin Yes indeed: adding objects and translate/rotate/scale are still choppy.
Clément is outstanding; keeping up such a pace on so complex a subject is insane! I'm looking forward to SSR.
@savage Interesting paper; the result looks very good. Right now the shadows for the hair are a bit buggy:
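Thin strands are a worst case for plain shadow mapping: the depth-compare bias used to hide shadow acne on ordinary surfaces can exceed a hair's thickness, so strands fail to shadow themselves or each other. A 1D sketch of the standard compare (plain Python, illustrative numbers only; not Eevee's shadow code):

```python
def in_shadow(fragment_depth, occluder_depth, bias):
    """Classic shadow-map test: a fragment is shadowed if it lies
    farther from the light than the stored occluder, beyond a bias
    tolerance added to suppress self-shadowing acne."""
    return fragment_depth > occluder_depth + bias

strand_thickness = 0.002
occluder_depth = 1.0                       # front of the hair strand
back_of_strand = occluder_depth + strand_thickness

# A bias smaller than the strand thickness lets the strand shadow its
# own back side; a larger bias makes the whole strand count as lit.
shadowed_small_bias = in_shadow(back_of_strand, occluder_depth, bias=0.001)
shadowed_big_bias = in_shadow(back_of_strand, occluder_depth, bias=0.01)
```

That tension between acne and light leaking is why hair-specific techniques (such as the layered opacity approach in the video mentioned above) exist instead of a single bias tweak.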