Well, here’s my entry.
Non-competing:
“Man Apart”
Been a while since I used Cell Fracture and rigid body physics.
Insane in the brain
Pure entry, all made in 2.8 Cycles.
I first wanted to name it “In the Brain”, but bridging the neurons was insane.
GPU render time: 2m18s (samples: 128 denoised)
Win7 64-bit
Intel Core 2 Quad CPU Q8200 @ 2.33GHz
8 GB RAM
NVIDIA GeForce GTX 660 (2 GB)
Thought-bound by Fate
Open Entry. Knitted texture from textures.com. Composited in AE.
I found it interesting that textures could be applied to curves. Modelling the brain using curves was all well and good, but when it came to texturing it, I fell short in execution.
The length, number of segments, and angle of the vertices all have a huge impact on how the texture turns out.
I haven’t given up though; I’ll look into it more and try for a better result in my own time.
I was going for a red string of fate and how it could drive thought.
Once you have the curve the way you want it, could you convert it to a mesh? I think that might make texturing easier.
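In 2.8 that’s Object ▸ Convert To ▸ Mesh, or a quick bpy sketch like the one below (run inside Blender; the object name “BrainCurve” is just a placeholder for your curve):

```python
# Run inside Blender's Python console or Text Editor (needs bpy).
# "BrainCurve" is a hypothetical name; substitute your curve object's name.
import bpy

curve_obj = bpy.data.objects["BrainCurve"]
bpy.context.view_layer.objects.active = curve_obj
curve_obj.select_set(True)

# Convert the curve to a mesh so it can be UV-unwrapped like any other mesh.
bpy.ops.object.convert(target='MESH')
```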
True, I really wanted to maintain the flexibility and performance benefits of the curves though.
I tried converting to a mesh to add some fuzzy hair particles (it needed more adjustment, so I left it out).
The UVs were all over the place; I’d likely have to separate all the… membrane thingys into different meshes, but I wanted to keep the scene simple.
But thanks, if all else fails that’s what I’ll do.
Which tutorial?
The character modeling tutorial by Riven Phoenix. I loosely followed it, then tweaked the male-looking face into something more feminine.
Appreciate it!
You could try Blender’s built-in denoising feature, or there’s an add-on that uses NVIDIA AI to denoise (if you have an NVIDIA card). That way you can use fewer samples and get a better picture in less time. From my limited tests the NVIDIA denoise add-on is very effective. Addon link:
Thanks. I was already using Blender’s denoiser, but maybe I should spend more time tweaking it. Alas, I have an AMD card, so I can’t use that fancy denoiser.
But there must be other tricks, no? Indirect lighting is very important for ArchViz for example, and I doubt those people render with 50K samples.
There probably are, but I’m sorry, I don’t know any. Best of luck finding them!
I don’t know of anything fancy, but there is a composite denoise node in this node pack: b-wide
I think it works for scenes like this, but for scenes with characters, or ones that need to be sharp and clear, it seems to make things smudgy.
There is also light clamping: clamp the lights, then pump the energy back up in compositing.
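Roughly, the idea is: cap the extreme bright samples (fireflies) at render time, then multiply the brightness back up in the compositor. A toy Python sketch with made-up numbers, just to show the arithmetic (the limit and factor are illustrative, not Cycles defaults):

```python
# Toy illustration of "clamp then boost" for fighting fireflies.
# All numbers are made up for illustration.

def clamp_samples(samples, limit):
    """Cap each sample's brightness, like Cycles' Clamp Indirect setting."""
    return [min(s, limit) for s in samples]

def boost_exposure(samples, factor):
    """Brighten the clamped result, like a Multiply node in the compositor."""
    return [s * factor for s in samples]

noisy = [0.8, 1.1, 0.9, 25.0, 1.0]      # 25.0 is a firefly
clamped = clamp_samples(noisy, limit=2.0)
print(clamped)                           # [0.8, 1.1, 0.9, 2.0, 1.0]
print(boost_exposure(clamped, factor=1.5))
```

The trade-off is that clamping also darkens legitimate bright highlights, which is why the compositing boost is needed afterwards.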
I wonder if it’s possible to go for a hybrid workflow for scenes with volumetrics: volumetrics in Eevee and reflections/refractions in Cycles. Haven’t tried it out though.
I think many people know of this, but I always fall back to it when I’m fighting noise. https://www.blenderguru.com/articles/7-ways-get-rid-fireflies
OK, now I understand the point of the branched path tracing thingy. You can specify how many volume samples, how many diffuse samples, and so on. So in my scene, adding a lot of volume samples but fewer diffuse/glossy samples can yield better results in the same render time. It takes some time to find the right numbers though.