E-Cycles - The fastest render engine for Blender. 3.2 release available now!

Hello there,
looks very promising, but I am a bit confused about what you are actually selling here. Is it some kind of tweaked Cycles build that you can install into Blender, or is it a course in which you would learn how to tweak a build on your own? What will the pricing be? Or will we get more info on Saturday (29.12.2018)? Any Cycles speed-up is more than welcome, and for such a feature I wouldn’t mind paying.

Thank you for the info

There will be both:

I can’t name an exact number yet, but it will be under 9€ per month for the builds. So even if you only own a GTX 1060, you get your investment back instantly (it’s equivalent to buying another one in many cases). On top of that, you also save on the electricity bill, as the render-related cost is halved.

My guess is 2’ 40"

Thank you for the info, I will bookmark this to check out the Saturday announcement.

T.

As some asked, the time limit to try to win one free year will be tomorrow at midnight Berlin time, so you have about 35.5 hours from now.

bliblubli,

Eevee is going to be important for Archviz, where accuracy is not as important as time, especially for animations/Archviz walkthroughs. For many non-commercial Blender users like myself, buying more GPUs/machines is not an option. Look at the gallery, there is some nice Archviz done in Eevee already.

I agree there are some tricks to learn to work with Eevee and its limitations, but once learned they are easy to reuse. Even in Cycles there are things to learn, especially for interior scenes with small windows/lights, to reduce noise.

Eevee is improving rapidly, and it is easy to switch from Cycles to Eevee. See this Evermotion article on migrating from Cycles to Eevee. I would definitely be interested in learning about Eevee code-wise.

Isn’t raw Monte Carlo truly/very unbiased? :smiley: I think they use it to create reference targets when they try stuff that introduces bias.

I can understand. Blender is huge: there are physical simulations, image post-processing, motion analysis, etc. Although the course concentrates on Cycles and the modifiers, it also teaches you how to learn and how to find information quickly. Even long-time devs are regularly confronted with new things, as IT is a very fast-moving world. It’s more important to learn how to adapt easily by learning a way of thinking. This course will also give you my ways of finding and learning what I don’t know.

Maybe I’ll make advanced/specific courses for other parts later if there is enough demand. But Eevee, I think, is the hardest and most boring part of the code, because it depends directly on all the GPUs from Intel, AMD and Nvidia from the last 8 years, and on the dozens of driver versions per piece of hardware that have to be supported (laptop users, for example, can’t always update properly and want it working too), multiplied by the number of OSs Blender supports. Believe me, it’s not just about having good ideas in this part of the code, it’s about working around issues and piling up exceptions. You get hard locks and blue screens because of bad drivers, etc. To start, it’s better to touch other parts.


It is always biased, because:

  • compute precision per bounce is always finite; it can only be good enough, never exact.
  • the number of bounces will always be limited. Look at SSS or volumes: try to render a scene with volume bounces above 2 and make it converge. Even without volumes, look at the Blender Foundation demo files. Even Gooseberry, which tried to look as realistic as possible, is limited to 2 bounces to render in a reasonable amount of time, and they already have pretty good render farms. (The small sketch after this list illustrates the effect of capping the bounce count.)
  • the data you use for the textures is a very rough approximation. It has a white balance applied that depends on the lighting you used to capture it, and the capture hardware also has its own response. To have your rendering non-biased, you would need spectral textures captured with a light source covering the full visible range of the human eye’s response. You would also need a laser scan with a resolution good enough to capture the real roughness, not a faked one computed from the albedo. And you would of course need a screen capable of displaying this full range of light. Even then, it would at best reproduce what you can see, so it would only be non-biased to you (at one frame per year or something like that), and it would still be an approximation based on your way of seeing.
  • I won’t even speak about geometry. No material in the real world is 100% diffuse and perfectly flat, so you always capture with self-shading, reflections, etc. There are tricks to compensate for those so that you only keep the ones computed at render time, but that is also tricked and biased. You never get the real microsurface structure. Substances/procedural materials can avoid some of this, but they also use tricks to compute fast.

I could continue the list; the point in the end is: make pictures that at least you and your audience like, and you are right.
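To make the bounce-limit point above concrete, here is a minimal Python sketch. It is purely my own toy model, not Cycles code: each bounce keeps a fraction of the energy, so cutting the path off after a fixed number of bounces always loses light, while a Russian-roulette style termination (the standard trick real path tracers use) keeps the estimate correct on average at the price of extra noise.

```python
import random

# Toy model (illustration only, not Cycles code): each bounce keeps a
# fraction `albedo` of the energy, so the exact total over infinitely many
# bounces is the geometric series 1 / (1 - albedo).
albedo = 0.8
exact = 1.0 / (1.0 - albedo)  # = 5.0

def truncated(max_bounces):
    """Sum the series but stop after a fixed bounce count (always biased low)."""
    return sum(albedo ** k for k in range(max_bounces + 1))

def russian_roulette(n_paths, p_continue=0.8):
    """Unbiased alternative: paths terminate randomly and survivors are
    re-weighted by 1 / p_continue, so the expected value stays exact."""
    total = 0.0
    for _ in range(n_paths):
        weight, value = 1.0, 1.0  # bounce 0 contributes 1
        while random.random() < p_continue:
            weight *= albedo / p_continue
            value += weight
        total += value
    return total / n_paths

print(exact)                      # 5.0
print(truncated(2))               # 2.44  -> far too dark
print(truncated(8))               # ~4.33 -> still biased low
print(russian_roulette(200_000))  # ~5.0 on average, noisy but unbiased
```

That trade-off, a darker-but-stable image versus a correct-on-average but noisier one, is exactly what bounce limits in the render settings expose.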


Non-biased only means you don’t introduce bias in the random sampling, nothing more. And I believe you can get Cycles very close (turn off MIS, use enough bounces for normal cases, and obviously no clamping or light sampling threshold).

Spectral rendering or more accurate material definitions can make the renderer more accurate, but as far as I understand, that has nothing to do with sampling bias.
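A tiny numeric sketch of that distinction, with made-up numbers rather than anything from Cycles internals: a plain Monte Carlo average stays unbiased even when rare, very bright “firefly” samples make it noisy, whereas clamping those samples, roughly what the Clamp settings in Cycles do, trades the noise for a systematic error that extra samples can never remove.

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up per-sample "radiance" values: mostly dim, with rare very bright
# outliers (fireflies). The true mean is known by construction.
n = 1_000_000
base = rng.uniform(0.0, 1.0, n)            # contributes 0.5 on average
firefly = (rng.random(n) < 0.001) * 200.0  # rare bright paths, add 0.2 on average
samples = base + firefly
true_mean = 0.5 + 0.001 * 200.0            # = 0.7

raw_mean = samples.mean()                        # unbiased: hovers around 0.7
clamped_mean = np.minimum(samples, 10.0).mean()  # far less noisy, but stuck near ~0.51

print(true_mean, raw_mean, clamped_mean)
```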

I think the render time is 2 min. My email is [email protected]

Better send your email as a PM…


I say about 4m35s!

So the answer is 12min13s, as I used 4 times the original resolution (2800x2100) of the Evermotion scene. At the original resolution (1400x1050), it indeed renders in 2min55s.
So the winners are @Lumpengnom, @Dragon.Studio and @Komposthaufen and they get one year for free per email.

First update - the Subdivision modifier now works properly on Linux. You can download the new version from the product page.

Yay, cool, I won!
I already bought the course. Do I get anything additional with the one year of updates, or is that included in the course anyway?

At the beginning it’s the same, but the builds will be updated weekly here. In the course, there is one build offered for quick testing and comparison with the builds you make, but you update the builds yourself, as there will be as many versions of Blender as there are students. Some will add more modifiers, some will add new features to Cycles, some will build upon 2.7, others upon the fracture modifier, etc.
So it adds some convenience to stay up to date with fast rendering.


bliblubli, your claim that unbiased rendering does not exist is actually quite disturbing – given that you are coding and selling a path tracer. I believe you and your software do a good job. Just please do not disregard statistics theory.

I guess he meant something more like: perfection doesn’t exist, do something that people like, biased or not.

Would it be possible to support RTX cards?