HWREN??

I did a Google search for “Open Source” “hardware renderer” and stumbled on a SourceForge project called hwren.

HWREN will be a hardware-assisted renderer; it was made to address the slowness of rendering photoreal computer graphics while still keeping most of the quality. It is going to be integrated with Blender3D.

http://sourceforge.net/projects/hwren/

It’s listed as a Linux program, but it might be compilable on multiple platforms? Currently the project is in the planning stage, so no code has been written yet. It will be interesting to see how this project progresses. It also looks like there’s a donations page :slight_smile:

It lists the developer as Rob, with a username of r45512. I did a search on ElYsiun and blender.org and didn’t come up with anything (for either his username or the project name).

I found this on the progress page:

Internal Stuff
Actual Renderer -> 20%
Script Interpreter
Support for Older Cards -> Dunno Yet
GUI -> 20%

Looks like a good idea…

I’m r45512…

Yes… I stopped working on it for a while to go off and look for some more algorithms. I’m not sure how fast it will actually be, though… I don’t imagine raytracing on a GPU will be terribly fast, though I may be wrong. :confused:
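For anyone curious what the per-pixel work of a GPU raytracer actually looks like, here’s a minimal sketch of the classic ray–sphere intersection test. This is purely illustrative (plain Python, my own function names, not HWREN code); on a GPU this loop body would run as a fragment program, one invocation per pixel.

```python
def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves the quadratic |origin + t*direction - center|^2 = radius^2 for t.
    """
    # Vector from sphere center to ray origin
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - disc ** 0.5) / (2 * a)  # nearer of the two roots
    return t if t > 0 else None       # ignore hits behind the ray origin

# A ray shot straight down -z at a unit sphere 5 units away:
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # -> 4.0
```

The interesting part speed-wise is that every pixel runs this independently, which is exactly the kind of parallel workload GPUs are good at; the hard bit (and what the Purcell thesis tackles) is traversing scene data structures on the card.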

I’ll have to look into it more… but if there are any other devs here, you’re welcome to help. I’ll add you to the project; just give me your SourceForge name.

I hang out in blenderchat a lot; you’ll find me as f00f or f00fbug_

If you took this http://graphics.stanford.edu/papers/tpurcell_thesis/ to the http://www.gpgpu.org forums and told them about the project, I’m sure you would find lots of help. As for me, I barely know my way around Python, so I suppose I’m no help :frowning:

I would, however, love to see something like that integrated into Blender someday, although the roadmap for Blender specifies that when OpenGL 2.0 is out, Blender will render through it, so I’m not sure what effect your renderer will have.

Cool to see you’re still working on it (if only mentally).

And Darqus… how about those plans you had? I looked up your posts to reread them recently and still think it’s a neat idea.

Wasn’t there something about an opencore thing on SourceForge as well?

I ditched it; no one thought it was a good idea for Blender.

Modern GPUs (Nvidia’s 6800 series and ATI’s X800) are indeed powerful enough to do hardware renders, and since Blender’s roadmap specifies that OpenGL 2.0 will be used for rendering somewhere around version 3.0, it’s a snowball in motion.

Although we all would like to see some progress now, development cannot start until the OpenGL 2.0 spec is finished.

When that happens? Who can say… %|

Right.

I’ve been bugging people about this stuff quite a bit the last few days and the way forward seems set.

Two steps for making Blender reach the next level, render-time-wise:

  1. Hardware acceleration, which will be OpenGL 2.0, not a separate hardware-acceleration card… though that would be nice.

  2. Network rendering.
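To make the network-rendering step concrete, here’s a toy sketch of the basic idea: split an animation’s frames round-robin across render nodes so each machine renders its share in parallel. This is just my own illustration (the function and node names are made up), not a description of any planned Blender feature.

```python
def assign_frames(frames, nodes):
    """Map each node name to the list of frames it should render (round-robin)."""
    jobs = {node: [] for node in nodes}
    for i, frame in enumerate(frames):
        # Frame i goes to node i mod N, spreading the work evenly
        jobs[nodes[i % len(nodes)]].append(frame)
    return jobs

# Split a 10-frame animation across three hypothetical machines:
jobs = assign_frames(list(range(1, 11)), ["node-a", "node-b", "node-c"])
print(jobs["node-a"])  # -> [1, 4, 7, 10]
```

Since each frame renders independently, the speedup is close to linear in the number of machines, which is why network rendering is such low-hanging fruit compared to hardware acceleration.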

Wow, talk of OpenGL 2 is everywhere now, on here and on Blender.org.
Have I missed something? I can’t wait for a hardware-accelerated version of the Blender renderer, but I can’t see it happening any time soon, unless the idea of paying coders takes off and someone pays for it. I hope that doesn’t happen; I’m against the idea of paying the developers. But if it’s going ahead, I think that businesses who use Blender commercially should add to the pot, because they make money from it. Maybe they could offer a bounty on a hardware-accelerated version of Blender.

L13… how could you be against it??? There are lots of people coding and improving Blender in their own time now… if they can make some money in the process, all the better. If you’re worried that a few people with money to spend will end up shaping the future of Blender, then don’t worry… there are some very dedicated people out here who look after their baby.

The talk is there because a couple of people, like Darqus and myself, have been thinking quite a bit about such options… and I’m afraid I’ve been overposting a bit on the subject, hehe.

In the end… it’ll take over a year at least, I think (that’s my very best personal guesstimate).