Blender accepted into GSoC 2011


Can you clarify? I tried X-mirror and topology mirror and neither of them did it. Using a mirror modifier I can sort of accomplish it, but there are lots of times where you will have a symmetrical mesh but not have a mirror modifier.


The polishing of tools is so that Blender becomes 'best in class' at particular tasks - I want professional users to go 'I need to use Blender to sculpt/texture paint/UV unwrap/retopologize/model because it is the fastest, most efficient tool'. Right now we are 'ok' at all of those tasks, but with work over the summer I think we could become one of the top two or three tools in each category.

With texturing there is not a lot to do to get it close to BodyPaint and Mari.

You just need a picture manager, so that you can throw images onto the model very fast - just a simple window. And then you need a plane-projection system where you bake the picture onto the model.

X-mirror works for me: everything I do on the left side changes on the right side. I don't know what topology mirror does.
About mirror seam selecting: I first select seams on the left side, then I select all seams and go to Select > Mirror, and the selection jumps to the other side. I make seams again. Very fast.
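For what it's worth, the mirror-selection step described above can be pictured as a coordinate lookup: for each selected element, find the element whose position is its reflection across the X axis. This is only an illustrative sketch in plain Python, not Blender's actual implementation or its bpy API:

```python
def select_mirror(verts, selected, eps=1e-6):
    """For each selected vertex index, select the vertex mirrored
    across the X axis (x -> -x) instead, if one exists."""
    mirrored = set()
    for i in selected:
        x, y, z = verts[i]
        for j, (mx, my, mz) in enumerate(verts):
            if abs(mx + x) < eps and abs(my - y) < eps and abs(mz - z) < eps:
                mirrored.add(j)
                break
    return mirrored

# A 4-vertex symmetric "mesh": vertices 0/1 mirror vertices 2/3.
verts = [(1.0, 0.0, 0.0), (1.0, 1.0, 0.0),
         (-1.0, 0.0, 0.0), (-1.0, 1.0, 0.0)]
print(select_mirror(verts, {0, 1}))  # -> {2, 3}
```

Note this matches by position only, which is why it can fail on meshes that are not exactly symmetric - presumably what topology mirror is meant to address.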

In the official beta version, mirrored UV layouts work. In the custom builds it doesn't work and gives an error. I hope they will fix this.

Select Mirror does work; it is a bit clumsy, though.

I agree with proposals to accelerate Blender's workflow.
UV unwrapping can be done in Blender.
You can duplicate an unwrapped part of a mesh, like an arm, and join only a mirrored copy of this part to keep the same UVs.
I am used to doing this and to using the mirror modifier.
With this, the "Copy Mirrored UV Coords" operator and snapping in the Image Editor satisfy me.

I don't use X-mirror because it does not select vertices, edges, or faces, so extrusions and subdivisions cannot be mirrored.
But if a symmetry tag is enabled for seams, I suppose it could also be made available for sharp, crease, and bevel edges.
Even for me, symmetry improvements have some interest.

I don't understand this. Can you explain it more? Thanks.

Small features: a stream of consciousness!

Paint tools: a stencil tool that allows you to position, scale, and rotate an image in the viewport that can be painted down…

(I know the UV method, but it's a little clunky.)
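The core of such a stencil tool is essentially an inverse 2D transform: map each viewport pixel back through the stencil's position, scale, and rotation to find which stencil texel to paint down. A hedged sketch of that mapping, with illustrative names (this is not an existing Blender API):

```python
import math

def stencil_lookup(px, py, cx, cy, scale, angle):
    """Inverse-transform a viewport pixel (px, py) into stencil image
    space, given the stencil's center (cx, cy), uniform scale, and
    rotation angle in radians."""
    dx, dy = px - cx, py - cy
    c, s = math.cos(-angle), math.sin(-angle)
    rx = dx * c - dy * s
    ry = dx * s + dy * c
    return rx / scale, ry / scale

# With no rotation, a pixel 20px right of a 2x-scaled stencil center
# lands 10 texels into the stencil image:
print(stencil_lookup(120.0, 100.0, 100.0, 100.0, 2.0, 0.0) == (10.0, 0.0))  # -> True
```

Painting down would then sample the stencil image at the returned coordinates for every pixel under the brush dab.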

Anchor, drag-dot, and rake in paint mode! (Though I guess that because all this is in the ptex branch, it's not meaty enough for a GSoC.)

Roto tools in screen space in the 2D/3D windows (I know the addon method, but it's a little clunky), including expansion, contraction, blurring per edit point, etc…

Move/scale/rotate on canvas for 2D editors (sequencer, compositor), and manipulator widgets for the UV view!

Easier workflow for 2D elements in 3D space.

+1 for more draw options to make retopo easier (the tools for manual retopo are already quite nice, but visibility is a problem). Good "auto" retopo is also a nice target!

I'd love to see some "motion estimation" stuff in the sequencer/compositor (calculating motion vectors); see The Foundry's motion blur, Kronos, etc.

Sequencer: it would be nice to have a "clip" view/bin combo for easy three-point-style editing. (I know the workarounds to get something a bit like this, but again, a little clunky…)

UI: I'd love to see the window manager improved to allow tabs (see Nuke for an excellent example of how this could work). I find it frustrating when animating to switch between the dope sheet, action, NLA, and F-curve editors, and I certainly don't want lots of layouts for this…

Or I could have multiple tabs of the 3D view for each of the scenes in my blend file…

Support for the "OpenFX" plugin standard… it should make the Blender sequencer more attractive for plugin developers, and end users would get access to free, open-source, and commercial plugins!

Particles: halo textures could do with an overhaul! Instance objects/billboards should be able to have UVs after instancing…

Particle age should be able to drive other particle parameters (I know it can drive textures these days, but I mean more than that…), e.g. size, "random" motion, etc…

Spline/strand paths for dynamic particles: the current system (only just added) is "dab"-based.
Bring back "reactor" particles!

In general the "high end" of particle rendering is quite good (volumetrics, smoke…), but the "old school" cheap-and-dirty stuff could be much more usable.
(A HUGE improvement would be the UVs-after-instancing thing I mentioned earlier… but using movies (or sequences) on billboards would also be very desirable. The animated-texture UV-split method is fine for games, but not very friendly for FX work!)

I’ve been meaning to set aside some time to do a bit of a texture tool wishlist, but Michael W has posted some very good suggestions already.

I still do most of my texture painting in 2D regardless of what programs I have access to. This will probably change over time, I'm sure, but painting out UV seams in 3D is obviously a major advantage. I'd need to thrash out exactly what I can and can't do right now (I haven't used the Projection Paint tools much since Durian) to make my requests a bit more straightforward. However, loading two or more textures at the same time on a single mesh (i.e. head colour map, torso colour map, limbs colour map) and then being able to paint over the seams across those multiple images, ideally with a heal tool working like the clone tool, would just be fantastic. Currently I've been attaching all the textures together into one massive image just to paint seams out, but the responsiveness of that workflow gets quite poor quite quickly if your textures end up 8-12k wide…

Again, I'll try and look into the specifics of that soon to be a bit more helpful. :)

Make seams on left arm.
Unwrap left arm.
Delete right arm faces.
Select left arm faces.
Move the 3D cursor to the middle of the mesh. Choose 3D Cursor as the pivot point.
Duplicate left arm faces with Shift+D.
Mirror faces with Ctrl+M.
Select all. Remove Doubles.

Now you have a right arm with the same UV unwrapping, using the same texture pixels.
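The steps above boil down to: mirror the vertex positions across X about the pivot while leaving the UVs untouched, so both arms sample the same texels. A minimal numeric sketch of that idea, in plain Python rather than the bpy API (names are illustrative):

```python
def duplicate_and_mirror_x(verts, uvs, pivot_x=0.0):
    """Duplicate geometry mirrored across X about the pivot (the 3D
    cursor's x in the workflow above), copying the UVs verbatim so
    the mirrored half reuses the same texture pixels."""
    mirrored_verts = [(2.0 * pivot_x - x, y, z) for (x, y, z) in verts]
    mirrored_uvs = list(uvs)  # UVs unchanged: same texels as the original
    return verts + mirrored_verts, uvs + mirrored_uvs

left_verts = [(1.0, 0.5, 0.2), (2.0, 0.4, 0.1)]
left_uvs = [(0.25, 0.60), (0.30, 0.55)]
all_verts, all_uvs = duplicate_and_mirror_x(left_verts, left_uvs)
print(all_verts[2])  # -> (-1.0, 0.5, 0.2)
print(all_uvs[2] == left_uvs[0])  # -> True
```

The final Remove Doubles step then welds the duplicated shoulder-line vertices (those lying on the mirror plane) back together.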

I like most of the painting ideas here. Layers would be a good benefit as well.
I would love to use metaballs for particles…
I always want to sculpt across the seam of two joined planes, but because there is a seam it does not interpolate correctly, especially Smooth. I use these for multi-part terrain for games.

The UV unwrapping and texturing would be a great improvement for Blender. For texturing, ZBrush has got it spot-on in my opinion: you can import the image into the Spotlight application, rotate, scale, and warp it, even control alpha and more, before you even start painting textures or displacement with it. I personally think one of the biggest improvements Blender could make is to have everything editable in the node window. The modifier list is fine, but if extrusions, verts, and faces, along with particles, textures, and dynamics, were all interlinked in the one window, the power of Blender would be… I can't even think of a word that would encompass it. Plus, it would help those of us, me included, who suck at programming.

The number one thing for me would be an inbuilt camera tracker. Maybe libmv integration… This would be literally the single best thing that could be integrated into Blender.

+1 for sculpting and texture-painting improvements; an Unlimited Clay implementation, texture-painting performance, and layer improvements would be fantastic.

If I had to suggest an improvement… it would be the render engine; either that or an implementation of YafaRay.

You are so right. One thing to mention, though: you currently can use particle age to drive:

*Halo texture properties (through a texture, like a simple blend, for changing the color)
*Color of volumetrics (using a point density texture)
*A few material properties, like transparency when using the Explode modifier (through a UV texture)
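The common thread in those three cases is that a normalized particle age (0 at birth, 1 at death) is pushed through a simple ramp to produce the driven value. A rough sketch of that mapping, with illustrative names (this is not Blender's API):

```python
def age_ramp(age, lifetime, start, end):
    """Linearly interpolate a driven parameter (a color channel,
    alpha, size...) over a particle's normalized age, clamped to
    [0, 1] so dead particles hold the end value."""
    t = min(max(age / lifetime, 0.0), 1.0)
    return start + (end - start) * t

# e.g. a halo fading out over a 50-frame lifetime:
print(age_ramp(25.0, 50.0, start=1.0, end=0.0))  # -> 0.5
```

Driving size or "random" motion the same way, as requested above, would just mean routing this ramp into more targets than textures currently allow.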

You only have three months… may as well focus on one thing.
I say…



The rig-tweaking tools are essential…
and tracking integration… now that would make Blender an absolutely complete package…

Throughout the whole workflow, texturing feels the most clumsy; improving it would be great.

A big thing I think the GE could use is real-time decals (texture projection?).

Instance-object visualisation is a link to another object's data, like an instance-group empty. You can only modify the UVs of the original object.
You can make sticky coordinates or use a UV Project modifier with a Particle Instance modifier.
But UVs for a varying number of faces through time seem more difficult than UVs for the Array, Solidify, or Bevel modifiers.
I think that using other mapping coordinates probably solves most cases.

Strand/particle coordinates allow you to modify, through particle age: size, density, velocity, force fields, etc… all the sliders of the Influence panel of particle textures.

Jahka wanted to be able to manage particle animation as actions in the NLA.
Phonybone's branch should reimplement "reactor" particles.
I agree it will be a real lack for particle emission in the upcoming 2.57, but particle coordinates are better than nothing.
Maybe this todo could be done before the 2.6x changes to the compositor.
Maybe Jahka's plans are too complicated for a GSoC.