All in all: polish existing stuff, and don’t add too many fancy new features.
Display procedural textures
Stencil/buffer shadows from all lamps in the viewport, for better feedback on lighting setups.
Better GLSL feedback on materials - Cycles’ realtime preview is nice, but you can’t preview an animation in it, for instance.
Improve brush management
->Viewport -> GLSL shaders for better sculpting feedback, e.g. cavity shaders
BMesh polygons and quads handleable as triangles with turnable diagonals, while Blender keeps treating them as quads or polygons.
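To illustrate the idea: a quad always triangulates into two triangles, and “turning the diagonal” just changes which pair you get. A minimal sketch, with made-up names (this is not actual BMesh API code):

```python
# Hypothetical sketch: a quad (a, b, c, d) rendered as two triangles,
# with a flag choosing which diagonal splits it.

def quad_triangles(quad, flipped=False):
    """Return the two triangles for a quad, depending on the diagonal.

    quad: tuple of four vertex indices (a, b, c, d) in winding order.
    flipped=False -> diagonal a-c, flipped=True -> diagonal b-d.
    """
    a, b, c, d = quad
    if not flipped:
        return [(a, b, c), (a, c, d)]    # diagonal a-c
    return [(a, b, d), (b, c, d)]        # diagonal b-d

# Flipping the diagonal keeps the quad's vertices but changes which
# pair of triangles the editor works with internally.
print(quad_triangles((0, 1, 2, 3)))                # [(0, 1, 2), (0, 2, 3)]
print(quad_triangles((0, 1, 2, 3), flipped=True))  # [(0, 1, 3), (1, 2, 3)]
```

Blender could keep the quad as the editing unit while exposing this per-face flag for shading/deformation control.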
Knife, Cut, Bevel, Inset tools
OpenCL + Bullet integration into the viewport: no more baking for ages, and no more recording IPOs from the game engine, which has become a neglected stepchild anyway. Springs, gravity etc. for armatures.
OpenCL integration for smoke, cloth and liquid.
OpenCL integration for particles, and separation + fixing of dynamic hair.
Paint layers (like MikeW’s addon)
preview of a texture brush on the mesh and the ability to rotate it and set the center
Improve texture management. It’s horrible not to be able to load a texture as a brush directly in paint mode and discard it once you’re done.
Proper Lightmap baking
Well, manpower for Cycles to get it feature-complete and optimize performance.
“Liaison GSoCodie” to sit between BF + external renderer devs and work out an API URD.
A terrain modifier creating a triangulated hexagonal mesh, with a Voronoi biome/topology generation algorithm and Perlin edge displacement to create small islands, or the usual fractal methods plus heightmap painting in the viewport to create landscapes. Add a moisture function, erosion, add water, dry out, and most importantly an adaptive LOD controller where you can drive the LOD e.g. with the camera distance (-> replace geometry with pre-baked normal maps) and control the LOD for the texture maps.
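For the “usual fractal methods” part, a minimal sketch of what a heightmap generator boils down to: smoothed value noise summed over several octaves (fBm). All names and parameters here are invented for illustration; a real modifier would displace mesh vertices instead of returning floats.

```python
import random

# Minimal fractal heightmap sketch: tiling value noise + fBm octaves.

def make_lattice(size, seed=0):
    """Random value lattice the noise interpolates between."""
    rng = random.Random(seed)
    return [[rng.random() for _ in range(size)] for _ in range(size)]

def smoothstep(t):
    return t * t * (3.0 - 2.0 * t)

def value_noise(lattice, x, y):
    """Bilinearly interpolated, smoothed lattice noise (tiles at lattice size)."""
    size = len(lattice)
    x0, y0 = int(x) % size, int(y) % size
    x1, y1 = (x0 + 1) % size, (y0 + 1) % size
    tx, ty = smoothstep(x - int(x)), smoothstep(y - int(y))
    top = lattice[y0][x0] * (1 - tx) + lattice[y0][x1] * tx
    bot = lattice[y1][x0] * (1 - tx) + lattice[y1][x1] * tx
    return top * (1 - ty) + bot * ty

def fbm_height(lattice, x, y, octaves=4):
    """Fractal Brownian motion: sum octaves at doubling frequency, halving amplitude."""
    h, amp, freq, total = 0.0, 1.0, 1.0, 0.0
    for _ in range(octaves):
        h += amp * value_noise(lattice, x * freq, y * freq)
        total += amp
        amp *= 0.5
        freq *= 2.0
    return h / total  # normalised to [0, 1]
```

The Voronoi biomes and erosion passes would then operate on this heightfield before the LOD controller decides what to bake down.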
I think that’d be a very interesting one for students; I’ve looked into it myself already and can provide some links and papers.
Maybe someone’s interested in doing something like Photosculpt or Shadermap pro, to generate normal maps and/or geometry from image references.
Another thing I’ve looked into myself, but information is hard to find; it almost seems as if all the normal map generation algorithms are based on edge detection, and the rest are undocumented, made-up guesstimation algorithms of some sort.
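The documented core of that edge-detection family is simple: treat the image as a heightfield and derive per-pixel normals from finite differences. A sketch of just that step (real tools like Shadermap layer heuristics on top, which is presumably the undocumented part):

```python
import math

# Gradient-based normal map sketch: height differences -> tangent-space normals.

def height_to_normals(height, strength=1.0):
    """height: 2D list of floats in [0, 1]; returns per-pixel unit normals."""
    h, w = len(height), len(height[0])
    normals = []
    for y in range(h):
        row = []
        for x in range(w):
            # central differences, clamped at the image borders
            dx = (height[y][min(x + 1, w - 1)] - height[y][max(x - 1, 0)]) * strength
            dy = (height[min(y + 1, h - 1)][x] - height[max(y - 1, 0)][x]) * strength
            nx, ny, nz = -dx, -dy, 1.0
            length = math.sqrt(nx * nx + ny * ny + nz * nz)
            row.append((nx / length, ny / length, nz / length))
        normals.append(row)
    return normals
```

A flat heightfield yields straight-up normals (0, 0, 1); slopes tilt the normal against the gradient. The hard, under-documented problem is getting a plausible heightfield out of an arbitrary photo in the first place.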
Someone finish/improve the network render, also incorporating distributed multi-GPU Cycles and distributed tile rendering in the originally planned fashion, so that F12 renders are distributed across the slaves in tiles, and the master you’re working on reports once the rendering is ready and pops it up on confirmation.
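The tile-distribution part of that is conceptually small; a sketch with invented names (a real scheduler would hand out tiles dynamically as slaves finish, not up front):

```python
# Distributed tile rendering sketch: split the frame, assign tiles round-robin.

def make_tiles(width, height, tile_size):
    """Split a frame into (x, y, w, h) tiles, clipping at the borders."""
    tiles = []
    for y in range(0, height, tile_size):
        for x in range(0, width, tile_size):
            tiles.append((x, y, min(tile_size, width - x), min(tile_size, height - y)))
    return tiles

def assign_tiles(tiles, slaves):
    """Naive static round-robin assignment of tiles to slave names."""
    jobs = {name: [] for name in slaves}
    for i, tile in enumerate(tiles):
        jobs[slaves[i % len(slaves)]].append(tile)
    return jobs

tiles = make_tiles(1920, 1080, 512)  # 4 x 3 = 12 tiles for a 1080p frame
jobs = assign_tiles(tiles, ["slave-a", "slave-b", "slave-c"])
```

The master would then collect finished tiles, composite them into the F12 result, and notify the artist.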
Something like Allegorithmic’s Substances, where you can store predefined procedural materials (soil, for instance), have sliders to add water or snow, and save out tileable diffuse, normal, displacement, specular and occlusion maps.
The material node system is already there to achieve it.
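To make the slider idea concrete, here is a toy sketch of a “wetness” parameter driving a material: the formulae and names are made up to illustrate the concept, not taken from Allegorithmic, and in Blender this would be a node group exposing the slider as an input.

```python
# Substance-style slider sketch: one parameter drives several map outputs.

def apply_wetness(diffuse, wetness):
    """diffuse: (r, g, b) in [0, 1]; wetness in [0, 1].
    Returns (new_diffuse, specular_intensity)."""
    darken = 1.0 - 0.5 * wetness              # wet surfaces look darker...
    new_diffuse = tuple(c * darken for c in diffuse)
    specular = 0.1 + 0.8 * wetness            # ...and shinier
    return new_diffuse, specular

dry_diffuse, dry_spec = apply_wetness((0.4, 0.3, 0.2), 0.0)
wet_diffuse, wet_spec = apply_wetness((0.4, 0.3, 0.2), 1.0)
```

Baking the outputs at a range of slider values would then give the tileable map sets mentioned above.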
A server system, maybe SQL-based, to set up a local service/daemon handling a library for models, materials and textures.
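For a purely local daemon, SQLite (bundled with Python as `sqlite3`) might be enough; the schema below is invented for illustration:

```python
import sqlite3

# Sketch of a local asset-library store; a real daemon would use a file
# on disk and sit behind a small RPC or socket API.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE assets (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        kind  TEXT CHECK (kind IN ('model', 'material', 'texture')),
        path  TEXT NOT NULL,
        tags  TEXT
    )""")
conn.execute("INSERT INTO assets (name, kind, path, tags) VALUES (?, ?, ?, ?)",
             ("old_oak", "model", "/library/models/old_oak.blend", "tree,outdoor"))
rows = conn.execute("SELECT name FROM assets WHERE kind = 'model'").fetchall()
```

Blender would then query the daemon by kind/tags and append-link the `path` it gets back.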
Node-based drivers