  1. #1
    Member BlenderEi's Avatar
    Join Date
    Jan 2011
    Location
    Nuremberg, Germany
    Posts
    136

    Point Cloud support in Blender: What's already there and what does the community need

    Hello Blenderheads,

    At the moment I am doing some research on how well Blender handles point clouds.

    What are Point Clouds?
    Well, simply put: many (I mean really a LOT, like millions of) points in 3D space that can also carry color information. They can be gathered via laser scanning, photogrammetry (like Blender's motion tracking), depth cameras (e.g. the Kinect) and other methods.
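
    For illustration (my own generic example, not any specific scanner's output), a plain ASCII point cloud file such as .xyz / .xyzrgb is usually just one point per line: three coordinates followed by optional color values (whether the colors are 0-1 floats or 0-255 integers varies between exporters):

    Code:
    -1.204  3.551  0.872  0.42  0.40  0.37
    -1.201  3.549  0.874  0.44  0.41  0.36
    -1.198  3.552  0.871  0.43  0.39  0.35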

    How well is this integrated in other software?
    I recently saw this great video from another piece of software. Unfortunately it is a competitor of Blender. Take a look at its realtime visualization of point clouds and the workflow (for comparison with ours):




    How well is it integrated in Blender?
    Our Blender developers are awesome. They have implemented (and still are implementing) several algorithms to speed up display in the 3D viewport (most recently with OpenSubdiv, afaik). As far as my latest research goes, it seems that Blender has serious problems displaying very dense geometry - please correct me if I am wrong about anything. Furthermore, it is not possible to display vertices with color information. I've prepared a simple example by painting Suzanne with vertex colors and trying to display the colored point cloud, without any success.
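
    For reference, here is roughly how that per-vertex color data can be pulled out with Blender's Python API. This is only a sketch - the object name, output path and the 2.7x-era RGB color layout are assumptions on my part:

    Code:
    # Rough sketch (Blender 2.7x Python API, object name and output path are
    # made up): dump Suzanne's vertices plus averaged vertex colors to a plain
    # "x y z r g b" text file. Vertex colors are stored per face loop, so they
    # are averaged back to per-vertex values here.
    import bpy
    from collections import defaultdict

    mesh = bpy.data.objects["Suzanne"].data
    colors = mesh.vertex_colors.active.data   # one color entry per face loop

    acc = defaultdict(lambda: [0.0, 0.0, 0.0, 0])  # r, g, b sums + sample count
    for loop in mesh.loops:
        col = colors[loop.index].color
        a = acc[loop.vertex_index]
        a[0] += col[0]; a[1] += col[1]; a[2] += col[2]; a[3] += 1

    with open("/tmp/suzanne_points.xyzrgb", "w") as f:
        for v in mesh.vertices:
            r, g, b, count = acc[v.index]
            count = max(count, 1)
            f.write("%f %f %f %f %f %f\n" % (v.co.x, v.co.y, v.co.z,
                                             r / count, g / count, b / count))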

    Blender_VertexColors.jpg

    The vertex colors neither display in the viewport nor render as halos with the texture option applied (voxel data). Other users on this forum have experienced similar problems and didn't get a solution yet:


    Because I am trying to find out whether this is behaviour the user base expects or something we can improve upon, I'd like to ask all of you:

    What needs to be done?
    -Would you like to see generally improved point cloud support in Blender (workflow, speed, display)?
    -Are you happy with the current situation?
    -What areas should be improved (importers, tools, etc.)?
    -What would you personally like to have implemented (certain feature, performance, etc.)?
    -Would you like to see Blender Development documented via (video) tutorials?

    The reason I am asking these questions is that I really would like to give something back to Blender and its community. Since I am about to start my bachelor thesis, it'd be great if I could use a Blender-specific topic for it. I am also planning on documenting everything quite accurately, so that Blender development becomes easier for beginners through a real-world example of how to add a feature to a really great graphics suite, thus making Blender even more awesome.

    I am happy about every single comment (unless it is destructive) on this topic!

    Looking forward, cheers!
    WATCH OUT: A German has posted that!!! Spelling mistakes included for free! Thanks for reading!
    BlenderEi provides a GERMAN VIDEOTUTORIAL SERIES for Blender 2.5+!
    You can watch it here, on Vimeo or on Youtube.



  2. #2
    Member Spirou4D's Avatar
    Join Date
    Jul 2010
    Location
    Lille, France
    Posts
    1,286
    Hi BlenderEi,
    You have a great idea, and did you know about these:

    By Danyang Yi @lias "nirenyang"
    Import Leica Cyclone PTX
    https://developer.blender.org/T40178

    By "hans_p_g_cg"
    Pointcloudskinner add-on:
    http://sourceforge.net/projects/pointcloudskin/
    LINK
    http://hanspg.web.fc2.com/Pages/csv_...udSkinner.html

    By "Re.je"
    http://re.je/notes/2014/rendering-po...ds-in-blender/

    By Aurel Wildfellner:
    Import-Export PCD

    I'm very interested in your proposal.
    Good luck.
    Spirou4D
    Last edited by Spirou4D; 23-Sep-14 at 07:08.
    Words are Images that loosed Light!
    Linux Mint 18.2 Saunya - Bi-Quad Intel 2.66Ghz x64Bytes - Nvidia OpenGL GT 630 4Go-vram



  3. #3
    It would be great if Blender had some useful support for point clouds. The fundamental problem currently is that none of Blender's data structures are appropriate for storing point cloud data. The long awaited particle system refactor would go a long way to addressing that, but I haven't heard anything about that project in a long time so hopefully it's still progressing. Until Blender has an appropriate data structure I don't know that it is very useful to try and work on things like tools or importers because their utility will be severely limited for any production scale problems.

    These issues were discussed a bit in this thread from last spring:

    http://blenderartists.org/forum/show...-Color-support



  4. #4
    Member 3d solar system builder's Avatar
    Join Date
    Mar 2012
    Location
    texas
    Posts
    1,902
    Develop the particle system refactor. Then point clouds for the game engine. I would love to see that.



  5. #5
    Member BlenderEi's Avatar
    Join Date
    Jan 2011
    Location
    Nuremberg, Germany
    Posts
    136
    Hey guys,

    thank you for your quick response.

    Originally Posted by Spirou4D View Post
    Hi BlenderEi,
    You have a great idea, and did you know about these:

    [...]

    I'm very interested in your proposal.
    Good luck.
    Spirou4D
    I knew a few of those links already, but some were new to me. It pretty much tells me that there are workarounds and some tools, but still no good solution that handles this problem properly. Thank you for pointing them out.

    Originally Posted by jedfrechette View Post
    [...] The long awaited particle system refactor would go a long way to addressing that, but I haven't heard anything about that project in a long time so hopefully it's still progressing. [...]
    Ah, I think you mean the particle nodes by Lukas Tönne, right?
    http://phonybone.planetblender.org/
    I also donated to this project some years ago. I hadn't thought of using the particle system for my kind of scenario, but it makes sense. I think when I start developing the point cloud visualization, I'll contact him and ask for some information about his work with regard to the future use of point clouds. He might also be willing to give a short interview about his work, so that I can report on the status on my blog (which I will announce here as soon as I start working).

    Originally Posted by jedfrechette View Post
    [...] Until Blender has an appropriate data structure I don't know that it is very useful to try and work on things like tools or importers because their utility will be severely limited for any production scale problems. [...]
    I totally agree with you! Of course the basics need to be done right in the first place, before thinking about any importers and tools.

    Originally Posted by 3d solar system builder View Post
    Develop the particle system refactor. Then point clouds for the game engine. I would love to see that.
    Great, so far I can see that the Blender community would really love to see this feature implemented. It seems this would benefit Blender quite a bit.
    Now let me investigate this a little more. Once I officially decide to start the work, I'll keep you informed with articles and videos.

    Still, don't hesitate to add your opinion while I continue doing my research.

    Cheers
    WATCH OUT: A German has posted that!!! Spelling mistakes included for free! Thanks for reading!
    BlenderEi provides a GERMAN VIDEOTUTORIAL SERIES for Blender 2.5+!
    You can watch it here, on Vimeo or on Youtube.



  6. #6
    Member
    Join Date
    Sep 2012
    Posts
    3,230
    Would it then be possible to import a point cloud, do re-topo and transfer point colors to UVs? Someday in the future?
    Or even make a point cloud from video/image sequence tracking?
    Thanks for all.



  7. #7
    Member SterlingRoth's Avatar
    Join Date
    Mar 2006
    Location
    Portland, OR
    Posts
    2,162
    Originally Posted by burnin View Post
    Would it then be possible to import a point cloud, do re-topo and transfer point colors to UVs? Someday in the future?
    Or even make a point cloud from video/image sequence tracking?
    Thanks for all.
    Look into VisualSFM and MeshLab for this. VisualSFM can generate a point cloud from an image sequence, and MeshLab can skin it and transfer the color data onto a usable mesh.



  8. #8
    Member BluePrintRandom's Avatar
    Join Date
    Jul 2008
    Location
    NoCal Usa
    Posts
    18,816
    Google Tango has these in its systems, but I don't know whether it's open source or not.
    Break it and remake it - Wrectified
    If you cut off a head, the hydra grows back two.
    "headless upbge"



  9. #9
    Member BlenderEi's Avatar
    Join Date
    Jan 2011
    Location
    Nuremberg, Germany
    Posts
    136
    Hi there,

    Originally Posted by BluePrintRandom View Post
    Google Tango has these in its systems, but I don't know whether it's open source or not.
    Well since "Project Tango" is more a device than a software, they reveal their API for you to use with C++, Java and the Unity Game Engine:
    Originally Posted by https://www.google.com/atap/projecttango/#devices
    They run Android and include development APIs to provide position, orientation, and depth data to standard Android applications written in Java, C/C++, as well as the Unity Game Engine. These early prototypes, algorithms, and APIs are still in active development. So, these experimental devices are intended only for the adventurous and are not a final shipping product.
    The moment Blender becomes available on mobile devices, this would certainly be a nice addition. But it's nothing for now, in my view.
    Originally Posted by burnin View Post
    Would it then be possible to import a point cloud, do re-topo and transfer point colors to UVs? Someday in the future?
    That's exactly why I started this thread and asked you, as the Blender community, what you would like to have implemented. In the end I want to develop something the user base really needs! If this is what you want, I'll take it into account.

    Of course, from my view it would make sense to be able to import any point cloud data (which requires proper data types in Blender), do re-topo (which might require selecting individual vertices - something particles do not provide, imho) and transfer point colors to UVs (which looks like it needs some kind of baking from point clouds to meshes).
    At the moment VisualSFM is definitely the way to go! You could also try it with the built-in motion tracker (see Sebastian's tutorial: http://vimeo.com/87658924) for simple reconstructions, although that wouldn't give you the vertex colors. But in combination this could be the start of a motion-tracking-based 3D reconstruction feature which creates those special point cloud data types.
    I don't know - the Blender community has to tell its developers whether they need the feature or not!

    Thank you for further opinions about the topic.
    Stay tuned and continue the discussion.
    WATCH OUT: A German has posted that!!! Spelling mistakes included for free! Thanks for reading!
    BlenderEi provides a GERMAN VIDEOTUTORIAL SERIES for Blender 2.5+!
    You can watch it here, on Vimeo or on Youtube.



  10. #10
    Member
    Join Date
    Jan 2012
    Posts
    1,003
    Hi BlenderEi,

    It would be really helpful to get point cloud support. You are a programmer and you want to implement it, if I'm correct? Being able to render those points in Cycles would be nice then.



  11. #11
    Euclideon is a company that has been developing software so optimized that you can run nearly photo-realistic graphics in realtime. They are scanning entire environments, then importing the point cloud into their software. The software uses a search algorithm to figure out which points to render on the screen instead of rendering all of them, and the points stream from the hard drive or the internet. This means that machines with even slow dual-core processors and graphics cards with 512 MB of RAM can run their software. It's incredible. Check it out for yourself: https://www.youtube.com/user/EuclideonOfficial

    I would love to see something like this implemented into Blender, but in my opinion it would require so much programming that even if Blender Foundation decided to do it, it would take years to implement it. Euclideon has been actively coding this since 2011.
    My rig:

    Asus Z-87 PLUS Motherboard | 16 GB Ripjaws | Zotac GTX 980 4GB | Intel Core i7 4790K with Corsair H100i



  12. #12
    Originally Posted by BlenderEi View Post
    That's exactly why I started this thread and asked you, as the Blender community, what you would like to have implemented. In the end I want to develop something the user base really needs!
    Rather than look specifically at what the Blender Community needs/wants, it might be better to look at how the industry as a whole is using this sort of data and what they want. I don't say this to be dismissive, simply to acknowledge that the lack of any existing ability to work with this type of data, coupled with Blender generally not being part of the pipelines where the data is used, means that the community probably doesn't have much experience with it. My response below, therefore, is written from the perspective of someone who does lidar scanning professionally for both feature film VFX and a variety of other applications rather than as a Blender user.

    There are 3 main areas where point cloud functionality could overlap with Blender's existing capabilities.

    1. Visualization

    This is primarily what the 3DS Max video in the OP showed: essentially, rendering stills and animations that include point clouds as a component of the scene. Although people have done a few experiments, e.g. [1], this is generally not done at all in the film industry. Surveying and engineering companies, however, do this a lot, for example [2] and countless other similar videos on YouTube. In many cases they are doing these visualizations in specialized software like Arena 4D, Pointools, and the various native point cloud packages. This is also functionality that has been requested in open source point cloud software [3]. Being able to do this in Blender would offer much of the flexibility provided by the Maya+Partio+Arnold combination Luma demoed, which can be a huge advantage over the standalone applications.

    2. Camera Tracking

    This is probably the most common way lidar point clouds are used for VFX. If you have a survey of your set you can assign known 3D coordinates to the 2D tracking markers on your footage and use them to constrain the tracking solution. Adding additional constraints to the solution allows you to get a better solve more quickly. Using lidar to do the set survey is just an extension of the way sets have been surveyed for years using tape measures and levels, total stations etc. In practice, lidar data is often meshed before being used in this way, however, being able to work with point clouds directly in the tracking software offers a lot of potential efficiency advantages. Unfortunately, Blender's camera tracker doesn't support survey data of any kind.

    3. Set Reconstruction/Extension

    This is closely linked to 2: once you have the camera track locked you can start rebuilding the set and extending it as necessary from your survey data. Again, this is often done using meshed lidar data. For simple geometry it might be desirable to do the modeling directly from the point cloud; however, you would still need to mesh the point cloud if you want to be able to bake a displacement map from the high-res geometry. For more complex organic shapes, such as rubble piles, using a traditional retopo workflow on a mesh is probably more straightforward than trying to reimplement that workflow using a point cloud. Of course, the other big roadblock here is Blender's viewport performance.


    If I were picking a project, I would probably pursue either camera tracking survey support (independent of point cloud support) or point cloud visualization (Point 1). Both of these would be very useful to have; however, they would be useful to different groups of users. Adding the ability for the camera tracker to use survey data would be an extremely useful addition to the Blender VFX toolkit, even for users who might not have lidar scans of their sets. In contrast, adding point cloud visualization would primarily be useful for the aforementioned surveyors, architects, engineers, etc. who might not be using Blender at all right now. Visualization is also something that has the potential to make significant use of a reworked particle system, if that ever makes it into trunk, so it might be better to wait on it.

    If you need more specific advice on the lidar side of things, sample data, etc. I would be happy to help as much as I can.

    Originally Posted by brandx96 View Post
    I would love to see something like this [Euclideon tech] implemented into Blender, but in my opinion it would require so much programming that even if Blender Foundation decided to do it, it would take years to implement it. Euclideon has been actively coding this since 2011.
    I think you're giving Euclideon too much credit. What they've been showing recently doesn't seem to have fundamentally advanced since the first demo they showed 3 years ago. The main difference seems to be a change in their business model to try and target the geospatial industry more. I think that's smart: they have a nice renderer that works well with real-world point clouds if you can accept its limitations, i.e. no dynamic lighting, no deformation, etc. It's not nearly as special as their hype would have you believe, though. I have no doubt a smart developer could write something very similar for Blender without an unreasonable amount of effort. Even without any clever indexing tricks it is pretty trivial to write a naive OpenGL point renderer that easily handles hundreds of millions of points on modern hardware, and people have been writing sparse voxel octree renderers similar to what Euclideon has been showing for decades.
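
    To give a sense of how little the naive approach requires, here is a rough standalone sketch - PyOpenGL with GLUT and numpy, not Blender code, and the point count, random data, window size and camera values are purely illustrative:

    Code:
    # Naive fixed-function OpenGL point renderer: upload one interleaved
    # position+color VBO and draw everything with a single GL_POINTS call.
    import ctypes
    import numpy as np
    from OpenGL.GL import *
    from OpenGL.GLU import gluPerspective, gluLookAt
    from OpenGL.GLUT import *

    N = 2000000  # stand-in for a real scan
    points = np.random.rand(N, 6).astype(np.float32)  # x y z r g b, interleaved

    def init_vbo():
        vbo = glGenBuffers(1)
        glBindBuffer(GL_ARRAY_BUFFER, vbo)
        glBufferData(GL_ARRAY_BUFFER, points.nbytes, points, GL_STATIC_DRAW)
        stride = 6 * 4  # six float32 values per point
        glEnableClientState(GL_VERTEX_ARRAY)
        glVertexPointer(3, GL_FLOAT, stride, ctypes.c_void_p(0))
        glEnableClientState(GL_COLOR_ARRAY)
        glColorPointer(3, GL_FLOAT, stride, ctypes.c_void_p(3 * 4))

    def draw():
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
        glMatrixMode(GL_PROJECTION); glLoadIdentity()
        gluPerspective(60.0, 800.0 / 600.0, 0.01, 100.0)
        glMatrixMode(GL_MODELVIEW); glLoadIdentity()
        gluLookAt(1.5, 1.5, 1.5, 0.5, 0.5, 0.5, 0.0, 0.0, 1.0)
        glDrawArrays(GL_POINTS, 0, N)  # all points in one call
        glutSwapBuffers()

    glutInit()
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH)
    glutInitWindowSize(800, 600)
    glutCreateWindow(b"naive point renderer")
    glEnable(GL_DEPTH_TEST)
    init_vbo()
    glutDisplayFunc(draw)
    glutMainLoop()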


    [1] http://blog.faro.com/2013/07/25/luma...laser-scanner/
    [2] http://youtu.be/0V2ZYKIwCXc
    [3] http://www.danielgm.net/cc/forum/vie...php?f=14&t=513



  13. #13
    Member
    Join Date
    Jan 2012
    Posts
    1,003
    I didn't see your proposal of recording your coding. It's a great idea, I think; maybe some Blender devs could do it during Gooseberry. Having tutorials for both artists and coders who want to contribute would be great. So please do it!



  14. #14
    Member BlenderEi's Avatar
    Join Date
    Jan 2011
    Location
    Nuremberg, Germany
    Posts
    136
    Hi all,

    sorry for my late answer - I was busy with the project specification and still am.
    Great to get so much feedback from all of you, thanks!

    Originally Posted by matali View Post
    It would be really helpful to get point cloud support. You are a programmer and you want to implement it, if I'm correct? Being able to render those points in Cycles would be nice then.
    Yes, I am a programmer (though not a highly experienced Blender developer - yet!) and I do want to implement better support for point clouds. Nevertheless, rendering them in Cycles is more a topic for the Cycles module developers. Still, of course, I have a big interest in providing the necessary data structures so that they can be used in Cycles.

    Originally Posted by matali View Post
    I didn't see your proposal of recording your coding. It's a great idea, I think; maybe some Blender devs could do it during Gooseberry. Having tutorials for both artists and coders who want to contribute would be great. So please do it!
    I will do my best!
    Yeah, it would be great for Project Gooseberry as well! I commented on the blog post by Elysia (the new Gooseberry reporter: http://gooseberry.blender.org/introd...erry-reporter/) to ask her whether she could provide some insight into Blender development from within the Blender Institute. Maybe there will be some good material there, too.

    Originally Posted by brandx96 View Post
    Euclideon is a company that has been developing software so optimized that you can run nearly photo-realistic graphics in realtime. [...]

    I would love to see something like this implemented into Blender, but in my opinion it would require so much programming that even if Blender Foundation decided to do it, it would take years to implement it. Euclideon has been actively coding this since 2011.
    I have been following the development of Euclideon since the beginning and find it interesting. Although it is not on my radar right now, the basic functionality I plan to implement in Blender would - ideally - be a prerequisite for such features. At least that would be my goal.

    Originally Posted by jedfrechette View Post
    Rather than look specifically at what the Blender Community needs/wants, it might be better to look at how the industry as a whole is using this sort of data and what they want. [...]
    I totally agree with you! This is the reason why I was looking at Autodesk, for example, and what they have implemented already. I will continue researching available solutions, but would still love to take wishes from the Blender community into account to increase the value for the community.

    Originally Posted by jedfrechette View Post
    If I were picking a project, I would probably pursue either camera tracking survey support (independent of point cloud support) or point cloud visualization (Point 1). Both of these would be very useful to have; however, they would be useful to different groups of users. Adding the ability for the camera tracker to use survey data would be an extremely useful addition to the Blender VFX toolkit, even for users who might not have lidar scans of their sets. In contrast, adding point cloud visualization would primarily be useful for the aforementioned surveyors, architects, engineers, etc. who might not be using Blender at all right now. Visualization is also something that has the potential to make significant use of a reworked particle system, if that ever makes it into trunk, so it might be better to wait on it.
    Great - you have some interesting points about the usage of the data. All of those fields are interesting to look into. Due to the constraints of my research question (which is of a historic/engineering nature), I will focus my work on the visualisation task. I've already contacted Lukas Tönne (who was developing particle nodes). Unfortunately he hasn't written back yet, and it's a pity that I couldn't make it to the Blender Conference last weekend. That would definitely have narrowed down my future development roadmap to some specific topics, and I would have met Lukas in person. Now I have to cope with that somehow. Of course I will (have to) find a way to do so!

    Originally Posted by jedfrechette View Post
    If you need more specific advice on the lidar side of things, sample data, etc. I would be happy to help as much as I can.
    Thank you for your great offer. Using lidar data is exactly the use case I will be faced with while doing my thesis. For that I'll have a lidar scanning device at my disposal, so I'll be able to gather my own test data. It's awesome that you are available for specific questions about the scanning process in general; I might get back to you when problems arise. As I understand your post, you are using point clouds for VFX rather than for visualisation. I think if I could provide a good visualisation of point clouds in the Blender viewport, it might be an aid for VFX as well. But sure, registering those two types of 3D point clouds (from tracking and from lidar scans) would be great (and there are some papers describing how to do that).

    So, at the time of writing, my blog skeleton is already set up on my webserver, but I won't publish it here until I have something to show (rather than just some "lorem ipsum" placeholder text). After watching some live streams from the Blender Conference 2014, I am also thinking about starting to tinker with the newly open-sourced "OpenVDB", which is used in Houdini and RealFlow for simulation tasks. If it has proper point cloud support, it might be a candidate for inclusion in Blender, getting the basics done not only for performant point clouds but also for a better simulation system in Blender. But this is something I'll have to find out through research first.

    Now back to some more specification / research work...
    WATCH OUT: A German has posted that!!! Spelling mistakes included for free! Thanks for reading!
    BlenderEi provides a GERMAN VIDEOTUTORIAL SERIES for Blender 2.5+!
    You can watch it here, on Vimeo or on Youtube.



  15. #15
    Originally Posted by BlenderEi View Post
    I will focus my work on the visualisation task.
    Sounds like a good plan. I think even having just basic point cloud rendering in Blender would be really useful for a lot of people. Just having access to the existing tools for animation and camera controls would be a huge win over many of the other offerings out there. I hope you're able to get some feedback from the core team.

    Originally Posted by BlenderEi View Post
    I am also thinking about starting to tinker with the newly open-sourced "OpenVDB", which is used in Houdini and RealFlow for simulation tasks. If it has proper point cloud support
    I don't know much about it, but one of the big new features in the upcoming 3.0 release is support for large particle data sets. I think the 3.0 beta was released within the last couple of weeks, so it's definitely worth a look.



  16. #16
    Member BlenderEi's Avatar
    Join Date
    Jan 2011
    Location
    Nuremberg, Germany
    Posts
    136
    Hi there,

    Originally Posted by jedfrechette View Post
    Sounds like a good plan. I think even having just basic point cloud rendering in Blender would be really useful for a lot of people. Just having access to the existing tools for animation and camera controls would be a huge win over many of the other offerings out there. I hope you're able to get some feedback from the core team.
    Hmm, still no answer from Lukas Tönne (regarding the particle approach). Of course, it is quite risky to jump into core Blender development without any prior knowledge and support, so it turns out that I'll choose a wiser approach, imho, which I will describe below.

    Originally Posted by jedfrechette View Post
    I don't know much about it, but one of the big new features in the upcoming 3.0 release is support for large particle data sets. I think the 3.0 beta was released within the last couple of weeks, so it's definitely worth a look.
    Over the last weeks my research has continued and the title of my bachelor thesis happened to change a bit. It is now "Visualization of laser scanner point clouds as 3D panoramas". So the basics remain the same, but it is getting a bit more math-heavy. That's why I will attempt to write some sort of "converter" from point cloud formats to a meshed Blender format. For me personally, this is like a little puzzle piece that can be tested thoroughly and - upon success - eventually integrated into Blender (I will put a strong focus on that during my research).

    My current plan is something like the following (a rough sketch of the import/export bookends follows the list):

    -Import the raw laser data into proprietary software to do the registration and export to a point cloud format (preferably one that is better documented than .xyz ...)
    -Import that point cloud format into my software and store the data internally via OpenVDB
    -Do the maths behind the meshing, while drawing the progress interactively with OpenGL
    -Create the meshed surface and textures
    -Export into a format that Blender can read (I hope I will get to write a custom .blend file, so that the import into Blender doesn't suffer performance-wise from going through Python)
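
    To make the first and last steps a bit more concrete, here is a rough sketch that skips the OpenVDB/meshing part in the middle entirely: it reads a whitespace-separated point file into numpy and writes it back out as an ASCII PLY, which MeshLab (and, as far as I can tell, Blender's PLY importer) can open. File names are placeholders, and I'm assuming the colors are stored as 0-1 floats:

    Code:
    # Rough sketch of the converter's import/export bookends only.
    import numpy as np

    data = np.loadtxt("scan.xyzrgb")  # columns: x y z [r g b]
    xyz = data[:, :3]
    rgb = (data[:, 3:6] * 255.0).astype(np.uint8) if data.shape[1] >= 6 else None

    with open("scan_points.ply", "w") as ply:
        ply.write("ply\nformat ascii 1.0\n")
        ply.write("element vertex %d\n" % len(xyz))
        ply.write("property float x\nproperty float y\nproperty float z\n")
        if rgb is not None:
            ply.write("property uchar red\nproperty uchar green\nproperty uchar blue\n")
        ply.write("end_header\n")
        for i in range(len(xyz)):
            line = "%f %f %f" % tuple(xyz[i])
            if rgb is not None:
                line += " %d %d %d" % tuple(rgb[i])
            ply.write(line + "\n")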

    After that I will do a use case in which I'll reconstruct a historic building facade with Blender to test this approach, which might take up the other 50% of my work.

    Furthermore, there is still the plan to publish videos during development, and regarding the Blender side of things I hope they will be of great use to my beloved Blender community.

    I will officially start my thesis in January and document my progress on this website:
    http://bachelor.kalisz.co/

    That's it for this small update. Critique, suggestions and comments are highly appreciated!

    Best regards,

    Adam
    WATCH OUT: A German has posted that!!! Spelling mistakes included for free! Thanks for reading!
    BlenderEi provides a GERMAN VIDEOTUTORIAL SERIES for Blender 2.5+!
    You can watch it here, on Vimeo or on Youtube.



  17. #17
    Member BluePrintRandom's Avatar
    Join Date
    Jul 2008
    Location
    NoCal Usa
    Posts
    18,816
    Yay!

    Happy bump!
    Break it and remake it - Wrectified
    If you cut off a head, the hydra grows back two.
    "headless upbge"



  18. #18
    Member
    Join Date
    Jan 2012
    Posts
    1,003
    Maybe you could have a look at the OpenVDB implementation from the 2013 GSoC: http://wiki.blender.org/index.php/User:Jehuty/GSoC_2013 OpenVDB can convert scanned point clouds into meshes or voxel octrees. I use it in Houdini to visualise 3D scans; it's insanely fast and memory efficient.



  19. #19
    Member
    Join Date
    Dec 2012
    Location
    Finland
    Posts
    123
    I completely missed this thread on the first round. Some highly interesting stuff here.

    I'm developing a comprehensive node programming system (Flakes, unrelated to Lukas Tönne's work, though I've had some correspondence with him in the past), and I'm facing some of the same issues. Handling large datasets in Python isn't actually that big a deal with the right libraries, but there's no way to communicate with Blender's core systems without a custom build. I'm curious: are you aiming to develop something that integrates directly into the Blender code, or some manner of extension? It appears to be quite challenging to get anything accepted into the core these days, and extensions are crippled by this communication barrier.

    If you go with the extension route, I'm hoping to pick your brain a bit on this communication issue. Currently my simplest idea would require some (possibly quite minor) changes to core Blender, and I've been secretly planning a proposal which would enable a wide variety of high-performance third-party extensions to work with Blender with minimal effort. This approach likely won't work with the likes of OpenVDB, so if that's the way you are going, it's safe to just ignore me. Unless of course you have something much better in mind.

    Ideally I would build a numpy-compatible data interface to all the "large data" systems like meshes, particles, maps and lattices. This could be a lazy structure that merely represents existing data differently, i.e. writing a value to one element of the particle location array writes it directly to a particle location. This way it does not consume extra memory and won't need to be enabled explicitly when it's needed. Similarly, updating the entire particle system would involve only passing the array pointer to Blender.
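
    For comparison, the closest thing the current Python API offers is the foreach_get/foreach_set pair, which bulk-copies (rather than lazily views) attribute data between Blender and a flat buffer; the object name below is just an example:

    Code:
    # Bulk-copy mesh vertex coordinates into a numpy array and back.
    import bpy
    import numpy as np

    mesh = bpy.data.objects["PointCloudMesh"].data
    n = len(mesh.vertices)

    coords = np.empty(n * 3, dtype=np.float32)
    mesh.vertices.foreach_get("co", coords)   # bulk read into the flat buffer
    coords = coords.reshape(n, 3)

    coords[:, 2] += 1.0                       # e.g. lift every point by one unit

    mesh.vertices.foreach_set("co", coords.ravel())  # bulk write back
    mesh.update()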

    All low-level structures should be as simple as possible. If we have e.g. a list of 3D points with location, color, time and ID information, we'd split it into 4 different buffers, one for each property. Flakes uses this approach because it makes passing data around convenient, keeps all computations uniform (and hence parallel processing easy) and avoids data alignment issues (also good for parallelism). It's also easy to pass this kind of data to other Blender extensions.
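
    A minimal numpy sketch of that structure-of-arrays layout, with property names, dtypes and the point count chosen purely for illustration:

    Code:
    # One contiguous buffer per property instead of an array of records.
    import numpy as np

    n = 1000000
    locations = np.zeros((n, 3), dtype=np.float32)  # x, y, z per point
    colors    = np.zeros((n, 3), dtype=np.uint8)    # r, g, b per point
    times     = np.zeros(n, dtype=np.float32)       # acquisition time per point
    ids       = np.arange(n, dtype=np.uint32)       # stable point IDs

    # Each buffer can be handed independently to Cython, pyopencl or OpenGL;
    # e.g. normalizing the colors touches only the color buffer:
    colors_f = colors.astype(np.float32) / 255.0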

    Does this make any sense at all? Would this, or something like it, work for your purposes? Any other ideas? It would be nice to get some third-partiers to collaborate on Blender's extensibility issues.

    Some convenience tech that I recommend:

    Numpy
    -fairly high performance directly within python
    -high interoperability between python modules (well established; kind of lingua franca)
    -convenient conversion to python types where needed
    -works directly with Cython
    -works directly with pyopencl
    -works directly with openVDB

    Cython
    -high performance computation on CPU
    -fairly convenient interoperability with raw C-code
    -low level numpy buffer manipulation
    -works in a semi-JIT-like manner with Blender (modules build very quickly and can be loaded at runtime)

    Pyopencl
    -Python interface to OpenCL (mostly for host code)
    -orders of magnitude easier to use than equivalent C libraries

    Flakes (sometime later)
    -superbly convenient data manipulation
    -effortless conversion between numpy and openCL buffers and easy GPU computation (possibly also openGL in the future). This bypasses the blender process entirely (fortunately and unfortunately).



  20. #20
    Support for the molecular script, and a good mesher... It would be nice to have particle emitter interaction too.


