Free 3D Photogrammetry with macOS Monterey on M1 + Intel chips

Wow, that is a few, thanks, I will have a look.

I was planning to do a few tests too, but I think this will help me make some decisions :slight_smile:

Another interesting aspect of high MP cameras is the digital zoom one could have.

Macro cameras on Android phones are traditionally junk, like 2 to 5 MP images lol

With a 108MP main camera you can crop / digitally zoom and this way photograph a small object but fill the image with it!

Small details tend to be very tricky. I did try a small pine tree in a pot with some flowers, and that did not work too well. That was with only 16 MP and 12 MP, however.
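The crop/zoom point above is easy to put in numbers. A back-of-the-envelope sketch, assuming a hypothetical 4:3 108 MP sensor of 12032 x 9024 pixels (actual sensor dimensions vary by model):

```python
# Back-of-the-envelope: how many pixels survive a "digital zoom" crop
# on an assumed 108 MP, 4:3 sensor (12032 x 9024 is a hypothetical size).
full_w, full_h = 12032, 9024            # ~108.6 MP full frame
zoom = 3.0                              # 3x digital zoom = keep 1/3 of each axis
crop_w, crop_h = int(full_w / zoom), int(full_h / zoom)
crop_mp = crop_w * crop_h / 1e6
print(f"{crop_w} x {crop_h} ≈ {crop_mp:.1f} MP")  # ≈ 12.1 MP
```

So even a 3x crop to fill the frame with a small object still leaves roughly a 12 MP image, about what many phone main cameras output natively.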

Just saw this, wow
https://www.guru3d.com/news-story/samsung-unveils-isocell-image-sensor-with-industry’s-smallest-56μm-pixel-(200mp).html

For that I assume I need a macro lens.

Kinda funny how Samsung announces these sensors yet does not use them.

Seems Sony is still far ahead.

I decided to buy this https://blendermarket.com/products/photogrammetry-course

He makes some interesting points about what camera to choose, some of which I expected.

To make it short, he recommends decent resolution, good "spatial resolution" (which comes down to optics, ISO, …), and shooting raw.

Oh yeah, the better the sensor and optics, the better the images the software has to work with.

My main reason for the tests I am doing is that students don't have access to that.

After years of teaching and flooding YouTube with my lecture videos, I actually want to step away from that and focus more on project-based videos for the Blender Market.

The first topic will be photogrammetry with generic equipment: digitizing objects and spaces for game assets and interior/architecture design modeling.

So drone photogrammetry, LiDAR-assisted photogrammetry, etc. will be part of it.

My only current concern is how long a lecture will stay valid / up to date with the software changing so fast.

I think we have reached the plateau of Apple's LiDAR - there isn't any new, better sensor coming, so it is safe to use it now.

Or I go with a printed book with video tutorials included - that is easy to update.

Yes, true, good point, and everyone has a mobile phone. I am still crossing my fingers for Apple to finally up the resolution of the camera, but I have no real hope that they will change or update the LiDAR.

If you come across a good solution for using 360 photos other than Metashape - which seems interesting from what I have seen, but I don't want to buy the Professional version - please let me know :wink:

I would not call Zephyr a solution from the results I have seen.

I am curious about it as I will probably get a 360 camera to make HDRIs.

I never realized before today that you could capture a room with photogrammetry and give it HDRI textures that then light the objects in it. Not sure, however, if it needs to be done with RealityCapture (perhaps they will explain that at a later point).

Yeah, actually you could do a quick scan - even a lame LiDAR scan - and for reflecting the environment that will be fine.

Actually a good idea, I didn't even think about it. Better than rebuilding everything or photoshopping a rendering onto a photo.

But for IBL I am not sure if photogrammetry can produce the HDRI texture you need.

Yes good idea too.

The way I understood the quick explanation is that instead of taking normal photos for the photogrammetry, you take bracketed exposures, merge those into HDRIs in EXR format, and then do the photogrammetry on the EXRs.
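The bracketed-exposure step can be sketched in Python. This is a minimal Debevec-style weighted merge assuming a linear camera response - real tools (Metashape, OpenCV's HDR module, etc.) calibrate the response curve and do this more robustly - with a synthetic image stack standing in for real photos:

```python
import numpy as np

def merge_bracketed(images, times):
    """Merge a bracketed stack (uint8 arrays) into a linear radiance map.

    Assumes a linear camera response; `times` are exposure times in seconds.
    """
    num = np.zeros(images[0].shape, dtype=np.float64)
    den = np.zeros(images[0].shape, dtype=np.float64)
    for img, t in zip(images, times):
        z = img.astype(np.float64) / 255.0
        # Hat weighting: trust mid-tones, down-weight clipped pixels
        w = 1.0 - np.abs(2.0 * z - 1.0)
        num += w * (z / t)      # per-pixel radiance estimate from this exposure
        den += w
    return num / np.maximum(den, 1e-6)

# Synthetic stand-in for three shots of one scene at different shutter speeds
scene = np.linspace(0.02, 2.0, 64 * 64 * 3).reshape(64, 64, 3)
times = [1 / 200, 1 / 50, 1 / 12.5]
stack = [np.clip(scene * t * 100 * 255, 0, 255).astype(np.uint8) for t in times]
hdr = merge_bracketed(stack, times)   # float radiance map
```

The resulting `hdr` array would then be written out as a 32-bit EXR (e.g. with imageio or OpenCV, which may need OpenEXR support enabled in its build) before feeding it to the photogrammetry tool.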

What I like about your idea is that if I did that and created an HDRI with a 360 camera, I could get a similar effect, as long as the background does not need to be perfect :thinking:

I am not sure which one is more difficult to do. I might try both however.

Not sure if PhotoCatch or Metashape supports EXR as input, however.