
Metaverse / Volumetric Shooting for Vimmerse.net

In February, I finally received my new Azure Kinect sensors, so I now have a set of six.

 

I also updated my two volumetric workstations (one in Essen, Germany, and the other in Sopot, Poland).

 

Vimmerse asked me to shoot with their DepthKit Studio version and my updated hardware ...

 

so I will show some bits and pieces from that shooting process: setup, matching, and recording.

I will also upload some additional Unity test results later.

The shoot for Vimmerse showed once again how significant matching and alignment are for volumetric recording.
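DepthKit Studio handles the multi-sensor calibration itself; purely as an illustration of what "matching and alignment" means for a multi-sensor rig, here is a minimal pairwise refinement sketch using Open3D ICP. This is my own example, not part of the DepthKit workflow, and the file names and the 2 cm search radius are hypothetical.

```python
# Illustration only: refine the alignment of one sensor's point cloud against a
# reference sensor using point-to-point ICP in Open3D. DepthKit Studio does its
# own calibration; this just shows the underlying idea. File names are placeholders.
import numpy as np
import open3d as o3d

reference = o3d.io.read_point_cloud("sensor_0.ply")   # the "master" view
moving = o3d.io.read_point_cloud("sensor_1.ply")      # a subordinate view to align

result = o3d.pipelines.registration.registration_icp(
    moving,                # source cloud to be moved
    reference,             # target cloud kept fixed
    0.02,                  # max correspondence distance (2 cm, placeholder)
    np.eye(4),             # initial guess: start from rough extrinsics / identity
    o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)
print("fitness:", result.fitness, "RMSE:", result.inlier_rmse)
moving.transform(result.transformation)  # apply the refined transform
```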

For the dancers, I once again asked the two I had already worked with in previous volumetric shoots:

Agnieszka Pluwak | Facebook

Konrad Prus | Facebook

 

As the shooting location, we chose the Goyki 3 rehearsal stage (Goyki 3 Art Inkubator | Facebook).


The aim was to shoot with a special half-circle approach of 120 to 140 degrees, keeping a specific distance to the back wall, since the room and the background were of interest as well.
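To plan such an arc, a small helper like the one below can compute approximate floor positions for the sensors. This is my own sketch; the radius, arc span, and sensor count used here are placeholder numbers, not the values used on set.

```python
# Sketch for planning a half-circle (~120-140 degree) sensor arc around the performers.
# All numbers (radius, arc span) are placeholders, not the on-set values.
import math

def arc_positions(num_sensors: int, radius_m: float, arc_deg: float):
    """Return (x, y) floor positions with the subject at the origin and the
    arc centred on the subject's front (the +y direction)."""
    start = math.radians(90 - arc_deg / 2)          # first sensor at one end of the arc
    step = math.radians(arc_deg) / (num_sensors - 1)
    return [
        (radius_m * math.cos(start + i * step), radius_m * math.sin(start + i * step))
        for i in range(num_sensors)
    ]

# Example: six sensors on a 130-degree arc with a 2.5 m radius.
for i, (x, y) in enumerate(arc_positions(6, radius_m=2.5, arc_deg=130)):
    print(f"sensor {i}: x={x:+.2f} m, y={y:+.2f} m")
```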

 

We did several takes at different speeds and with different songs.

So, in the central "Top" clip, you can see the half-circle results!

(Have a look at the back wall. The cleanness of their codec at the wall is something special ... it preserves a memory-like feeling.)

 

But I also added an extra session to the shooting schedule to compare EFEVE and DepthKit: after DepthKit, I shot the same setup with EFEVE Studio, and then I changed the positions and repeated the alignment session to get an "all sides" full-circle setup. The full-circle setup was again recorded with both DepthKit Studio and EFEVE Studio.

An exciting part was that Vimmerse and I tested and played with the recommended highest-quality asset export method:

 

PNG stream of RGB + depth // transcode the one-row PNG into a two-row PNG // then transcode the PNG stream with FFmpeg command-line code into an MP4 asset.

We also tested the PNG-sequence asset approach to create a PLY + texture-coordinate asset.
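As a rough sketch of that packing and encoding step (my own illustration, not Vimmerse's or DepthKit's tooling): stack each RGB frame on top of its depth frame into a two-row PNG, then hand the PNG sequence to FFmpeg. The frame naming, frame rate, and encoder settings below are assumptions.

```python
# Pack RGB + depth frames into two-row PNGs, then encode the sequence to MP4.
# File naming (rgb_0001.png / depth_0001.png / packed_0001.png) is hypothetical.
import subprocess
from pathlib import Path
from PIL import Image

def pack_two_row(rgb_path: Path, depth_path: Path, out_path: Path) -> None:
    rgb = Image.open(rgb_path).convert("RGB")
    # Assumes depth is already visualised/encoded as 8-bit RGB;
    # a real pipeline would preserve 16-bit depth precision.
    depth = Image.open(depth_path).convert("RGB")
    packed = Image.new("RGB", (rgb.width, rgb.height + depth.height))
    packed.paste(rgb, (0, 0))             # top row: colour
    packed.paste(depth, (0, rgb.height))  # bottom row: depth
    packed.save(out_path)

Path("packed").mkdir(exist_ok=True)
for rgb_file in sorted(Path("rgb").glob("rgb_*.png")):
    idx = rgb_file.stem.split("_")[-1]
    pack_two_row(rgb_file, Path("depth") / f"depth_{idx}.png",
                 Path("packed") / f"packed_{idx}.png")

# Encode the packed PNG sequence into an MP4 (H.264, near-lossless settings as an example).
subprocess.run([
    "ffmpeg", "-framerate", "30", "-i", "packed/packed_%04d.png",
    "-c:v", "libx264", "-crf", "12", "-pix_fmt", "yuv420p", "packed_rgbd.mp4",
], check=True)
```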

The assets are usable in different ways, for example in a VR build, an AR build, or even in a streaming use case.
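To illustrate the PLY + texture-coordinate asset mentioned above, here is a minimal per-frame sketch of my own (not the DepthKit/Vimmerse exporter): back-project one depth frame into a point cloud and write an ASCII PLY that stores texture coordinates per vertex. The intrinsics and depth scale are placeholder values.

```python
# Back-project a depth frame to 3D points and write an ASCII PLY with
# per-vertex texture coordinates (s, t) into the matching RGB frame.
# Intrinsics (fx, fy, cx, cy) and depth_scale are hypothetical placeholders.
import numpy as np

def depth_to_ply(depth: np.ndarray, fx: float, fy: float, cx: float, cy: float,
                 depth_scale: float, out_path: str) -> None:
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))      # pixel coordinates
    z = depth.astype(np.float32) * depth_scale          # raw units -> metres
    valid = z > 0                                        # drop invalid depth pixels
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    tex_s = u / (w - 1)                                  # normalised texture coords
    tex_t = 1.0 - v / (h - 1)
    verts = np.stack([x, y, z, tex_s, tex_t], axis=-1)[valid]
    with open(out_path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(verts)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property float s\nproperty float t\n")  # texture coordinates
        f.write("end_header\n")
        np.savetxt(f, verts, fmt="%.6f")
```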

 

Vimmerse, of course, has processed the material for the needs of their codec and platform.

As an add-on, I also created an engine scene using a photogrammetry model (built from around 500 photos in Metashape), which you will see when I show some in-engine screenshots. The photogrammetry stages were (a scripting sketch follows the list):

- sparse cloud

- dense cloud

- mesh 

- textures 
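As a hedged sketch of scripting these stages with Metashape's Python API: the call names below are from memory of the 1.x/2.x API, some (such as the dense-cloud step) were renamed between versions, and the folder and project names are placeholders, so treat this as an outline rather than my actual project script.

```python
# Outline of the standard Metashape pipeline: align photos (sparse cloud),
# build depth maps and the dense cloud, then mesh, UVs, and texture.
from pathlib import Path
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()
chunk.addPhotos([str(p) for p in sorted(Path("photos").glob("*.jpg"))])  # ~500 photos

chunk.matchPhotos(downscale=1)     # feature matching -> tie points (sparse cloud)
chunk.alignCameras()
chunk.buildDepthMaps(downscale=2)  # depth maps feeding the dense reconstruction
chunk.buildPointCloud()            # dense cloud ("buildDenseCloud()" in Metashape 1.x)
chunk.buildModel()                 # mesh from the reconstruction
chunk.buildUV()
chunk.buildTexture()               # texture atlas for the mesh
doc.save("goyki3_room.psx")        # placeholder project name
```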

Thanks again to Vimmerse for their trust and for the task of realizing this volumetric shoot for them.
