Hi, I’m Saul Clemente. I live in Udine, a small city in Italy, and I’m a 3D environment artist. I spend a lot of time studying 3D graphics and I’m very passionate about my current job. 8 years ago, I created “RTView” and started making photorealistic renders for different clients, from architecture to design.
A year ago, with two of my colleagues, I founded Virtew, a start-up where we make emotional experiences for virtual reality. We have produced our first virtual reality indie game, RUN OF MYDAN, available on Steam.
For my project, I wanted to create a 3D model of a small object full of details and with multiple materials, starting from photogrammetry. A pomegranate seemed the most suitable subject for this type of exercise.
The main objectives of the project were:
- Creating a high poly 3D model with photogrammetry without losing photograph details;
- Modeling a low poly version with a simplified UV Atlas for post-production texturing;
- Creating a texture set to simulate each material of my object following the guidelines of PBR shader;
- Creating one shader to simulate all materials;
- Composing an emotional scene in Unreal Engine.
Photo / Scan Preparations
I used 3DF Zephyr PRO, which offers both automated photogrammetry through presets and continuous review of every single step.
Recently I bought a portable photo studio to take pictures of small objects and shoot good photos for photogrammetry. On the internet, I found a product that lets me obtain homogeneous light on the whole subject to be photographed, varying the intensity of the individual light sources.
The only problem was the shape: a pomegranate is a spherical, all-round object, so I risked losing details depending on how it was placed on the photo set. I fixed it like this.
I set the camera to keep the subject always in focus, with a low ISO and longer exposure times to compensate for the low light. I used remote control via my smartphone to reduce the micro-blur from pressing the shutter button.
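As a quick sanity check on that tradeoff (my own numbers, not from the article), the low-ISO/long-exposure swap can be expressed with the standard exposure value formula: lowering the ISO by two stops while lengthening the shutter by two stops keeps the exposure identical.

```python
import math

def exposure_value(aperture: float, shutter_s: float, iso: int = 100) -> float:
    """EV relative to ISO 100: EV = log2(N^2 / t) - log2(ISO / 100)."""
    return math.log2(aperture**2 / shutter_s) - math.log2(iso / 100)

# Dropping from ISO 400 to ISO 100 (two stops less noise) while
# lengthening the shutter from 1/60 s to 1/15 s (two stops more light)
# yields the same exposure value.
ev_noisy = exposure_value(aperture=8, shutter_s=1/60, iso=400)
ev_clean = exposure_value(aperture=8, shutter_s=1/15, iso=100)
print(round(ev_noisy, 3), round(ev_clean, 3))  # both ~9.907
```

On a tripod with remote triggering, the longer shutter costs nothing, which is why the low-ISO option wins for photogrammetry.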
During the preparation, I decided to support my subject with a wooden chopping board that I had in the kitchen. Its texture is very detailed because it has been used to cut salami and cheese!
Before the shooting session, I photographed the ColorChecker to create the color profile with which I developed the photos.
First of all, I’ve photographed the whole pomegranate, then half and finally a quarter.
Here are the numbers of shots taken for each item:
- Pomegranate FULL: 54
- Pomegranate HALF: 88
- Pomegranate QUARTER: 68
- Chopping BOARD: 41
I shot the photos in RAW format so I could correct the vignetting and the lens distortion, reduce image noise, apply the color profile I had obtained, and save them in TIFF format.
Photogrammetry: 3DF Masquerade
3DF Masquerade is installed automatically during the first installation of 3DF Zephyr and lets you create masks on the photos to improve alignment. With simple steps, you can cut out an object, and the files it generates are automatically read by 3DF Zephyr. In addition, the Turntable algorithm analyzes the created mask and passes it on to the next image. I masked 45 photos in less than 10 minutes with a few clicks and some cleanup strokes.
Photogrammetry: 3DF Zephyr
My goal was to get a realistic 3D mesh. I spent several hours adjusting the parameters to find the right result for both the
- Dense point cloud
- Textured mesh
I imported the images into 3DF Zephyr: it automatically found the masks generated just before (activate the Mask Image tick) and also detected the model of camera I used. The program accesses a well-stocked online database of cameras.
Once the photos were aligned, I started generating the dense point cloud in Fast, High and Raw modes to compare their efficiency.
Happy with the result of the high-density point cloud, I processed it with the Fast preset to generate the mesh. The result was almost immediate.
At this point, I tested a new feature of 3DF Zephyr: the photoconsistent mesh optimization.
The guys from 3Dflow have created an algorithm that further improves the quality of the mesh without altering the shape of the object. In practice, it applies another photogrammetry pass starting from the generated point cloud, trying to enhance the details of the surface.
The photos I had taken were full of reflections because of the texture of the pomegranate peel, which is glossy and rough at the same time! The result was very satisfying because the algorithm ignored these kinds of problems!
I applied some filters in the program that allowed me to create a completely closed mesh (also suitable for 3D printing), recreating the hidden areas of the subject.
The filters were:
- Fill mesh — Selective: lets you select the holes to be filled.
- Fill mesh — Watertight: performs checks to verify that there are no holes in the mesh and fills any that it finds.
Once the mesh was corrected, I moved on to creating a texture.
Good! I reached the detail I wanted. From the photos to the textured 3D mesh, I have not lost any quality!
I exported the texture in a lossless format (.PNG). It’s an excellent starting point for keeping the most information in the texture, which is especially useful for delighting!
Low Poly Mesh & UV Atlas
Photogrammetry generates a hyper-detailed mesh and a very complex UV atlas that is hard to manage in post-production.
In addition, the pomegranate’s grains are translucent and very reflective, which causes errors during the photogrammetry process.
I imported the high-definition mesh into ZBrush with its texture,
and performed a cleanup of the forms, sculpting the parts that were not consistent with reality.
I used the basic brushes, following the texture I had generated in photogrammetry as a reference:
Once the mesh was fixed, I proceeded to create the low poly version.
I divided the mesh into several polygroups to generate an easy-to-manage UV atlas for Substance Designer.
I used Substance Designer to create all the support textures needed to author PBR textures for Unreal Engine.
First of all, I checked that the transfer of the Diffuse texture from the high-definition mesh to the low-definition mesh worked correctly.
Then I’ve proceeded with the other textures:
- Normal from Mesh
- Ambient Occlusion from Mesh
After all the correction and channel-combination steps, I produced the following textures:
- Base Color
- Normal Map
- Ambient Occlusion
For this project, I created a master material from which I derived instances.
I created only one texture set for each asset. The shader is quite complex due to the number of materials it can simulate:
- The outer skin is glossy, but at the same time it has micro reliefs
- The inner peel is instead matte and has absorbed some juice
- The membrane is translucent at various points while being opaque elsewhere
- The grains are translucent, and light passes through where the seeds can be seen inside
I chose parameters that allowed me to tune the texture’s behavior based on the framing and lighting in the scene.
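Since the article doesn’t show the shader graph itself, here is a minimal sketch of the underlying idea: mask-driven blending of per-material PBR parameters inside a single shader. The material names and parameter values below are illustrative, not the author’s actual settings.

```python
# Illustrative per-material PBR values (hypothetical, not the author's).
LAYERS = {
    "outer_skin": {"roughness": 0.35, "translucency": 0.0},
    "inner_peel": {"roughness": 0.85, "translucency": 0.1},
    "membrane":   {"roughness": 0.60, "translucency": 0.4},
    "grain":      {"roughness": 0.10, "translucency": 0.9},
}

def blend(masks: dict, parameter: str) -> float:
    """Weighted blend of one PBR parameter for a single pixel.

    `masks` maps layer name -> weight sampled from a packed mask texture
    (weights should sum to 1 per pixel), the same role a chain of lerp
    nodes plays in a material graph.
    """
    return sum(weight * LAYERS[name][parameter] for name, weight in masks.items())

# A pixel on the boundary between membrane and grain:
pixel_masks = {"membrane": 0.25, "grain": 0.75}
print(round(blend(pixel_masks, "roughness"), 3))     # 0.225
print(round(blend(pixel_masks, "translucency"), 3))  # 0.775
```

One texture set can then drive all four materials: the mask channels select the region, and each scalar map (roughness, translucency, etc.) is reconstructed per pixel from the per-layer values.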
I also added a small piece of code that lets you burn the object, giving it a less digital and cold look.
Display of the various channels / buffers.
For the lighting of the scene, I did not do anything special; I used a basic photographic studio lighting technique. First, I placed the camera, trying to frame the object according to the rule of thirds. Then I placed the lights at the sides: one on the left of the composition and one on the right, near the camera.
Final Render in UE4
The two lights are spotlights, with mobility set to Static for maximum performance. For the background, I used an HDR file (Soft_1Front_2Backs) that can be found in the Substance Designer folders. I then applied the HDR texture to a sphere with an emissive material.
The skylight has a cube map with mainly red colors, and I set its mobility to Stationary. For the static meshes in the composition, I set the lightmap resolution to 1024 pixels. The post process has a daytime LUT with the intensity at 0.15.
I used a CineCamera to better simulate the settings of a real camera, and then I added some effects.
To speed up the lightmass rendering, I used the Unreal Engine variant with GPU Lightmass. I rendered the scene in a few seconds at Ultra High quality.
For this project, I haven’t performed any particular optimization, except for creating a UV atlas that is easy to edit in Substance Designer.
The textures are generated at 4K to have as much detail as possible, and each texture is exported individually so as not to introduce compression artifacts. The settings for the lightmass render are very high thanks to GPU Lightmass.
The subject was created with the aim of using it in future high-definition renderings with a strong photorealistic impact. I am usually asked for renders over 8K resolution to be printed in furniture catalogs, and they require detail even at 200% zoom.
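As a back-of-the-envelope check (the print size and DPI here are hypothetical, not the author’s), it is easy to see why catalog work pushes past 8K:

```python
def required_pixels(print_width_in: float, dpi: int = 300, zoom: float = 1.0) -> int:
    """Render width in pixels needed so a print still looks sharp at `zoom`."""
    return round(print_width_in * dpi * zoom)

# A hypothetical A4 double-page spread (~16.5 inches wide) at a typical
# 300 DPI, inspected at 200% zoom: already beyond an 8K (7680 px) render.
print(required_pixels(16.5, dpi=300, zoom=2.0))  # 9900
```

This is also why a 4K texture per material, exported without compression, is a reasonable floor rather than overkill for this kind of asset.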
I dedicated myself to these exercises during the night because I had more time to experiment and test every single parameter in the various programs, and also to start designing environments for our future game.