How to turn paintings into 3D models with Phygital+, MidJourney and DALL-E

Imagine a usual art exhibition. Can we turn this possibly boring experience into something mindblowing using AI? Read to find out how.

Phygitalism Inc · Published in PHYGITAL · Oct 24, 2022

AI changes everything, and nowadays it’s more accessible than ever. How can artists and designers benefit from new AI tools such as MidJourney in their creative pipelines? Let’s find out, using ArtLife as an example.

Exhibition overview

This year Artlife, an offline festival of contemporary art, featured 170 paintings (which is a lot to take in at once). To make the show more immersive, multimedia artists used AI to turn 20 of the paintings into interactive 3D experiences. Using Phygital+ and other text2image tools, they worked in two stages: concept art, then 3D modelling with AI-generated textures.

Phygital+ UI and workflow demo

Let’s take a look at 3 paintings and how they were transformed into phygital 3D sculptures.

Walls in flowers by Valerie Titova and Svetlana Soloveva

Original painting “Walls in flowers” by Max Tennet | 3D sculpture “Phenomenon” by Sveta Soloveva and Valerie Titova

Sveta and Valerie were impressed by the wild wallpaper pattern, eye-catching colors, and the bright feminine figure in the original painting. They decided to mythologize the composition, presenting the lady of the apartment as a fantasy siren in an exotic garden.

1. Concept art. The artists mixed two concepts: the realistic “unfinished” work by Tennet and fantasy images created with image2image models in Phygital+ (Stable Diffusion and Disco Diffusion). A minimal img2img sketch follows the captions below.

Wallpaper concept via img2img with Disco Diffusion in Phygital+
Character concepts via img2img with Disco Diffusion and Stable Diffusion in Phygital+
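Phygital+ exposes these models through a node-based UI, so no code was written at the festival. For readers who want to reproduce this step outside the platform, here is a minimal image2image sketch using the Hugging Face diffusers library; the model ID, file names and prompt are illustrative placeholders, not the artists’ actual settings.

```python
# Hypothetical stand-in for the img2img step done inside Phygital+:
# a Stable Diffusion image-to-image pass via Hugging Face diffusers.
import torch
from diffusers import StableDiffusionImg2ImgPipeline
from PIL import Image

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

# The original painting (or a crop of it) is the starting image.
init_image = Image.open("walls_in_flowers.jpg").convert("RGB").resize((512, 512))

result = pipe(
    prompt="fantasy siren in an exotic garden, ornate wallpaper pattern, vivid colors",
    image=init_image,
    strength=0.6,            # how far the model may drift from the source painting
    guidance_scale=7.5,
    num_inference_steps=50,
).images[0]

result.save("siren_concept.png")
```

The strength parameter controls the balance between the source painting and the generated fantasy imagery, which is essentially the trade-off the artists tuned when mixing the two concepts.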

2. 2D-to-3D. The results served as concept art for 3D development. Focusing on the character, the artists built the model in Daz3D and connected the rig and animation through Mixamo. They then moved to Blender, assembled the full look (a kokoshnik, a top and a skirt) with reference to the generated outcomes, and animated it.

Various AI-generated references influenced the final work
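The Daz3D and Mixamo work happens in their own GUIs, but the Blender end of this hand-off can be scripted. Below is a rough sketch using Blender’s bpy API, assuming a character exported from Mixamo as FBX; the file path and object setup are placeholders rather than the artists’ actual scene.

```python
# Sketch of the Blender side of the 2D-to-3D step: importing a rigged,
# animated Mixamo FBX and matching the scene frame range to its animation.
# Run inside Blender's Python console or via `blender --python this_script.py`.
import bpy

# Import the character exported from Mixamo (path is a placeholder).
bpy.ops.import_scene.fbx(filepath="/path/to/character_mixamo.fbx")

# Mixamo brings its animation in as an action on the imported armature.
armature = next(o for o in bpy.context.selected_objects if o.type == "ARMATURE")
action = armature.animation_data.action

# Make the scene timeline match the imported clip.
start, end = action.frame_range
bpy.context.scene.frame_start = int(start)
bpy.context.scene.frame_end = int(end)
```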

Bleeding Heart by Lisa Boiur

Original painting “Heart” by Vlad Urbahanov | 3D sculpture “Bleeding Heart” by Lisa Boiur

Lisa’s 3D sculpture is based on the painting “Heart” and transforms a single expressive portrait into a sequence of close-ups of a woman at different ages and in different moods. According to the artist, this is the kind of emotional range that belongs to the calm young face in the painting.

1. Concept art. Lisa tried to adapt the key qualities of the painting to 3D space: inspired by the fluid patterns of the original, she built the sculpture elements in a spiral shape. The artist used inpainting to remove the character from the composition and created multiple portraits in MidJourney.

Text2img portraits by MidJourney

2. 2D-to-3D. Lisa used these portraits as textures for the final work. With the inpainting feature in DALL-E 2, she was also able to generate a larger image of the pattern and use it as a texture on the 3D spiral shape.

DALL-E 2 inpainting for removing the main figure from the painting
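Lisa worked in the DALL-E 2 editor, but the same mask-based inpainting is also exposed through OpenAI’s image edit endpoint. A hedged sketch with the openai Python SDK: the file names, mask and prompt below are illustrative, and the mask is a PNG whose transparent pixels mark the region (here, the figure) to be repainted.

```python
# Illustrative sketch of mask-based inpainting with the OpenAI API (DALL-E 2).
# Transparent pixels in figure_mask.png mark the area to regenerate.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

result = client.images.edit(
    model="dall-e-2",
    image=open("heart_painting.png", "rb"),  # square PNG of the source painting
    mask=open("figure_mask.png", "rb"),      # transparent where the figure should disappear
    prompt="flowing abstract pattern continuing seamlessly across the canvas",
    n=1,
    size="1024x1024",
)

print(result.data[0].url)  # URL of the inpainted image
```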

Here’s how the final 3D model looked

Textures in the final 3D sculpture
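Wiring such a generated image onto a mesh in Blender takes only a few lines of bpy. A sketch under the assumption of a mesh object named “SpiralShape” with an existing UV map; the object name and file path are placeholders.

```python
# Sketch: assign an AI-generated image as the base-color texture of a mesh.
import bpy

obj = bpy.data.objects["SpiralShape"]  # placeholder object name

mat = bpy.data.materials.new(name="GeneratedPortraitTexture")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

# Load the generated portrait / pattern and feed it into the Principled BSDF.
tex = nodes.new("ShaderNodeTexImage")
tex.image = bpy.data.images.load("/path/to/generated_portrait.png")
links.new(tex.outputs["Color"], nodes["Principled BSDF"].inputs["Base Color"])

# Assign the material; the mesh still needs a UV unwrap (e.g. Smart UV Project).
if obj.data.materials:
    obj.data.materials[0] = mat
else:
    obj.data.materials.append(mat)
```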

Tempting the virtuous by Denis Rossiev

Original painting “Tempting the virtuous” by Darya Dolgacheva | 3D sculpture “Attempting to Naturalness” by Denis Rossiev

The break between human and nature is the core of the painting: the character points off to the side, and we see the detached expression on the young man’s face and the vivid butterfly color, alien to the rest of the “human” content of the painting.

1. Concept art. In his work, Denis develops the motif of going beyond the frame, only hinted at by the author of the painting. To do this, he turns to a biopunk aesthetic of human-nature conflict resolution.

First, he upscaled the picture, then prepared a blank area around it for new content and made different variations of his idea with DALL-E 2 and MidJourney outpainting (the canvas preparation is sketched below).

Outpainting process and outcomes in DALL-E 2 and MidJourney
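The “blank area” preparation is easy to reproduce outside DALL-E 2’s editor: the painting is pasted onto a larger, fully transparent canvas, and the transparent margin is what the outpainting model fills in. A small Pillow sketch with placeholder sizes and file names:

```python
# Sketch of preparing an outpainting canvas: paste the upscaled painting onto
# a larger transparent image so a model can generate content around it.
from PIL import Image

painting = Image.open("tempting_the_virtuous.png").convert("RGBA")

# New canvas twice the size, fully transparent; the border is what gets outpainted.
canvas = Image.new("RGBA", (painting.width * 2, painting.height * 2), (0, 0, 0, 0))
offset = ((canvas.width - painting.width) // 2, (canvas.height - painting.height) // 2)
canvas.paste(painting, offset)

canvas.save("outpainting_canvas.png")  # fed, together with a prompt, to the outpainting tool
```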

2. 2D-to-3D. Then Denis generated a 3D body in Phygital+, transferred the texture onto the model via projection (see the sketch below), and animated it in Blender.

3D models: final version and intermediate stages
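How Phygital+ generates the body is internal to the platform, but the projection step maps naturally onto Blender’s UV Project modifier, where a camera-like projector casts the generated texture onto the mesh. A sketch with placeholder object names:

```python
# Sketch of projecting a texture onto a mesh with Blender's UV Project modifier.
# "GeneratedBody" and "Projector" are placeholder names, not the artist's scene.
import bpy

body = bpy.data.objects["GeneratedBody"]
projector = bpy.data.objects["Projector"]   # e.g. a camera aimed at the mesh

mod = body.modifiers.new(name="TextureProjection", type='UV_PROJECT')
mod.uv_layer = "UVMap"                      # UV map that receives the projection
mod.projector_count = 1
mod.projectors[0].object = projector

# The projected UVs are then used by an image texture in the material,
# set up the same way as in the earlier material sketch.
```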

The result, like the other works, was uploaded to the ARhead platform for viewing in AR. Try them out by scanning the QR codes on our Behance project page.

Conclusions

Our 5-year Artlife partnership and the recent collaboration with 20 artists showed the importance of innovative solutions for CG production at every stage: pre-production, production and post-production. AI saves time by clarifying the vision, accelerating concept art thumbnails, and speeding up the preparation of high-quality textures and 3D models.

We encourage creative exploration in our community and always try to educate creative professionals about what the technology can do. If you want to dive deeper into creative AI, check out our AI library, where we collect, classify and describe the newest AI tools and generative models.

Thank you for following us on this innovative path🎨🦾

The authors: Darya Zhurnakova and Daria Wind🪶

Phygitalism Inc is an international tech company developing Phygital+, a web-based AI product for creators.