The Netherlands is well known for its tulip fields. In most cases the fields are composed of long plots, each in one intense color. Next to that, another intense color, and another, and another. It’s very impressive to experience the fields from a plane or a moving train. It’s the land of Piet Mondrian (the famous sock designer :-).
Some passionate gardeners put flower bulbs in the ground to surprise their loved ones with a colorful message that pops up in spring. This gave me the idea for a drawing-with-flowers-in-AR project.
Some months ago, my colleague Michael Tjia and I sent AR flowers to the Torch staff, to show our enthusiasm for their incredible app updates and open attitude.
Each flower has the same set of ‘genes’ and the same behavior: the flowers grow when you approach them. You can set up this behavior visually in just a few clicks. The trigger for the flower response is called a proximity trigger. Here is a link to an instruction video about adding such an interaction:
Then there was a mind-blowing video by Torch genius Kami Karras that gave the flower idea another boost. Her path tracker worked with hidden proximity sensors, so we did not need to keep our phone low (close to a specific flower) to magically stimulate the flowers to grow.
Based on that idea we adapted our growing flowers a bit. We can share the project with a Torch link, so you can try it on an AR-capable iPhone or iPad:
More recently, there were Kami’s flowers for Mother’s Day:
Mother's Day Facebook Camera Effect Is a Gift for AR Creators
by Nathan Bowser | May 10, 2019 | We built an interactive, customizable augmented reality Mother's Day app for the…
All these delightful moments gave me enough input to take the next step in Flower Messaging Innovations.
Here is a sketch of the idea: combine several trigger events in Torch to make your own pixel art, with a flower in each pixel position:
Each ‘pixel’ consists of the following parts:
- A completely transparent PNG image, positioned as a floor tile under the flowers. This image becomes the sensor for a gaze-at and gaze-away trigger: when the centre of the screen ‘gazes’ at the image, the visibility of another image with some green leaves is switched ‘on’; when the screen ‘gazes away’, the leaves image is made invisible.
- The same completely transparent PNG also functions as a touch sensor that, when touched, makes a red flower appear.
- An invisible box works as a proximity sensor that triggers a yellow flower to become visible.
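In Torch you wire these triggers together visually, with no code at all. Still, the behavior of a single pixel boils down to a tiny state machine, which can be sketched like this (hypothetical Python, purely for illustration; the names `FlowerPixel`, `on_gaze_at`, etc. are my own and not part of Torch):

```python
class FlowerPixel:
    """Illustrative state for one 'pixel': three triggers control three visibilities."""

    def __init__(self):
        self.leaves_visible = False         # toggled by gaze-at / gaze-away
        self.red_flower_visible = False     # shown once the transparent PNG is touched
        self.yellow_flower_visible = False  # shown once the invisible box senses proximity

    def on_gaze_at(self):
        # Screen centre hits the transparent floor tile: show the leaves image.
        self.leaves_visible = True

    def on_gaze_away(self):
        # Screen centre leaves the tile: hide the leaves again.
        self.leaves_visible = False

    def on_touch(self):
        # Tapping the same transparent PNG makes the red flower appear.
        self.red_flower_visible = True

    def on_proximity_enter(self):
        # Walking close to the invisible box makes the yellow flower appear.
        self.yellow_flower_visible = True


# The full piece is just this one pixel, copied into an 8x8 field.
field = [[FlowerPixel() for _ in range(8)] for _ in range(8)]
field[0][0].on_touch()  # e.g. the viewer taps the top-left pixel
```

Note that gaze is the only reversible trigger here; touch and proximity switch their flowers on and leave them on, which is what makes the field gradually fill with color as you explore it.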
The flowers were easily found in the ‘Poly’ model repository that is integrated into the Torch interface.
When all interactions were in place, I tested the pixel and then copied it into a field of 8×8 flowers. Here is an indoor test:
I am still fine-tuning this project and will need to test it outside. Maybe I was a bit too ambitious, as the interface combines three types of interactions. Now I’m also testing with only red tulips. Often, less is more.
When I’m totally satisfied and every flower is optimized to appeal to you, I’ll share a proper link.
In the meantime I would really like to encourage you to try Torch yourself. It is a pleasurable way to challenge your creativity, and you will find out how easy it is. The learning curve isn’t steep, but it quickly heightens your level of spatial, contextual, communicative, ergonomic and systematic thinking. That’s a whole bouquet of skills, available for you...
Here are the previous two AR challenges: