【Niantic Lightship ARDK 3.0 #3】Mastering ARDK 3.0: Revealing the Ingenious Techniques in Demos

From the standpoint of a developer / Creator

Designium
Designium StoryHub
Jan 9, 2024

Introduction

Hi! I am Sakuma Katsuya (@ppengotsu) from Designium.

This is the third article in our series (three in total) highlighting new features and changes in the latest version of Niantic Lightship ARDK. Two of our engineers created demos using ARDK 3.0, and this article describes the technologies and techniques they used.

Demo :「AR Fishing」

This demo was created by Mao Wu (@rainage) from Designium. It is a demo where you can enjoy fishing in AR: based on ARDK’s semantic segmentation, the ground becomes an ocean with cartoon textures, and the sky can even be transformed into outer space.

ARDK Features Used in this Demo

Conversations with the Creator

Q: It seems that you are using the semantic segmentation feature in ARDK 3.0. Can you tell us which channel you are using?

I’m using the sky and ground channels in this demo, but it is also possible to switch between channels while the app is running.

👁️‍🗨️ A Heads Up ‼️

Semantic segmentation now makes it easy to find out which channel an object is in with a single tap on the screen. By using this, I thought it would be possible to achieve additional effects, such as being able to swing the fishing rod only on the ground, or making the bait bounce back if thrown against a building.

If you are interested, please take a look at the official website of ARDK 3.0.
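
For reference, here is a minimal sketch of that screen-tap lookup, assuming the ARSemanticSegmentationManager component and its GetChannelNamesAt screen-point query from the ARDK 3.0 semantics API (please verify the exact names against the current documentation). The "ground" channel check and the CastFishingRod call are illustrative, not taken from the demo.

using System.Collections.Generic;
using UnityEngine;
using Niantic.Lightship.AR.Semantics; // ARDK 3.0 semantics (namespace as an assumption)

// Logs which semantic channels the tapped pixel belongs to, and gates a
// hypothetical game action on the "ground" channel.
public class TapChannelQuery : MonoBehaviour
{
    [SerializeField] private ARSemanticSegmentationManager _semanticsManager;

    private void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        Vector2 screenPos = Input.GetTouch(0).position;

        // Every channel the model detected at this screen position.
        List<string> channels =
            _semanticsManager.GetChannelNamesAt((int)screenPos.x, (int)screenPos.y);

        if (channels.Count == 0)
            return;

        Debug.Log("Tapped channels: " + string.Join(", ", channels));

        // Example gameplay gate: only allow casting the rod on the ground.
        if (channels.Contains("ground"))
        {
            // CastFishingRod(screenPos); // hypothetical game-side method
        }
    }
}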

Q: Compared with ARDK 2.X, what improvements are there in semantic segmentation in ARDK 3.0?

Semantic segmentation in ARDK 3.0 is more stable and has more channels.

👁️‍🗨️ A Heads Up ‼️

There are 18 channels. For example, you can use "sand_experimental" to turn a sandbox in a park into a fountain. Fishing in a small fish pond in a sandbox can also be fun.

If you are interested, please check out the details in the Niantic Lightship Developer Documentation below.

Q: Are there any other improvements you’ve made or any advice you’d like to share with people who will be using ARDK 3.0?

Yes, I made a simple method that combines multiple channels into a single mask, which is then composited with the AR background as the final result. Here is another test video showing how it works on both the sky and ground channels.
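
As a rough illustration of that idea (not the creator's actual implementation), the sketch below fetches the confidence texture of each channel every frame and feeds it to a compositing material. It assumes the GetSemanticChannelTexture call from the ARDK 3.0 semantics API; the material and its property names are hypothetical.

using UnityEngine;
using Niantic.Lightship.AR.Semantics; // ARDK 3.0 semantics (namespace as an assumption)

// Pushes per-channel semantic confidence textures into a blend material every
// frame; the shader that composites the stylized water/space over the camera
// background is assumed to exist and is not shown here.
public class MultiChannelMask : MonoBehaviour
{
    [SerializeField] private ARSemanticSegmentationManager _semanticsManager;
    [SerializeField] private Material _compositeMaterial; // hypothetical blend material
    [SerializeField] private string[] _channels = { "ground", "sky" };

    private void Update()
    {
        foreach (string channel in _channels)
        {
            // Returns a single-channel confidence texture for the semantic class,
            // plus a matrix for sampling it in display space.
            Texture2D mask =
                _semanticsManager.GetSemanticChannelTexture(channel, out Matrix4x4 samplerMatrix);

            if (mask == null)
                continue; // semantics not ready yet

            _compositeMaterial.SetTexture("_Mask_" + channel, mask);
            _compositeMaterial.SetMatrix("_SamplerMatrix_" + channel, samplerMatrix);
        }
    }
}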

🔼 Back to the Top

Demo :「BUG HUNTER AR」

This demo was created by Matt (@mechpil0t) from Designium. It is a multiplayer bug-catching game that uses semantic segmentation to decide where insects and plants appear. AR space sharing between devices is done using a private VPS location.

ARDK Features Used in this Demo

Conversations with the Creator

Q: It seems that you are using semantic segmentation to distinguish between insects and plants. Which channels do insects and plants appear on? A simple explanation in terms of semantic segmentation would be helpful.

No, I am using semantic segmentation to distinguish between different semantic channels as shown in the link below.

So the app places plants automatically: the scene is mapped using the ARDK mesh manager, and the app chooses a random position on the screen to raycast from. If the raycast gets a hit, the app checks the semantic segmentation at that screen point and spawns a different prefab depending on the segmentation channel.

For example, a different plant will spawn on "natural_ground" than on "artificial_ground". The bugs then appear from the plants.
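
A minimal sketch of that placement loop might look like the following, assuming the ARDK mesh manager has already generated colliders for the scanned environment and that channel lookups go through ARSemanticSegmentationManager.GetChannelNamesAt; the prefab fields are illustrative.

using System.Collections.Generic;
using UnityEngine;
using Niantic.Lightship.AR.Semantics; // ARDK 3.0 semantics (namespace as an assumption)

// Picks a random screen point, raycasts against the ARDK-generated mesh, and
// spawns a plant prefab that matches the semantic channel at that point.
public class PlantSpawner : MonoBehaviour
{
    [SerializeField] private Camera _arCamera;
    [SerializeField] private ARSemanticSegmentationManager _semanticsManager;
    [SerializeField] private GameObject _naturalGroundPlantPrefab;
    [SerializeField] private GameObject _artificialGroundPlantPrefab;

    public void TrySpawnPlant()
    {
        // Random position on the screen to raycast from.
        var screenPoint = new Vector2(Random.Range(0f, Screen.width),
                                      Random.Range(0f, Screen.height));
        Ray ray = _arCamera.ScreenPointToRay(screenPoint);

        // Requires the mesh manager to add colliders to the generated mesh.
        if (!Physics.Raycast(ray, out RaycastHit hit, 10f))
            return;

        // Which semantic channels does this screen point belong to?
        List<string> channels =
            _semanticsManager.GetChannelNamesAt((int)screenPoint.x, (int)screenPoint.y);

        GameObject prefab = null;
        if (channels.Contains("natural_ground"))
            prefab = _naturalGroundPlantPrefab;
        else if (channels.Contains("artificial_ground"))
            prefab = _artificialGroundPlantPrefab;

        if (prefab != null)
            Instantiate(prefab, hit.point, Quaternion.identity);
    }
}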

Q: The AR space sharing feature in ARDK 3.0 uses VPS location as a new trigger. Please tell us your experience after actually using the AR space sharing function through VPS. If you have any useful tips, please let us know for future users who will use this feature.

It’s great, very convenient. I think it might be even more convenient if I implemented a method that selects the closest wayspot automatically, like I did for the Lightship Challenge.

Q: Are there any other improvements you’ve made or any advice you’d like to share with people who will be using ARDK 3.0?

Make sure to parent ("child") the objects whose positions you want to sync under the shared space (ARLocation).

Official Samples of ARDK 3.0
👁️‍🗨️ A Heads Up ‼️

Note that the ARLocation is not a parent GameObject that already exists in the Scene; it is created at runtime. This is easier to understand if you try it with the official sample.

If you need more details, please check out the Niantic Lightship Developer Documentation below.
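
As a minimal sketch of that advice, assuming the ARLocation component from ARDK 3.0's location AR namespace and that you already have a reference to the runtime-created instance (for example from the ARLocationManager in the official shared AR sample):

using UnityEngine;
using Niantic.Lightship.AR.LocationAR; // ARLocation (ARDK 3.0, namespace as an assumption)

// Re-parents a shared object under the VPS-tracked ARLocation so that all
// devices express its pose in the same location-anchored space.
public class SharedObjectParenting : MonoBehaviour
{
    public void AttachToLocation(ARLocation arLocation, GameObject sharedObject)
    {
        // Keep the current world pose while moving the object into the
        // ARLocation's coordinate space.
        sharedObject.transform.SetParent(arLocation.transform, worldPositionStays: true);
    }
}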

🔼 Back to the Top

Conclusion

Thank you for reading this article. ARDK has now reached its third major version, and it feels more user-friendly. In particular, the official documentation website has become easier to read, and its explanation of semantic segmentation, introduced above, is easy to understand. Since the barrier to getting started has been lowered, please take this opportunity to try it out. 😉

EDITORIAL NOTE

I am Mary Chin (Chi-Ping Chin), the writer and designer on the PR team at Designium. Thank you for reading the ARDK 3.0 series of articles. Please look forward to the development of ARDK 3.X in 2024. Let’s keep creating innovative XR applications using Niantic’s official Lightship ARDK! ✨ If you have any ideas for cooperation, please feel free to contact us. 😊

CONTACT FORM
