Published in Virtual Market

Creating Companions for our
Virtual World

A general breakdown of the No2 Companion System, its development, usage, and future enhancement plans.

Hi! I’m Vowgan. I develop interactive systems and experiences for Virtual Market inside of VRChat, a social VR platform. I create tools, assets, and tutorials that many people use daily. At Hikky, my focus has been bringing ideas to life, such as the “No2” interactive companion system.

What is Virtual Market?

Virtual Market (or Vket for short) is an international sales event held inside virtual reality and run by Hikky out of Japan. This companion system was first introduced during Vket 6 in August of 2021, and was most recently used during Vket 2021, our December event the same year. The system appeared inside Ennichi for Vket 6 and Akihabara in Vket 2021. Ennichi was an environment based on traditional Japanese festivals, while Akihabara was based on the real-life district in Tokyo, Japan, though each with some fantasy and sci-fi twists.

Who and What is No2?

My first major task for Vket 6 was to create a sort of “dating simulator” experience, which grew and developed into the No2 Companion System. Originally, No2 (yes, that is the character’s actual name) was used in the world of Ennichi during Vket 6. This dating-sim character would both enhance the user experience and tell the user about things they walked past. They would go on to be used in later events such as Vket 2021, with plans to be continually used and improved upon.

Of course, in order to properly accomplish this, we had a laundry list of things to get through.

  • No2 needed to both follow and walk alongside the user in a natural and comfortable manner.
  • Dialogue boxes would be needed to let the player interact with No2 and be given information. These would need to allow for answering questions, as well as making decisions based on those answers, such as “follow the user” or “don’t follow the user.”
  • No2 needed to interact with specific sections of the world, such as playable games and brilliant set pieces. These interactions needed to feel natural and flowing, while also grabbing the attention of the user.
  • Lastly, they needed to be able to teleport back to the user if they got too far away. In VRChat, the user can respawn to the beginning of the world at any time, so No2 needed a way to catch up with them that wasn’t a two-kilometer hike.
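That last point boils down to a simple distance check each frame. The real system runs as a VRChat world in Unity, so this is only a minimal Python sketch of the logic; the function name and the 30-metre threshold are hypothetical, and positions are plain (x, y, z) tuples:

```python
import math

CATCH_UP_DISTANCE = 30.0  # metres; the real threshold is a tuning choice


def follow_position(npc_pos, user_pos):
    """Return where No2 should be this frame: stay put and walk normally,
    but snap to a spot beside the user if they are too far away
    (for example, right after a respawn)."""
    if math.dist(npc_pos, user_pos) > CATCH_UP_DISTANCE:
        # Teleport next to the user instead of hiking across the map.
        return (user_pos[0] + 1.5, user_pos[1], user_pos[2])
    return npc_pos
```

The key detail is that the teleport target is offset to the user’s side, so No2 reappears as a companion rather than popping up directly in the user’s face.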


Let’s start with how we move No2 around the world.

Since we have users in predominantly English- and Japanese-speaking countries, I needed a good way to choose which language to use right off the bat. No2 does this simply by looking at the time zone the user’s computer is in: if it’s Japan Standard Time, use Japanese; if not, use English. The user can of course change languages at any time with a button next to the dialogue itself.
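The decision amounts to comparing the machine’s UTC offset against Japan Standard Time (UTC+9). A minimal Python sketch of that check, with hypothetical names (the actual system is a Unity/VRChat script, not Python):

```python
from datetime import datetime, timedelta

JST = timedelta(hours=9)  # Japan Standard Time is UTC+9 year-round, no DST


def default_language(local_utc_offset: timedelta) -> str:
    """Pick the starting language from the computer's local UTC offset."""
    return "ja" if local_utc_offset == JST else "en"


# At startup, the current machine's offset can be read with:
#   local_offset = datetime.now().astimezone().utcoffset()
#   lang = default_language(local_offset)
```

This is deliberately coarse: anyone whose clock is set to UTC+9 gets Japanese, which is exactly the “right off the bat” default described above, with the in-world button as the escape hatch.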

Invisible boxes are used to tell No2 when to respond to certain events, either when the user steps into that area, or when No2 themself does. Using these, I started the experience with No2 asking the user if they wanted No2 to come with them, allowing the user to opt out of the experience.
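In Unity these invisible boxes would be trigger colliders firing enter events; the Python sketch below shows just the idea of a zone that fires a callback once, on the frame a position enters it (all names here are hypothetical):

```python
class TriggerBox:
    """Invisible axis-aligned box that fires a callback once when a
    tracked position enters it, mimicking a trigger-zone enter event."""

    def __init__(self, min_corner, max_corner, on_enter):
        self.min_corner = min_corner
        self.max_corner = max_corner
        self.on_enter = on_enter
        self.inside = False  # was the position inside last update?

    def contains(self, pos):
        return all(lo <= p <= hi
                   for lo, p, hi in zip(self.min_corner, pos, self.max_corner))

    def update(self, pos):
        now_inside = self.contains(pos)
        if now_inside and not self.inside:
            self.on_enter()  # fire only on the entering frame, not every frame
        self.inside = now_inside
```

Tracking the previous inside/outside state is what keeps No2 from re-triggering the same greeting every frame the user stands in the zone.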

When No2 follows alongside the user, they will run ahead at times to point out certain objects and events. Sometimes this would be a simple comment, while other times it would lead into an interactive game with lots of dialogue and animations for No2 to go through.


Ennichi was the first attempt at this system, so while the end product was successful and we fortunately got many parts right on the first try, it wasn’t without its learning experiences, and almost all of them were related to development time or No2’s movement.

Originally, we wanted No2 to stay in the user’s view at all times. While this worked out well for users on their computers, users in VR felt that No2 was constantly in the way: “staying in view” on desktop often translated to “getting in the way” in VR. Later on, this was changed to have No2 run alongside the user, rather than trying to predict where the user was going and running ahead of them.

Another issue we ran into was with dialogue boxes. They worked perfectly, but I was not aware that Vket already used similar dialogue boxes elsewhere which users could not interact with. As a result, many users didn’t know they could interact with No2’s dialogue at all and missed the experience entirely. To alleviate this, the dialogue was changed to auto-progress to the next line and to give clear visual indications whenever the system wanted direct user input.

Specific events originally needed to be scripted individually every time. At the time of Vket 6 in Ennichi, there was no decent system for directing how an interaction should work and control No2. Creating these individual systems became a MASSIVE time sink. For Vket 2021, I finished creating a modular “Director” system to streamline these creations, which I’ll explain further down.

These specific interactions also gave way to another issue: animations. Each interaction needed an animation, which we began creating on a per-need basis. Knowing which animation we would need required the environment to be finished, the character model to be finalized, and the written script to be completed. This meant a lot of waiting, then a large rush at the end to get the animations created and changed in whatever ways were necessary. Maron did a wonderful job creating these animations, but going forward we would instead request more general animations that could be reused in multiple places. Once we did this for Vket 2021, designing and creating an interaction with animations dropped to mere minutes, thanks to a pre-existing animation list and the Director system.

The last major time sink was the character model. Throughout development, the actual model we were using for No2 changed quite a bit, and every time I would have to rebuild No2 using the new model. This was later automated with a system where I told it what model to use and it automatically created the entire system.

Custom Development Tools

To speed up the development of No2, I ended up creating two distinct tools to help the process move along. The first was the “Npc Creation Window.” Here, you place the model file in the top line, select any changes you want, and press the “Generate NPC” button. This turned literally days’ worth of work into just a couple of seconds.

The next tool I needed to create was the NPC Director, which replaced all of the one-off scripts that previously had to be written. Essentially, it took events, dialogue, interactions, animations, everything we needed to tell No2 to do, and put them in one clean place. Various dropdowns specify which action you’re making and in what order, how long it will take, and what sort of dialogue No2 will have. It can also transition to other directors for branching dialogue. The only downside to this system is how long adding new features takes; however, we’re relatively feature-complete, so there isn’t much left I need to add.

Here’s a reference of two different directors used by No2. On the left is one where No2 iterates through several directors in sequence. On the right is a single director, showing off the dialogue system and spaces for multiple answers: it has spaces to type out both English and Japanese, the option to enable a two-choice answer, and responses to each individual answer the user might make.
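As a rough mental model of such a director, think of a list of timed steps (actions, dialogue in both languages) plus branch targets for each possible answer. The sketch below is a hypothetical Python data model of that idea, not the actual Unity implementation; every class and field name is an assumption for illustration:

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Dialogue:
    text_en: str                                      # English line
    text_ja: str                                      # Japanese line
    choices: List[str] = field(default_factory=list)  # optional two-choice answers


@dataclass
class Step:
    action: str                        # e.g. "walk_to", "animate", "say"
    duration: float = 0.0              # seconds before the next step
    dialogue: Optional[Dialogue] = None


@dataclass
class Director:
    steps: List[Step]
    # Branch targets: one follow-up director per possible answer.
    branches: List["Director"] = field(default_factory=list)


def play(director: Director, lang: str = "en", answer: int = 0) -> List[str]:
    """Collect the lines No2 would speak, following one answer branch."""
    lines = []
    for step in director.steps:
        if step.action == "say" and step.dialogue:
            d = step.dialogue
            lines.append(d.text_ja if lang == "ja" else d.text_en)
    if director.branches:
        lines += play(director.branches[answer], lang, answer)
    return lines
```

Because everything is data rather than bespoke script, authoring a new interaction is just filling in steps and wiring up branches, which matches how the Director turned day-long scripting jobs into minutes of configuration.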

The Director system sped up interaction development immensely. Events that would previously take a day to create now took only five minutes on average. Since these interactions were so quick to produce, I was able to add one for each booth inside Akihabara for Vket 2021.


No2 is a very complex system that grows and develops with every iteration. However, they’re quite ironed out by now, and will continue to be used in future events, with more interactions and conversations to be excited for! Making as much of the system modular as possible has been a lifesaver, so anything added in the future would likely be more customization of the already existing systems. A growing, evolving system seems pretty fitting for a character like No2 though, don’t you think?

The ultimate goal is for anyone on the team to be able to open up the project and make changes or additions as they see fit, without requiring me or another programmer. I believe putting these kinds of creative tools in the hands of our artists and writers is crucial to making the event as expressive and impressive as possible.

What’s next?

We’ll be holding Music Vket 4 and Vket 2022 this summer, so get excited for what sorts of systems and interactions you’ll come across during those!

Music Vket 4 (or #MusV 4) will be taking place between June 18th and 26th, while the next mainline Vket event (#Vket 2022 Summer) will be the second half of August, running from the 13th to the 28th. We look forward to seeing you at these events, and I personally hope you have fun with No2 in whatever world they may be in next!

Feel free to check out the pages for both events now!

Music Vket 4’s total registered exhibitors have far surpassed our expectations (resulting in a space expansion!), and celebration event announcements are ramping up right now, so watch out for those! You can find the details here:

For Vket 2022 Summer, exhibitor sign-ups begin April 22. Catch all the information here on our website!


