Introducing “Haru”: A Mixed Reality Pet That Lives in Your World
Have you ever taken off a VR headset and felt like the magic just… vanished? Like you were part of something beautiful for a moment, and then suddenly it was gone, leaving behind a weird emptiness?
Hi, I’m Bini Park. I love immersive tech for the joy of stepping into entire worlds built from imagination. But every time I step out of those worlds, I’m left with a lingering sense of disconnect. It’s like waking up from a dream I can’t quite remember, even though I didn’t want it to end.
One of the moments that hit me hardest was while playing Peridot by Niantic. I grew so attached to this adorable dragon pet, but the second I took off the headset, the whole experience dissolved. That pet I loved felt like it had never existed. That disconnect? It stuck with me.
And it got me wondering… what if a virtual companion didn’t disappear the moment you left the headset? What if it could follow you through your real life, living across screens, lenses, and moments?
That’s where Haru was born.
He came from a simple question: what if mixed reality wasn’t just a one-time spectacle, but something woven into our daily habits?
I wanted to give people a reason to come back to their headsets, not because they had to, but because something real was waiting for them. Something emotional, something consistent. Something that didn’t disappear the moment the headset came off.
That “something” became Haru.
So, who exactly is Haru?
Imagine this: a tiny alien creature crash-lands on Earth. In the chaos of impact, he loses all of his memories. They don’t just vanish; they shatter into invisible fragments that scatter across the real world, hidden inside everyday things that hold emotional weight. A tree, a cozy blanket, a childhood toy. Haru doesn’t remember who he is or where he came from, but he knows one thing: those memories are still out there. And he needs your help to find them.
But Haru doesn’t live only inside your headset. He exists across both the real and virtual worlds. You connect with him through your phone, through your camera, through your surroundings. One morning, you might wake up to a soft message from Haru: “I can’t sleep… I remember something cozy that used to help me rest. Could you find something cozy for me today?”
So you look around and take a picture of your favorite pillow, the one that feels warm and safe. You scan it through the app, and it detects a memory ball fragment hidden in the image.
Later that day, you slip on your headset and enter Haru’s quiet, foggy world. You give him the memory you found, and just like that, he begins to remember. A “memory object” appears, a glowing little item based on your photo. You can move it, resize it, and place it wherever you want, slowly transforming Haru’s world into something full of light and life.
Bit by bit, you’re not just helping Haru recover what he lost. You’re helping him build a home. One that’s shaped by your own memories, your own story.
Why I Made This
I kept noticing the same pattern in the MR industry: exciting ideas, impressive tech demos, and huge investments, yet very few apps or games actually stick in people’s lives. According to a 2023 report by ARtillery Intelligence, fewer than 15% of mixed reality headset owners use their devices daily. That number drops even further after the first few weeks of ownership.
Meta, the biggest player in this space, knows this too. Despite selling nearly 20 million Quest headsets, the company has struggled with retention. As reported by Road to VR, during an internal presentation, Meta’s VP of VR, Mark Rabkin, noted that many newer users simply “just don’t have a reason to come back to the headset every day.” He went on to say,
“We need to be better at growth and retention and resurrection… We need to be better at socializing and actually make those things more reliable, more intuitive so people can count on it.”
In response, Meta launched the Lifestyle App Accelerator in 2024, a program to fund developers creating so-called “everyday” apps like note-taking tools, whiteboards, and fitness trackers. But here’s the problem: people aren’t going to put on a headset to check a to-do list. No matter how functional a VR app is, the friction of using a headset makes it unrealistic for routine tasks.
And that’s where I think the industry has it a bit backward.
Instead of trying to make headsets mimic our smartphones, what if we leaned into what makes mixed reality special? Not productivity, but presence. Not efficiency, but emotion.
What if MR could make you feel something? What if it gave you a reason to come back, not out of obligation, but out of curiosity or care?
That was the spark for Haru. A companion that bridges those two worlds. A character who remembers, even when you disconnect. A tiny reminder to pause, look around, and turn the ordinary moments of your day into something meaningful.
Research & Playtests
Before building anything, I wanted to test a core assumption: could a hybrid mobile and mixed reality experience become part of someone’s daily habit? To find out, I ran a survey with over 150 people of varying ages and backgrounds. I asked them if they’d be more likely to use a mixed reality headset regularly if the experience were tied to something they already use daily, like their smartphone.
The result? Over 70% said yes. If their mobile interactions extended into a meaningful MR experience, they’d be much more inclined to return to their headset consistently. That was huge. It validated the emotional design approach I was taking with Haru, one that prioritizes lightweight, habitual interactions over one-off spectacle.
Interestingly, most participants reported using their phones for 4 to 6 hours a day, mostly for social media, messaging, or taking photos. These are the exact kinds of casual behaviors that could blend perfectly with Haru’s daily memory prompts. Many had played virtual pet games like Pokémon GO, Pou, or Tamagotchi. They talked about the joy of watching their pet grow, the comfort in daily routines, and the fun of forming tiny emotional bonds with something on-screen. Some even said how magical it would be to see that pet “come to life” in their physical room through MR, especially if it remembered past moments.
To go deeper, I set up a Wizard of Oz playtest with 12 friends. Every morning, I texted them a simple prompt like
“Send me something blue” or “Find something cozy.”
They responded with pictures. But what surprised me was that they didn’t just send the image. They often told me why they chose it. A chipped blue mug reminded someone of their grandmother. A patch of sunlight on the floor reminded another of their childhood bedroom. The activity became less like a task and more like a small act of reflection. People began to look forward to the messages.
That’s when I realized: this wasn’t just about helping Haru recover his memory. It was about the player remembering too. The emotional loop was happening on both sides.
With this insight, I set a clear goal: to validate whether mobile-first interaction could act as a daily hook that re-engages users with MR. My success metric was straightforward: if at least 60% of users said they’d be more likely to return to MR with mobile integration, I’d pursue this design path. Not only did I hit that bar, I surpassed it.
That gave me the confidence to keep building.
Designing Haru (and the World Around Him)
From the very beginning, I knew Haru had to be lovable. I didn’t want players to care for him just because the game told them to. I wanted them to care because he felt emotionally real. That’s why I chose a hamster-like form: small, round, a little clumsy, with big eyes and feet. Hamsters are universally adored and often live in little habitats that people love to decorate, which felt like the perfect fit.
Haru’s design came together quickly because I had a clear image in mind: big eyes, stubby limbs, and a soft, wiggly body. My favorite detail is his ears. They’re expressive and subtly show his emotional state. When he’s excited, he jumps up and his ears perk. Small things like that make him feel alive.
Visually, I leaned into a toon-shaded look: soft outlines, gentle pastels, minimal shaders. I didn’t want Haru to feel realistic. I wanted him to feel charming. Something that would instantly make you smile.
But Haru is just one part of the world. Around him, I began sketching what the full experience would look like, from the mobile app interface to the headset environment. I drew storyboard panels of what happens when Haru sends you a message, when you snap a photo, when the memory turns into something new. I sketched what the room might look like before and after you’ve fed him three memories. I mapped out the transitions between your physical world and Haru’s virtual one.
The mobile side was all about casual, daily interactions. Quick taps, camera captures, soft prompts that feel more like gentle nudges than gameplay objectives. The headset side, in contrast, was meant to feel present and interactive: as you interact with Haru, your room slowly comes to life.
User Design
I designed Haru with three main target audiences in mind:
- Lapsed Quest Users — These are people who already own a Quest headset but rarely use it anymore. Many of them have lost interest in traditional VR games or find them too time-consuming. I wanted Haru to be a gentle way back in. Something light and emotionally engaging that makes daily use feel natural rather than demanding.
- People Who Love Pets (but Not the Work) — Some players love the idea of caring for a pet but don’t want the full-time responsibility of owning one. Haru provides a playful alternative: a character you can check in with, care for, and bond with, without the stress of real-world obligations.
- Companies Exploring Mixed Reality Adoption — Finally, this project speaks to organizations like Meta and others trying to solve the real-world challenge of low MR adoption and retention. Haru is a case study in how cross-platform emotional design can encourage daily use and offer a more meaningful reason to return to mixed reality.
Mobile Design
The mobile experience was designed to feel lightweight and habitual, something users could engage with quickly each day without friction. The design went through a lot of iteration; I used Figma to wireframe the features and map out the user flow.
- Version 1 had all the bells and whistles: a calendar, settings menu, time of day indicator, and a conversation directly on the home screen. Haru would hide inside his little house. It was cute… but also way too much.
- Version 2 created a separate tab just for talking to Haru, while keeping some of the features from v1. It was more organized, but still felt bloated. Through playtests with friends, I realized most of the features weren’t needed.
- Version 3 was a reset. I stripped it all down to focus on what mattered: sending images and receiving memory balls. Once I saw that this technical flow worked, I returned to the narrative. That’s when the current story clicked. Haru had lost his memories, and you were helping him find them through real-world photos.
The final mobile UX focused on three key goals:
- Help the user understand why they’re helping Haru by introducing the narrative clearly.
- Make it easy to receive Haru’s prompts and send back memory ball fragments.
- Show the current status, whether it’s time to talk to Haru again or switch into headset mode and interact directly.
Headset Design
Designing the headset side was all about reducing friction. I wanted it to feel intuitive even for first-time VR users. I also wanted to avoid the uncanny valley, especially for players who had never experienced a virtual pet before. That’s why I chose to build using Meta’s MR Utility Kit (MRUK), which lets me incorporate real-world room data to make the experience feel grounded. The more personal the space felt, the less “fake” it seemed.
Room scanning was step one. From there, customizing and placing memory objects became the core interaction. But hand gesture design? That was a journey.
In early versions, I went a little wild:
- Palm up to call Haru
- Snap to toggle a lamp
- Point to direct him
- Hover to lock/unlock objects
- Double palm to spawn Haru’s house
- Pet Haru’s forehead, his belly, and more…
It was too much.
So I trimmed it down, keeping only what actually added value:
- Both palms up to call Haru
- Letting go of an object drops it to the surface naturally
- Objects lock when you walk away and unlock when you get close (sketched below)
- You can pet Haru by gently placing your hand near his forehead
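To give a sense of how the proximity locking works, here’s a minimal sketch. It’s not Haru’s actual code; the playerHead reference and lockRadius value are stand-ins for whatever the real project wires up:
// Attach to a placed memory object. It locks in place when the player
// walks away and unlocks again when they come back within reach.
using UnityEngine;
public class ProximityLock : MonoBehaviour {
    public Transform playerHead;    // e.g., the headset’s center-eye camera
    public float lockRadius = 1.2f; // grabbable only within this distance (meters)
    public bool IsLocked { get; private set; }
    void Update() {
        float distance = Vector3.Distance(playerHead.position, transform.position);
        IsLocked = distance > lockRadius;
    }
}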
I also removed full movement control. Instead of letting players drag Haru around, I gave him light AI behavior so he could make decisions on his own, like where to sit, what to explore, and how to react. This made him feel more alive and less like a toy being programmed.
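As a rough illustration of what “light AI behavior” can mean, even a simple wander loop makes a creature feel self-directed. Here’s a sketch under that assumption (the interval, radius, and speed values are invented, not the shipped tuning):
// Haru periodically “decides” on a new nearby spot and ambles toward it,
// instead of waiting for the player to drag him around.
using UnityEngine;
public class HaruWander : MonoBehaviour {
    public float decisionInterval = 5f; // seconds between new decisions
    public float wanderRadius = 1.5f;   // stay close to the current spot
    public float moveSpeed = 0.4f;      // slow, hamster-like pace
    Vector3 target;
    float nextDecisionTime;
    void Update() {
        if (Time.time >= nextDecisionTime) {
            // In the real app, destinations could be biased toward MRUK scene
            // anchors (a table edge, a cozy corner) rather than random points.
            Vector2 offset = Random.insideUnitCircle * wanderRadius;
            target = transform.position + new Vector3(offset.x, 0f, offset.y);
            nextDecisionTime = Time.time + decisionInterval;
        }
        transform.position = Vector3.MoveTowards(
            transform.position, target, moveSpeed * Time.deltaTime);
    }
}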
I learned a lot during this process:
- VR passthrough occlusion is possible, and it makes a big difference.
- Hand gesture interaction maps are essential for clarity and simplicity.
- Light AI behavior can often feel more magical than full player control.
The goal wasn’t to build a complicated simulator. It was to create a world that gently unfolds around a character you care about. A world you shape not through menus or buttons, but through memories, movement, and small moments.
Tech Development
This was, by far, the most challenging part of the entire project.
The idea of connecting a mobile app to a VR headset and syncing memory data across both in real time sounded simple. But in practice? It was a mess, especially since I had zero experience building mobile apps in Unity when I started.
Like most developers hitting a wall, I turned to Reddit. I asked how to connect Unity apps across mobile and VR. The silence was loud. Turns out… not many people had done this exact thing before.
So I started small. I created two separate Unity projects, one for the headset and one for mobile. That was the first step, and honestly, it helped me wrap my head around what would eventually become a much more complicated pipeline.
Around that time, I connected with Ruifend Xu, a developer who had built a similar mobile-to-VR prototype and even won a Meta Hackathon with it. He introduced me to Photon Fusion 2, a networking SDK built for Unity that supports multiplayer, cross-platform development. It turned out to be a huge unlock.
Fusion 2 helped me get basic communication between devices working. I followed the tutorials, set up rooms, and began sending data from mobile to headset. It felt magical when it first worked.
But it wasn’t stable. I started in Host Mode, which made sense at first because one device acts as the “authority” and manages the session. But then I hit a huge wall: if the headset went to sleep or disconnected, the entire session would collapse. Not ideal for casual daily interactions. The host would vanish, and everything with it.
That’s when I switched to Shared Mode, and everything started to fall into place.
Why Shared Mode Worked Better
Shared Mode gave me a more flexible structure. Instead of assigning one device as the host, any device could connect and sync. This solved multiple problems at once:
- Mobile-first flow: The mobile phone often triggers the session by taking a photo or sending data. Shared Mode meant the headset didn’t need to be online at that exact moment.
- Headset crashes or restarts: In Host Mode, if the headset died, the whole room would collapse. Shared Mode kept it alive.
- Simpler ownership: I didn’t have to track or transfer authority across devices for things like memory balls or food items. Both platforms could interact with synced content equally. (A minimal Shared Mode setup is sketched just below.)
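For context, starting a Shared Mode session in Fusion 2 looks roughly like this. It’s a minimal sketch: the session name and bootstrap details are illustrative, not the project’s actual setup code.
// Both the phone and the headset run this and join the same named session.
// In Shared Mode there is no single host, so either device can come and go.
using Fusion;
using UnityEngine;
public class SessionStarter : MonoBehaviour {
    async void Start() {
        var runner = gameObject.AddComponent<NetworkRunner>();
        await runner.StartGame(new StartGameArgs {
            GameMode = GameMode.Shared, // the session outlives any one device
            SessionName = "haru-home"   // shared room name across both devices
        });
    }
}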
Eventually, I merged both mobile and headset logic into one Unity project for easier management. That created its own headaches, like Samsung Gear VR auto-launching on older phones. Once I figured out it was just an issue with older models, I switched to a Samsung Galaxy S20 Ultra and the problem disappeared.
Once I had the overall architecture working, with Fusion set up and mobile and headset communicating properly, the next big task was syncing the meaningful stuff: the memories.
This is the heart of the game. Every time a player takes a photo on their phone, they’re helping Haru remember something. That moment gets processed into what I call a memory ball, and this object needs to travel from mobile to headset, even if the headset isn’t online at the time.
To handle this, I built a system called GlobalFoodSync to persist memory items between platforms.
How Memory Syncing Works (with Code)
The goal was to make sure that even if the player collects memory fragments while away from their headset, everything gets stored, queued, and delivered when they return.
1. Persisting Items on the Mobile Side
On mobile, when the player sends an image that contains a memory, the app processes it and adds it to a shared queue:
// Mobile-side logic
FoodQueue.Add(memoryData); // memoryData includes image tag, emotion, timestamp, etc.
This queue lives on the network and stays alive, even if the headset hasn’t connected yet.
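I won’t paste all of GlobalFoodSync here, but in Fusion 2 a queue like this can be declared as networked state on a shared object, along these lines (the field names and capacity are my assumptions, not the actual implementation):
// Networked state lives with the session, so queued memories survive even
// while the headset is offline. The element type must be an unmanaged struct.
using Fusion;
public struct MemoryData : INetworkStruct {
    public int ImageTagId;  // which kind of object the photo matched
    public int Emotion;     // e.g., an index into an emotion table
    public float Timestamp;
}
public class GlobalFoodSync : NetworkBehaviour {
    [Networked, Capacity(32)]
    public NetworkLinkedList<MemoryData> FoodQueue { get; }
}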
2. Syncing Items When VR Joins
Later, when the VR headset boots up and joins the session, it triggers this function to fetch the stored data:
// Headset-side logic
public void RegisterVR(PlayerRef player) {
    // Replay every queued memory so the late-joining headset catches up
    foreach (var food in FoodQueue) {
        SpawnFoodForPlayer(player, food);
    }
}
So even if the headset was off when the photo was taken, Haru still receives the memory once the player comes back.
This design allowed me to keep both platforms loosely coupled — no pressure to be online at the same time, and no “lost” memories. It made the interaction feel natural: take a picture on the go, and when you return to Haru later, he remembers it.
Under the Hood: How It Works
Let’s break down a few other parts of the system that made all of this possible.
Shared Codebase Across Platforms
I wanted to maintain one Unity project for both mobile and VR, but obviously they don’t behave the same. So I used platform directives to separate logic when needed:
#if UNITY_ANDROID && !UNITY_EDITOR
// Mobile-specific behavior
#elif UNITY_OPENXR
// Headset-specific behavior
#endif
This let me conditionally run features without duplicating everything across scenes or builds.
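One caveat worth flagging if you try this yourself: Quest builds target Android too, so UNITY_ANDROID by itself can’t always distinguish the phone build from the headset build. A custom scripting define symbol set per build profile is a common way to disambiguate.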
Event-Driven Architecture
To coordinate interaction across different systems (like triggering memory spawn from mobile input), I built a simple EventBus. This helped keep things modular and easy to expand later.
// On mobile: after sending image
EventBus.OnMemorySubmitted?.Invoke();
// On headset: listening for that memory event
EventBus.OnMemorySubmitted += () => {
    SpawnMemoryBall();
};
That way, mobile actions could trigger headset changes seamlessly, even though they live on different devices.
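The EventBus itself can be as small as a static class of C# events. Reconstructing from the usage above, something like this would work (a guess at the shape, not the actual implementation; the cross-device hop still goes through Fusion, with the event firing once the networked data arrives locally):
using System;
// A minimal event hub: systems publish and subscribe without
// referencing each other directly.
public static class EventBus {
    public static Action OnMemorySubmitted;
}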
Syncing Memory Objects with Fusion
All the synced items, such as memory balls, food, or interactions, were spawned using Fusion’s multiplayer system. Here’s how I created a memory object and passed data across devices:
var memoryObject = Runner.Spawn(memoryBallPrefab, spawnPosition, Quaternion.identity);
memoryObject.GetComponent<MemoryBall>().Initialize(memoryData);
Fusion automatically keeps this synced between mobile and VR. I just had to make sure the prefab and its serialized data were set up correctly.
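One detail worth knowing if you build something similar: Fusion 2 also accepts an onBeforeSpawned callback on Runner.Spawn, which guarantees the networked state is initialized before the object first replicates. A variant of the call above (not necessarily what Haru ships with):
// Initialize the memory ball before its first network tick
var memoryObject = Runner.Spawn(memoryBallPrefab, spawnPosition, Quaternion.identity,
    onBeforeSpawned: (runner, obj) => obj.GetComponent<MemoryBall>().Initialize(memoryData));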
Anchors and Real-World Placement
I also used Meta’s MR Utility Kit (MRUK) to ground Haru’s world in the player’s physical space. The idea was simple: memory objects shouldn’t float randomly. They should be placed on real tables, beds, or floors.
// Simplified placement logic: MRUK exposes the scanned room’s anchors
// with semantic labels like TABLE, BED, or FLOOR.
var room = MRUK.Instance.GetCurrentRoom();
foreach (var anchor in room.Anchors) {
    if (anchor.HasAnyLabel(MRUKAnchor.SceneLabels.TABLE)) {
        Instantiate(memoryObject, anchor.transform.position, Quaternion.identity);
    }
}
This helped Haru feel like he truly lived in your room, not in a separate digital void.
Getting all of this to work, from syncing memory data and anchoring it in the real world to making Haru feel present, wasn’t just a technical milestone. It was my journey. It meant the game could finally deliver on its promise: a companion who remembers, who grows, who responds to your world whether or not you’re wearing a headset. There were dozens of broken builds, late-night debugging sessions, and moments where I thought maybe this whole cross-platform thing was too ambitious. But little by little, it came together. And in the end, Haru felt real.
Not because of the shaders or the animations, but because the tech finally got out of the way and let the relationship shine through.
Takeaways & What’s Next
Looking back, Haru started as a tiny idea while I was listening to Alec Benjamin’s Older on a flight from Los Angeles to New York. It was a memory, really. That strange disconnection I felt every time I left a virtual world. That question of whether a digital companion could feel real enough to follow me back into daily life.
I didn’t start this project with all the answers. I didn’t even know how to build a mobile app. But what I did have was a hunch: that MR didn’t need to compete with smartphones; it could partner with them. That maybe the best way to get people back into headsets wasn’t through productivity or games, but through emotion. Through care. Through something (or someone) they’d want to return to.
Some things I’ve learned along the way:
- Start small, but don’t stay there. Breaking things down helped me move, even when I didn’t know what I was doing yet.
- Cross-platform design is hard but worth it. Having both mobile and headset work in harmony made the experience feel whole.
- Users will always tell you what you don’t need. I cut so many features I thought were essential, and the game got better every time.
- The emotional loop matters more than the feature loop. If people care about the world, they’ll come back for it.
And this is just the beginning.
What’s Next
There are so many directions Haru could grow from here. I’m currently exploring:
- A memory gallery system — where users can revisit their past photos in Haru’s world like a photo journal or scrapbook
- Emotional tagging — letting players assign feelings to their photos (cozy, proud, peaceful) to influence how Haru reacts
- Multi-Haru support — what if multiple players could connect their Harus and visit each other’s spaces?
- Mobile widget + notification system — to let Haru gently nudge players with new memory prompts throughout the day
- More personality AI — I’d love to explore simple language learning so Haru feels even more responsive over time
Ultimately, my dream is to make Haru something that doesn’t just live in mixed reality but helps people live with it. Gently. Daily. Emotionally.
If you’ve read this far, it probably means you’re a fellow XR nerd. 🧠💜
Haru is a strange little creature, but building him has taught me more about tech, play, and connection than I ever expected.
If you’re working on something similar (or want to), I’d love to hear from you.
Let’s make MR feel more human. 🐹