We were hacked.
Something special happened on DIY this week. Tongue-in-cheek, you could say we were hacked. To my delight, it demonstrated both the ingenuity of children and how “lean” product development, leaning on feedback from users, can succeed.
By way of background, DIY is a community for kids, so we're obligated to comply with the Children's Online Privacy Protection Act (COPPA), a federal law enforced by the FTC that restricts the personal information we can collect and store from members under the age of 13. Basically, we can't ask them for, or allow them to share, their name, location, or face without their parents' permission. So we're pretty conservative about building social features that could expose us to open-ended sharing. For this reason it took many months of consideration before we deployed a comment feature on project pages, making it possible for our members (called “Makers”) to exchange public messages.
Tens of thousands of comments have been made since, and they're the sort you'd expect from nine-year-olds: lots of creative spelling, heavy use of emoji, but mostly brief commentary equivalent to a thumbs up. Slowly but surely, though, we could see friendships forming, first as Makers made requests of each other (“You should try this!”), and then, as we'd hoped, as those requests were fulfilled. In effect, social feedback became a primary motivation for kids to make. Our Makers suddenly became prolific. They make things as social gestures to each other, but also so that each new project page can become a campfire for their DIY friends to sit around and chat.
The complexity, and one might say the dark side, of this is that it exposes us to risk. We're responsible for each kid's privacy, as well as the menial job of keeping potty mouths from saying anything foul. To keep pace, my collaborator Andrew Sliwinski wrote and released an open source tool called Troll that performs sentiment and semantic analysis on comment language to flag problematic comments, which we then double-check manually. (Large operations, similar to call centers, now offer services to monitor comments for nefarious activity; the hearsay is that Club Penguin employs 200+ people in British Columbia to do this full-time.)
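To make the idea of automated flagging concrete, here is a toy sketch of a flag-then-review pipeline: score each comment against a small blocklist plus a simple heuristic, and queue anything suspicious for a human moderator. The word list and rules here are invented for illustration and are not Troll's actual API or analysis, which is far more sophisticated.

```python
# Toy comment flagger: anything it marks goes to a manual review queue.
# Blocklist and heuristics are hypothetical, purely for illustration.

BLOCKLIST = {"stupid", "dumb", "hate"}  # invented example words


def flag_comment(text: str) -> bool:
    """Return True if the comment should be sent to manual review."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    if words & BLOCKLIST:
        return True
    # Heuristic: comments that are mostly SHOUTING look suspicious.
    letters = [c for c in text if c.isalpha()]
    if letters and sum(c.isupper() for c in letters) / len(letters) > 0.8:
        return True
    return False


comments = ["You should try this!", "ur DUMB"]
review_queue = [c for c in comments if flag_comment(c)]
```

The key design point mirrors the post: the software only narrows the stream, and a person makes the final call on everything in `review_queue`.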
While we took time to plan a scalable solution to comment filtering, we tabled plans to introduce a real-time chat experience to encourage Maker clubhouses to form, because chat would generate a volume of comments we couldn't responsibly monitor. And this is where the kids outsmarted us. Eager to chat with his DIY friends, a Maker named Diode shared a new project that was simply an image of the text “Chat Below.” Effectively, he had created DIY's first chat room. And the coolest thing about it? The first discussion among the kids centered on new features they wanted the DIY staff to build and new skills they wanted to learn. As soon as we saw it, all of us in the office jumped in to participate. They had found their own way to give us feedback. The whole experience struck me as exactly how the back-and-forth between a community and feature development should go.
Lots of copycats have followed suit, with other Makers adding their own “Chat Room” projects to their portfolios, so for better or worse, real-time chat and monitoring have been bumped up to an earlier roadmap milestone.