GRAKN.AI @ LondonNewTech — Oct 12
This Wednesday, we took a second shot at introducing GRAKN.AI (at the time, our name was still MindmapsDB). This time, it was at LondonNewTech, again at Google Campus London. Similar to our experience at TechHubTuesday, the audience was a mix of tech enthusiasts and developers, but this time around a few more devs were present. As fate would have it, Haikal was battling both a cold and a twisted ankle, but he was a trooper and carried on!
We were really impressed with the audience’s questions. Importantly, they showed that this audience understood GRAKN.AI more fully than the one at TechHubTuesday, which means our communication is improving — wonderful news! While we’re super confident about our technology, and its innovative advantages over what’s currently out there, we’re still learning the best way to engage with different audiences. When people begin to see why GRAKN.AI is truly innovative and useful to them, it makes us very happy! :)
A few of the standout questions we were asked, and their answers, include:
1. What is the difference between GRAKN.AI and Neo4j?
- The ability to perform inference/reasoning is one of the key differentiators between us and “vanilla” graph databases.
- So what do we mean by reasoning? Our very own Michelangelo Bucci has written a blog post about what inference is and how to build it in GRAKN.AI.
2. How do we handle HA (“high availability”)?
- As avid open-sourcers, we use Cassandra under the hood, so our scaling capabilities are as good as Cassandra’s!
3. What is the largest graph our tech can handle?
- In the first week of release, we loaded 300 million instances on a 4-node cluster, and we are still load testing to see where the limit is!
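To make the first answer above a bit more concrete: "reasoning" means deriving new facts from stored ones via rules, rather than only returning what was explicitly inserted. The sketch below is not GRAKN.AI's engine or query language — it is just a hypothetical, minimal forward-chaining illustration of inferring `ancestor` facts from `parent` facts with a transitivity rule, something a vanilla graph database would not do automatically.

```python
# Toy illustration of rule-based inference (NOT GRAKN.AI's actual engine).
# Rule: ancestor(x, z) holds if parent(x, y) and ancestor(y, z).

def infer_ancestors(parent_edges):
    """Forward-chain the transitivity rule until no new facts appear."""
    # Base facts: every parent is trivially an ancestor.
    ancestors = set(parent_edges)
    changed = True
    while changed:  # repeat until a full pass derives nothing new
        changed = False
        for (x, y) in parent_edges:
            for (y2, z) in list(ancestors):
                if y == y2 and (x, z) not in ancestors:
                    ancestors.add((x, z))  # newly inferred fact
                    changed = True
    return ancestors

# Stored facts: alice -> bob -> carol
facts = {("alice", "bob"), ("bob", "carol")}
inferred = infer_ancestors(facts)
# inferred now also contains ("alice", "carol"), which was never stored
```

A query against `inferred` answers "who are alice's ancestors' descendants?" without that relationship ever having been written to the database — that, in miniature, is the differentiator the first question was about.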
If you’re interested in the 5-minute presentation and the ensuing Q&A, check out the Periscope stream at the 19-minute mark.
As usual, we are always looking for feedback and comments. Talk to us on Twitter (@graknlabs), by e-mail (info@grakn.ai) or via our Community page.