Lessons Learned from Kambria’s Ethics and Open Source in AI and Blockchain Event
Last week Kambria hosted a Meetup titled “Why is Open Source & Ethics Important for AI and Blockchain Technology?” as part of DeveloperWeek — the largest developer conference in both the US and Canada with over 200,000 attendees. Our moderators, panelists, and audience members enjoyed a lively discussion on current-day problems and solutions related to ethics and open source in the AI and blockchain industries, including some of the very same dilemmas you’ve likely thought about yourself.
To be sure, the tech landscape is changing, and so is the way we relate to ethics, privacy, open source, licensing, and other issues. Many of the current models that dictate how we communicate and interact online are broken. How can we fix them and what tools are available to empower the user over the corporation? Our distinguished panelists shared their key insights with us and we’re sure you’ll enjoy hearing their opinions on these cutting-edge topics.
The ethics panel, hosted by Michelle Tsing, Founder, Robotics for Good and Kambria Advisor, included Adryenn Ashley (Founder & CEO, Loly Labs); X. Eyee (Sr. Technical Account Manager & Blockchain Ambassador, Microsoft); and Nadeem Mazen (Co-founder & CEO, FBRK). Everyone had great insights to share about how they’re currently using AI and blockchain in their technology and how that is impacting society.
At Loly Labs, AI is being used to go through preferences to figure out what customers on their AI-based dating platform really want. This is very cool considering many people don’t know what they want in a mate — so perhaps it’s better to let the AI decide! At Microsoft — which runs AI projects in every major industry in the world — the goal of developing AI solutions is to augment human intelligence rather than replace it. X. gave a great example of a grain manufacturer that uses cameras on their machines to help determine whether food is contaminated. Considering cereal crops form the basis of the diet of 3.5 billion people around the world, using this AI in processing technology can keep a lot of people healthy.
Permission is Not Consent
The conversation then moved to the topic of informed consent and what ethical uses of informed consent may look like. Loly Labs uses AI to improve communication so that customers know what they’re agreeing to. They utilize a “consent window” as part of their service, rather than static or blanket consent that may not be relevant or applicable to all dating circumstances. The key with consent, as Nadeem pointed out, is contextual understanding. Permission is not consent. We need to ensure users know the contexts for which they are providing consent. For example, you may agree to Amazon’s Alexa using your information for certain purposes when Amazon, in fact, has completely different plans for its use. Look at what you’re giving consent for and how much data you want to provide.
Another meaty topic the panelists took on was data ownership. What data is owned by the company versus the user? And should users get paid for content, ratings, and opinions they share online? The panelists were particularly stirred by this subject, with Nadeem claiming, “I can’t believe we’ve accepted this situation” where users are not paid for their contributions on sites like Facebook. In his system, contributors can set their own prices and have complete control over how their data is used.
But the most vehement response to the current state of content usage came from X. Social media sites that farm content and make money are, in her opinion, engaged in “digital sharecropping.” We’re feeding these massive industries and getting some value out of it, but is the value really worth it? And how do we make a shift? Tap into this part of the discussion at 1:01 for X.’s examples and how blockchain may help.
Metadata is Terrifying!
The final major topic was the use of metadata. Have you ever thought about how much companies can learn about you, your lifestyle, and your behavior simply by analyzing your metadata? As our panelists discussed, a lot can be told by connecting these tiny fragments of information to what we already know about someone. “Metadata is terrifying!” warned X., who explained that these tidbits have always been used to paint whole pictures of people. Like most things, however, how we use it — for good or bad — is the real question. Using metadata for disaster relief is good. Using it to stalk the officials of a foreign government — probably not as good. To be sure, tons of information can be garnered about you based on the patterns of your behavior so being vigilant about who you let monitor that behavior is more important than ever.
Remove the “D” from Dapps
The second panel in our event shifted from ethics to open source. Our moderator, Gideon Nweze, spoke with Jared Go (Co-founder & CTO, Kambria), Jinjing Liang (Google Software Engineer & Co-founder, ABC Blockchain Community), Tony Tran (Co-founder & CTO, Bee Token), and Steve Wei (Co-founder & CEO, TOP Network). The group was split on the best way to advance blockchain technology. Should we focus on infrastructure or create more Dapps?
Jinjing thinks we need both because we need the infrastructure to support the Dapps. That said, usable Dapps are what drive the use cases for blockchain. Tony thinks focusing on Dapps would result in wider adoption more quickly because the way we interact with blockchain is not intuitive or easy. On the other hand, Steve thinks we should remove the “D” from the “Dapp” because truly decentralized apps are too limited. Plus, they are not what people need. What people need is to leverage the benefit of blockchain, which is trust. You can still be a centralized app and tokenize your service, or you can be a hybrid app. BitTorrent, for example, is a decentralized app, but it’s not a blockchain app. We need to think out of the box about what blockchain apps look like.
What Company Will Dominate AI?
Who will emerge as the leader in AI? According to X., the company that will dominate the space will be the one that builds accessible tools most people can build on top of, so that folks don’t have to build AI models from scratch and develop them individually. This will result in fewer barriers for companies to integrate AI into their technology. Jared, on the other hand, takes a more open view of AI — the more we work with AI, the better the models will be — and eventually, AI will start to design itself. Because AI has the potential to develop so quickly, the race to the top will probably be won by those with a head start; AI will happen fast, so it’s important to pay attention now in order not to be left behind.
Making Open Source Profitable
One final topic raised by an audience member was open source licensing. We’re so glad she asked, since, contrary to popular belief, open source does not mean “free of charge.” It also does not mean the user can use the technology for any purpose. As Michelle pointed out, when developing open source products, there is often an issue of mixing proprietary tech with open source tech. There are a number of open source licensing models that can be chosen based on several variables, such as how the developer or company wants to share information, how the tech will be used, and how the user is getting the license. For more information about why and how a company like Kambria would use open source licensing for robotics, check out our blog article here.
Want to learn more about AI, open source, and robotics? If you’re a woman in the robotics industry — or you aspire to be — please join us at the Women in Robotics event, held in conjunction with Silicon Valley Robotics, during the last week of March. We’ll confirm the date shortly and will announce it on both our Telegram channel and our Facebook page, so be sure to follow us there for more information.
The Kambria Team
KAT is a token used on the Kambria platform.