Memory Leak — #19
--
VC Astasia Myers’ perspectives on machine learning, cloud infrastructure, developer tools, open source, and security. Sign up here.
🚀 Products
Dragonfly is a drop-in Redis replacement that can scale vertically to support millions of operations per second and terabyte-sized workloads, all on a single instance. Dragonfly 1.0 comes with full support for Redis’ most common data types and commands, as well as snapshotting, replication, and high availability. Dragonfly delivers 7.5X higher throughput than Redis with replication enabled, and its snapshotting phase is 12X faster.
Why does this matter? In our current TikTok era, users have ever-increasing expectations of application performance and responsiveness. Applications that are fast receive superior product reviews and outperform the competition, while those that feel sluggish fall behind. A site that loads in 1 second has a conversion rate 3x higher than a site that loads in 5 seconds and 5x higher than one that loads in 10 seconds (Portent, 2022).
While data caching has been part of the application stack for decades, the rise of globally distributed users and shorter attention spans puts more demands on it than ever before. There is a need for speed. Dragonfly is a performant cache that addresses these challenges.
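Because Dragonfly speaks the Redis wire protocol, existing Redis clients can point at it unchanged. Below is a minimal sketch using the redis-py client, assuming a Dragonfly instance is listening locally on the default Redis port (6379); the keys and values are illustrative.

```python
import redis

# Dragonfly is wire-compatible with Redis, so the standard redis-py client
# connects to it the same way it would to a Redis server.
# Host and port are assumptions for a local Dragonfly instance.
cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Common Redis data types and commands work as-is.
cache.set("user:42:profile", '{"name": "Ada"}', ex=300)                # string with a 5-minute TTL
cache.lpush("recent:searches", "dragonfly", "redis")                   # list
cache.hset("session:abc", mapping={"user_id": "42", "theme": "dark"})  # hash

print(cache.get("user:42:profile"))
print(cache.lrange("recent:searches", 0, -1))
print(cache.hgetall("session:abc"))
```

Swapping Redis for Dragonfly is then mostly a matter of pointing the connection string at the Dragonfly endpoint; no application-level changes are needed for the common commands it supports.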
GitHub Copilot X: The AI-Powered Developer Experience
GitHub Copilot is evolving to bring chat and voice interfaces, support pull requests, answer questions on docs, and adopt OpenAI’s GPT-4 for a more personalized developer experience. Less than two years since its launch, GitHub Copilot is already writing 46% of code and helps developers code up to 55% faster.
Why does this matter? GitHub Copilot is moving quickly to expand its product offering. With AI available at every step, teams can fundamentally redefine developer productivity. GitHub Copilot reduces boilerplate and manual tasks, making complex work easier across the developer lifecycle.
ChatGPT Plugins are tools designed specifically for language models with safety as a core principle, and help ChatGPT access up-to-date information, run computations, or use third-party services. Plugins can be “eyes and ears” for language models, giving them access to information that is too recent, too personal, or too specific to be included in the training data. In response to a user’s explicit request, plugins can also enable language models to perform safe, constrained actions on their behalf, increasing the usefulness of the system overall. The first plugins have been created by Expedia, FiscalNote, Instacart, KAYAK, Klarna, Milo, OpenTable, Shopify, Slack, Speak, Wolfram, and Zapier.
Why does this matter? Plugins connect OpenAI’s ChatGPT with the functionality and real-time data of other websites. This improves accuracy, relevancy, and recency, and it further enables LLM-powered automation workflows.
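Under OpenAI’s plugin model, a plugin is essentially a web service that exposes a manifest at /.well-known/ai-plugin.json plus an OpenAPI description of its endpoints; ChatGPT reads both and decides when to call the API on the user’s behalf. Here is a hypothetical sketch in Python with Flask — the service name, routes, and field values are illustrative placeholders, not one of the launch plugins:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical manifest: ChatGPT fetches this to learn what the plugin does
# and where its OpenAPI spec lives. All field values here are placeholders.
MANIFEST = {
    "schema_version": "v1",
    "name_for_human": "Todo Demo",
    "name_for_model": "todo_demo",
    "description_for_human": "Manage a simple todo list.",
    "description_for_model": "Plugin for adding and listing todo items for the user.",
    "auth": {"type": "none"},
    "api": {"type": "openapi", "url": "https://example.com/openapi.yaml"},
    "logo_url": "https://example.com/logo.png",
    "contact_email": "support@example.com",
    "legal_info_url": "https://example.com/legal",
}

TODOS: list[str] = []  # in-memory store, for illustration only


@app.get("/.well-known/ai-plugin.json")
def manifest():
    return jsonify(MANIFEST)


@app.get("/todos")
def list_todos():
    # ChatGPT calls this endpoint (described in the OpenAPI spec)
    # when the user asks to see their todos.
    return jsonify({"todos": TODOS})


@app.post("/todos")
def add_todo():
    # Constrained, explicit action performed on the user's behalf.
    item = request.get_json(force=True).get("todo", "")
    TODOS.append(item)
    return jsonify({"todos": TODOS})


if __name__ == "__main__":
    app.run(port=5003)
```

The model only learns about these endpoints through the OpenAPI spec referenced in the manifest, so the spec’s descriptions effectively become the instructions ChatGPT follows when deciding whether and how to call the plugin.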
📰 Content
Docker has pulled the plug on its free subscription Docker Hub plan for teams, and, in doing so, caused quite a bit of chaos in the open source community.
“Free team organizations are a legacy subscription tier that no longer exists. This tier included many of the same features, rates, and functionality as a paid Docker Team subscription,” Docker wrote to the users of the subscription plan, urging them to upgrade to the paid plan (about $300/year), which offers the same capabilities as the Free Team tier.
Why does this matter? While this change impacted less than 2% of users, Docker issued a public apology for how it communicated and executed the sunsetting of Docker “Free Team” subscriptions, which alarmed the open source community. Docker clarified that public images will only be removed from Docker Hub if their maintainers decide to delete them. This event underscores how critical it is to clearly articulate product changes.
Steve Yegge of Sourcegraph discusses the rise of LLMs in the context of developer tools. Sourcegraph ran an internal poll: do you have positive or negative sentiment about LLMs for coding? The options were Positive, Negative, and Meh. And lo, it was about ⅔ Meh or Negative (i.e., skeptics), which Yegge suspects is fairly representative of the whole industry. Yegge argues people are completely overlooking the fact that software engineering exists as a discipline because you cannot EVER, under any circumstances, TRUST CODE.
Why does this matter? Yegge’s piece is a light-hearted analysis of how LLM-enabled developer products can significantly improve developer productivity. He underscores that the winners in the AI space will have data moats, because the data moat is how you populate the context window (the “cheat sheet”).
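Yegge’s “cheat sheet” framing maps onto a simple pattern: retrieve the most relevant snippets from your proprietary data and prepend them to the prompt. The toy sketch below illustrates the idea only — retrieve_relevant_snippets, the tiny in-memory corpus, and the prompt format are hypothetical, not Sourcegraph’s implementation:

```python
def retrieve_relevant_snippets(question: str, k: int = 3) -> list[str]:
    # Hypothetical retrieval step: in practice this would query a code search
    # index or vector store built from your private data (the "data moat").
    corpus = {
        "auth": "def login(user, password): ...  # validates the SSO token",
        "billing": "def charge(customer_id, cents): ...  # idempotent charge",
        "deploy": "deploy.sh rolls out via blue/green behind the load balancer",
    }
    return [text for key, text in corpus.items() if key in question.lower()][:k]


def build_prompt(question: str) -> str:
    # Populating the context window: retrieved snippets become the
    # "cheat sheet" the model answers from.
    context = "\n\n".join(retrieve_relevant_snippets(question))
    return (
        "Use only the following code context to answer.\n\n"
        f"{context}\n\n"
        f"Question: {question}\n"
    )


print(build_prompt("How does billing charge a customer?"))
```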
Bill Gates states artificial intelligence is as revolutionary as mobile phones and the Internet. He claims advances in AI will enable the creation of a personal agent.
Why does this matter? Gates’ piece reinforces many people’s belief that AI is a game-changing technology that will transform our world. He notes AI will enable the creation of a personal agent. We also think this extends to virtual teammates with domain expertise.
💼 Jobs
⭐️Claypot — Founding Engineer (Infra)
⭐️Grit — Design Engineer
⭐️Speakeasy — Founding UX Lead