What we’re building: Foundations

Since joining Dessa earlier this year as the team’s Technical Product Specialist, I’ve been helping the company transform lessons we’ve learned deploying machine learning for enterprises into a platform purposefully built for engineering real-world AI systems. That platform is called Foundations, and we’re excited to finally share more about what we’ve been building.

So what is Foundations? In a nutshell, it’s an AI platform that:

  • Enables enterprise teams to scale AI’s value throughout every layer of their organizations
  • Helps machine learning teams deliver results more efficiently and effectively, with tools that streamline and augment each part of the machine learning workflow
  • Provides tools that offer a balance between standardization and flexibility for real-world AI deployments

Fuelling Dessa’s mission

Foundations is closely linked to Dessa’s mission, which hasn’t changed since the company deployed its first deep learning model more than two-and-a-half years ago. The company started with the view that enterprises — big companies — are the perfect places to build AI because they have lots of data and lots of challenging, exciting problems. This still rings true today, more than ever.

Excited to unlock AI’s real-world potential, our team’s on a mission to help enterprises build AI solutions that solve challenging business problems. In tandem, we’re striving to empower enterprises with technologies and organizational processes that lay the groundwork required for long-term results with AI.

A key way we’ve accomplished this so far is by designing and deploying customized AI solutions, helping enterprises build their machine learning capabilities from the ground up. By collaborating with enterprise teams, we’ve already helped to identify hundreds of AI use cases designed to solve specific business challenges. Our team of machine learning engineers has also worked closely with enterprise teams to put a diverse range of AI systems into production.

Since deployment, our models have already saved businesses millions of dollars, freed up more time for employees to work on important problems, and overhauled customer experiences for the better. In our spare time, the team’s also been able to work on some very cool projects, helping astronomers identify supernovae with deep learning and mentoring UofT’s self-driving car engineering team, to name only a few examples.

Working on these projects has meant encountering first-hand many of the challenges that stand in the way of successful real-world AI deployments. Learning how to overcome these obstacles the hard way, the team has emerged on the other side, transforming our lessons into tools that make the process of engineering AI for enterprise more streamlined, rigorous and impactful.

The motivation behind Foundations

Square peg, round hole

We started building Foundations because the tools available at the time were not well-adapted to the kinds of work we were doing in our enterprise deployments. Even though we had huge ambitions for what we could help enterprises achieve with AI, we soon realized existing frameworks wouldn’t help us get to where our team and our clients were headed.

The lack of purpose-built tools for engineering enterprise-grade AI became painfully clear every time we set out on a new client project. While it was by no means impossible to get the job done, each time we embarked on a new project, we had to build everything more or less from scratch. This took a long time — time we could have used much more effectively when helping many of our clients build their first AI systems.

More than just models

Another reason we’re building Foundations is that machine learning involves many more components than just writing code. Many of these components fall outside AI experts’ core strengths.

As this illustration shows, there are many moving parts to real-world machine learning solutions. The code makes up only a small part of it. (Image adapted from here)

Even though machine learning talent can get these tasks done, it doesn’t make sense for them (a resource already in scarce supply) to spend more time than necessary on work that has more to do with software development than with training and testing models. With Foundations, we’ve built tools that streamline and automate many of the software engineering tasks associated with deploying AI solutions.

Using these tools in our collaborations with clients, our machine learning engineers have already been able to redirect time that would otherwise be spent on these tasks towards the parts of real-world AI that are most challenging, exciting and impactful.

From alchemy to a science

We’re also building Foundations to provide enterprise teams better ways of scaling AI with tools that standardize parts of the machine learning workflow. The field is still in the early phases of realizing AI through research and application, and most results in machine learning to date have been produced by an array of highly individualized, kitchen-sink approaches.

Before Foundations, it was also easy for our machine learning engineers to operate like this. For every successful AI solution put into production, there are thousands of experiments involved, with efforts centred on finding the best algorithms, data and infrastructure that would optimize our models’ performance. With no standardized processes for engineering AI, each engineer tended to have their own way of doing things and had to manually keep track of their many different experiments’ results. Foundations provides tools that allow machine learning teams to standardize and automate the way these experiments are managed, saving a lot of time and allowing for much higher levels of transparency and collaboration.
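Foundations’ actual interface isn’t shown here, but as a minimal sketch of what standardized experiment management can look like, consider logging every run’s hyperparameters and metrics to a shared, machine-readable store instead of each engineer’s personal notes (the function names and record format below are illustrative assumptions, not Foundations’ API):

```python
import json
import time
import uuid
from pathlib import Path

def log_experiment(params, metrics, log_dir="experiments"):
    """Record one experiment's hyperparameters and results to a shared log.

    A single, append-only record format makes every engineer's runs
    searchable and comparable, instead of living in personal notebooks.
    """
    record = {
        "id": uuid.uuid4().hex[:8],
        "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
        "params": params,
        "metrics": metrics,
    }
    path = Path(log_dir)
    path.mkdir(exist_ok=True)
    (path / f"{record['id']}.json").write_text(json.dumps(record, indent=2))
    return record["id"]

def best_experiment(metric, log_dir="experiments"):
    """Return the logged record with the highest value for `metric`."""
    records = [json.loads(p.read_text()) for p in Path(log_dir).glob("*.json")]
    return max(records, key=lambda r: r["metrics"].get(metric, float("-inf")))
```

With a convention like this in place, comparing thousands of runs becomes a query over a shared store rather than a manual bookkeeping exercise, which is the kind of transparency and collaboration described above.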

The need for purpose-built tools to produce AI at scale in large organizations has long been recognized by tech’s biggest companies. Companies like Uber and Facebook have produced their own internal machine learning platforms to help their engineers forge ahead with AI applications, gaining competitive advantage in the process. If you look around, Big Tech’s investment in these tools (and correspondingly, in their talent) has helped these companies deliver the few truly transformative applications of AI the world has seen to date.

To scale AI successfully, enterprise teams will need the same kinds of purpose-built tools. Right now, 99% of enterprises don’t have them.

Traditional analytics platforms like SAS, which lack the functionality required for producing cutting-edge applications using techniques like deep learning, are often employed to put machine learning into production in enterprises today. Without new tools, the talent employed to deploy machine learning will be severely limited in how much value they can unlock for their organizations. Products like Foundations will help enterprises empower their teams to unlock as much value as possible, giving them access to standardized yet flexible tools for putting advanced AI models into production.

Written by Ashwin Jiwane

Ashwin is Dessa’s Technical Product Specialist, and is leading the development of our AI platform Foundations. Before joining our team, he was one of the first engineering employees at PagerDuty, where he helped grow the company to its current valuation of $1.3 billion. Originally from India, Ashwin first cultivated his technical training at the Indian Institute of Technology in Mumbai.

Want to learn more about Foundations?

Curious to know more about how our product can help your enterprise team scale AI? Discover more details about Foundations and how to get a demo on our website.

We’re also always looking to add experienced and passionate Software Engineers to the Foundations team. If this sounds like you, get in touch and tell us how you’d like to help us build an amazing team and an amazing product.