Published in Apex Haus

Sync Computing and the Future of Distributed Compute

Founded in 2020, Sync Computing is engineering a new paradigm in cloud orchestration.

Authored by: Chelsea Goddard, Stuart Farris, and Stephanie Wenclawski

Sync promises to optimize cloud resources at every step of computation.
  • Sync Computing aims to revolutionize how companies orchestrate their compute with the invention of the Sync Platform.
  • Their patented technology has led to a breakthrough in solving complex optimization problems.
  • With the development of their Optimization Processing Unit, which can outperform modern chip architectures, the team aims to disrupt the world's biggest industries, including scientific simulations, financial modeling, machine learning, and more.

Intro

When we think about modern industry, it is often in the context of services. A rideshare company makes it convenient for someone to quickly and efficiently order a car "on-demand." Major airlines increasingly offer opportunities to travel more cheaply given a specific set of constraints (time window, trip duration, etc.). Power utility companies continue to focus on dynamic pricing initiatives. The list goes on. Yet if we were to step back and ask the modern-day investor what a rideshare company, a major airline, and a power utility all have in common, we wonder how many would answer with the critical technical competency that underpins the success of each business: solving complex Combinatorial Optimization problems.

Problem

Combinatorial Optimization (CO) is a "subfield of mathematical optimization" that aims to use emerging mathematical techniques to solve "discrete optimization problems." Simply put, these optimization problems consist of "finding the optimal object given a set of many objects to choose from" [1]. For example, the rideshare company needs to pair customers hailing rides with active drivers in a way that minimizes the total distance traveled, so riders spend the least time in transit and drivers (or autonomous vehicles, depending on your level of optimism) cover the fewest miles. The airline needs to ensure every flight has a full crew of pilots, co-pilots, and cabin personnel, each with different home bases, flight time limitations, and international work regulations. The power utility must distribute incoming electrons from a variety of sources (photovoltaics, wind turbines, nuclear plants, etc.) to millions of customers eager to flip the switch on their A/C unit or plug in their electric vehicle. Optimization problems like these all fall under the umbrella of CO and, arguably, play a critical role within every sector of the modern economy. The company that solves its flavor of CO problems more accurately and more efficiently than its competitors will almost always be more productive and more profitable.

Infographic: the Traveling Salesman Problem, a classic NP-hard problem in combinatorial optimization.
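To make the rideshare example above concrete, here is a minimal sketch (in Python, with a made-up distance matrix) that solves a tiny rider-driver pairing problem by brute force. The point is less the code than the scaling: the number of candidate assignments grows factorially, which is why exact search breaks down quickly as the problem grows.

from itertools import permutations

# Hypothetical distances (miles): distance[r][d] is the distance from
# rider r's pickup point to driver d's current location.
distance = [
    [4, 9, 2, 7],
    [6, 3, 8, 5],
    [1, 7, 4, 9],
    [8, 2, 6, 3],
]

def best_assignment(distance):
    """Exhaustively search every rider-to-driver assignment.

    With n riders and n drivers there are n! assignments to check:
    24 for n = 4, but roughly 3 x 10^64 for n = 50.
    """
    n = len(distance)
    best_cost, best_pairing = float("inf"), None
    for drivers in permutations(range(n)):  # drivers[r] serves rider r
        cost = sum(distance[r][d] for r, d in enumerate(drivers))
        if cost < best_cost:
            best_cost, best_pairing = cost, drivers
    return best_cost, best_pairing

print(best_assignment(distance))  # -> (9, (2, 1, 0, 3))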

Since CO challenges critically impact operations management, a significant research effort in both industry and academia is dedicated to improving how we conceptualize, approach, and ultimately solve them. The conventional approach is to harness the latest and greatest hardware (CPU or GPU chips) coupled with an intelligent new algorithm. Today, entire services departments at SaaS companies exist on the sole promise of solving your CO problem faster than ever, often failing to do so. One of the main causes of this gap (between promise and actualization) is that algorithms can only approximate the solution, and problems continue to scale faster than modern chip architectures can keep up. This constraint is not new and is commonly framed in public discourse in terms of Moore's Law.

With CO problems, we are often trying to choose the best object from a large group of objects, and the only way to do this is by directly comparing those objects to one another. Herein lies the rate-determining step of CO problems: the speed at which we can compare objects. That is why solving CO problems is limited by the number of transistors we can fit onto traditional integrated circuits (i.e., CPUs and GPUs). Despite the voracious appetite of companies to solve larger and more complex problems, even the best algorithmic solutions are limited by the hardware they run on. Put another way, even with the number of transistors on modern chips doubling every two years, traditional chip hardware is not improving fast enough.
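A rough back-of-the-envelope illustration of that gap, under the simplistic assumption that brute-force runtime scales with the number of candidate solutions: the tour count for the Traveling Salesman Problem grows factorially, while a two-year transistor-doubling cadence only buys one factor of two at a time. The numbers below are purely illustrative.

import math

# Illustrative only: distinct undirected tours for an n-city TSP instance,
# and how many 2x hardware doublings (~2 years each under Moore's Law)
# it would take to grow capacity by the same factor.
for n in (10, 15, 20, 25):
    tours = math.factorial(n - 1) // 2
    doublings = math.log2(tours)
    print(f"{n:>2} cities: {tours:.2e} tours "
          f"(~{doublings:.0f} doublings, ~{2 * doublings:.0f} years)")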

Solution

Enter Suraj Bramhavar, Jeffrey Chou, and their ambitious startup Sync Computing. Rather than waiting for traditional integrated circuits to catch up to Moore's predictions, these MIT alumni are forgoing traditional integrated circuit architectures in favor of their own chip, dubbed the Optimization Processing Unit (OPU). The OPU is specifically designed with combinatorial optimization problems in mind, "where the number of possible combinations is too large for traditional computers to search through efficiently." In an abstract sense, they are creating a chip that mimics the interconnected graphs we use to visualize problems like the Traveling Salesman Problem; "this new paradigm draws inspiration from nature, where complex systems routinely sift through an endless sea of combinations to find satisfying solutions."

As CO problems scale to non-trivial sizes, the performance of the OPU overtakes that of CPUs and GPUs, solving problems in orders of magnitude less time. This is illustrated in Chou and Bramhavar's Nature article, "Analog Coupled Oscillator Based Weighted Ising Machine," where they compare the performance of their prototype against a GPU on a common CO benchmark, the Max-Cut problem. Based on their proof of concept, the system successfully solves random Max-Cut problems with binary weights with 98% success probability, and it can be scaled using existing low-cost hardware that lends itself especially well to an integrated circuit implementation. While similar approaches to solving CO problems have been attempted with quantum computers, scaling quantum computation beyond the laboratory has proven exceptionally difficult. In contrast, Sync aims to act as a layer between existing computational frameworks by leveraging inexpensive off-the-shelf electronic components.

Figure from the paper: (a) Circuit diagram of the LC oscillator circuit; the input coupling is inserted at the gates of transistors M3 and M4. (b) Multiply-and-accumulate crossbar array composed of digital potentiometers.
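For intuition on the benchmark itself, here is a small software sketch of Max-Cut in its Ising-style form: assign each node a spin of +1 or -1 and try to maximize the number of edges whose endpoints disagree. Simulated annealing stands in for the analog oscillator dynamics here; this is not Sync's hardware or algorithm, and the graph and parameters below are made up.

import math
import random

# Toy Max-Cut instance: two triangles (0-1-2 and 3-4-5) joined by two edges.
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 4), (3, 4), (3, 5), (4, 5)]
n = 6

def cut_size(spins):
    # An edge is "cut" when its endpoints carry opposite spins (+1 vs. -1).
    return sum(1 for u, v in edges if spins[u] != spins[v])

def anneal(steps=5000, t_start=2.0, t_end=0.01, seed=0):
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    best, best_spins = cut_size(spins), spins[:]
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(n)
        before = cut_size(spins)
        spins[i] = -spins[i]                               # propose a spin flip
        delta = cut_size(spins) - before
        if delta < 0 and rng.random() >= math.exp(delta / t):
            spins[i] = -spins[i]                           # reject a worsening flip
        elif cut_size(spins) > best:
            best, best_spins = cut_size(spins), spins[:]
    return best, best_spins

print(anneal())  # best cut size found and one partition achieving it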

Market & Strategy

High-Level Overview

The overall market is the $300 billion cloud ecosystem, where players compete on cost, performance, and services. The global cloud computing market size is expected to grow from USD 371.4 billion in 2020 to USD 832.1 billion by 2025, at a Compound Annual Growth Rate (CAGR) of 17.5% during the forecast period. The team plans to sell directly first to organizations focused on scientific computing ($1 billion), then data analytics ($10 billion), and lastly web services ($50 billion).
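As a quick sanity check on those projections (a back-of-the-envelope sketch using only the figures quoted above), the standard CAGR formula, (end/start)^(1/years) - 1, does reproduce the cited 17.5%:

# Figures from the paragraph above: USD 371.4B (2020) growing to USD 832.1B (2025).
start, end, years = 371.4, 832.1, 5

implied_cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {implied_cagr:.1%}")       # ~17.5%

projected = start * (1 + 0.175) ** years         # compound 17.5% for 5 years
print(f"Projected 2025 size: {projected:.1f}B")  # ~831.8B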

Sync's long-term plan & TAM growth as of 2020.

In-depth analysis

A SWOT analysis of the cloud computing ecosystem helps clarify its market dynamics. Starting with the strengths, the following can be observed:

  1. Cloud storage enables enterprises to reduce costs. Hosting data on premises is expensive, and cloud storage cuts infrastructure and storage costs; organizations can also benefit from the pay-as-you-go model.
  2. Cloud adoption is expected to keep increasing in importance as the work-from-home trend brought on by COVID-19 continues. Cloud technologies expanded rapidly during the pandemic, and that expansion is expected to continue as businesses work to sustain enterprise business functions.

The cloud computing ecosystem also exposes a weakness that market entrants need to be aware of:

  1. Regulatory and compliance requirements present a challenge to market participants. With rising data security concerns, additional measures must be taken to protect enterprise data. Regulatory and compliance obligations are increasingly complex and must be met to avoid financial penalties.

Turning to opportunities in cloud computing, the following can be realized:

  1. The trend toward adopting cloud computing services continues. A hybrid approach allows enterprises with existing infrastructure to move toward cloud services incrementally.
  2. Growth opportunities will be realized in APAC. Much of the cloud computing market is currently dominated by North America; skilled labor and a focus on smaller organizations (small and medium-sized enterprises, or SMEs) will help expansion in the APAC region, where public cloud services have already gained substantial traction.
  3. Acquisitions and product launches present a lucrative opportunity. With increased agility in the market already expected to drive cloud computing, additional products are continually launching, and these launches are expected to facilitate acquisitions over the next five years. An example is IBM's acquisition of Red Hat, announced in October 2018, which gave IBM access to a larger customer base and a more extensive cloud portfolio.

Finally, awareness of threats should guide organizational decisions. Threats to the cloud computing market include:

  1. Cyberattacks are ever increasing and can result in business losses and shutdowns. The rapid rise in cyberattacks hurts business operations and causes critical data loss.
  2. A few large players hold a substantial share of the market. A new entrant that does not sell its technology to one of these large players will see a significant impact on its projections and its prospects in the market.

The high-performance computing market was valued at $29.37B in 2017 and is expected to reach $43.97B by 2023 (a CAGR of 12.28%). The data analytics market is expected to grow at a CAGR of 30.08% and reach an estimated $77.64B by 2023. Key players in this market include Microsoft Corporation, Amazon Web Services, SAP, Oracle Corporation, Dell Inc., SAS Institute Inc., Alteryx, Inc., Looker Data Sciences, Inc., Datameer Inc., and IBM Corporation. Web services providers are currently battling for market share, and artificial intelligence, analytics, IoT, and edge computing will be differentiators among the top cloud service providers. A multi-cloud theme is being promoted among legacy vendors that have created platforms able to plug into multiple clouds. Amazon Web Services alone accounted for $35.026B of the web services market in 2019, with a CAGR of 41% over the past seven years, followed closely by Microsoft Azure and Google Cloud Platform. Notably, Amazon Web Services and Microsoft are huge players in this space and present a large threat to market entrants. Given more time, the next step in this market analysis would be to further confirm the CAGRs for the scientific computing, data analytics, and web services markets. With that information, we would confirm that the order of market entry makes sense by completing a market sizing and comparing current market sizes to the total addressable market (TAM).

Go-to-market Strategy Recommendations

Looking at the CAGRs and TAMs for the various market segments, our recommendation is that Sync reorder the priority in which it sells directly to organizations, beginning with data analytics, followed by web services, and lastly scientific computing. Data analytics has a high CAGR, and Sync predicts it could capture a good portion of that market. In the web services space, the CAGR for the leader, Amazon Web Services, is higher at 41%, but given that Amazon's 2019 market share was about $35B and Sync predicts it can capture $50B of the market, that target may be a stretch; starting with smaller organizations as a point of entry would be a better approach. Since the CAGR for scientific computing is roughly 12% and Sync believes it can capture only about 4% of that market, starting with such a small market size and comparatively low CAGR may not be the best path to a return on investment.

Since the cloud computing ecosystem is dominated by a few large giants, we would further recommend that Sync license its IP, initially targeting the market leaders. With a market dominated by a few concentrated players and little fragmentation, the cost of entry is high, and capturing a sizable market share will be difficult. By licensing to large players such as Amazon, even capturing a sliver of Amazon's revenue could be profitable. This approach should be tested and further built out by studying how direct-to-consumer original equipment manufacturers have gone to market in this space in the past. For example, when approaching a player like Amazon, the go-to-market strategy should convey the different pain points Sync is attempting to solve and provide a value proposition and message for each pain point.

Closing Thoughts

As The Engine writes, “in an era where the demand for computing power continues to increase with no end in sight, Sync Computing is poised to meet the need by unlocking the true potential of the cloud”. We look forward to watching this team evolve over the coming years!

Thank you for reading! This story was authored independently of Sync Computing and Sync's investors and represents the thoughts and opinions of the named authors.
