Fundamentals

Building ML Systems the Right Way Using the FTI Architecture

The fundamentals of the FTI architecture that will help you build modular and scalable ML systems using MLOps best practices.

Paul Iusztin
Published in Decoding ML · 9 min read · Aug 10, 2024


Feature/Training/Inference (FTI) pipelines architecture

This article presents the feature/training/inference (FTI) architecture for building scalable and modular ML systems using MLOps best practices. The design was proposed by Jim Dowling, CEO of Hopsworks [1, 2].

We will start by discussing the problems that arise when ML systems are built naively, then examine other potential solutions and their shortcomings.

Finally, we will present the FTI design pattern and its benefits, including why a feature store and a model registry are valuable when architecting your ML system.
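To make the idea concrete before diving in, here is a minimal sketch (not from the article) of the FTI pattern: three independent pipelines that communicate only through a shared feature store and model registry. All names are hypothetical, and plain dictionaries stand in for real stores such as Hopsworks or an MLflow registry.

```python
# Hypothetical stand-ins for a feature store and a model registry.
feature_store: dict[str, list[dict]] = {}
model_registry: dict[str, dict] = {}

def feature_pipeline(raw_rows: list[dict]) -> None:
    """Transform raw data into features and write them to the feature store."""
    features = [{"x": r["value"] * 2.0, "label": r["label"]} for r in raw_rows]
    feature_store["my_feature_view"] = features

def training_pipeline() -> None:
    """Read features, 'train' a model, and push it to the model registry."""
    rows = feature_store["my_feature_view"]
    # Toy "model": a mean threshold — a real pipeline would fit an estimator.
    model = {"threshold": sum(r["x"] for r in rows) / len(rows)}
    model_registry["my_model"] = model

def inference_pipeline(value: float) -> int:
    """Load the latest model from the registry and serve a prediction."""
    model = model_registry["my_model"]
    return int(value * 2.0 > model["threshold"])

# The three pipelines run independently; only the stores connect them.
feature_pipeline([{"value": 1.0, "label": 0}, {"value": 3.0, "label": 1}])
training_pipeline()
print(inference_pipeline(2.5))
```

Because each pipeline touches only the shared stores, each one can be developed, scaled, and deployed on its own schedule — the core claim of the FTI design.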

Table of contents

  1. The problem with building ML systems
  2. The issue with previous solutions
  3. The solution: the FTI architecture
  4. Benefits of the FTI architecture

#1. The problem with building ML systems

Building production-ready ML systems is much more than just training a model. From an…


Senior ML & MLOps Engineer • Founder @ Decoding ML ~ Content about building production-grade ML/AI systems • DML Newsletter: https://decodingml.substack.com