A Sneak Peek into Building our Automated Win-Loss Product: BRIDGEfunnel

Keshakishore · Published in BRIDGEi2i · Jan 28, 2020 · 5 min read

AI has been the buzzword of the decade, and a lot of inventions fall under its umbrella. This article gives you a gist of what happens behind such AI inventions and how we at BRIDGEi2i leveraged the Cloud to make AI a real possibility with BRIDGEfunnel.

BRIDGEfunnel is an AI-enabled automated win-loss analysis product, built using advanced machine learning techniques, that enables sales reps to meet their quota numbers. To start building this game-changing product, which delivers real-time recommendations at scale, we had to look at the problem area from four different personas:

1. Data Handler: Understands the data landscape.

2. Model Builder: Derives insights by using AI/ML models.

3. Data Distributor: Provides a doorway to access the generated insights.

4. Visualizer: Represents the insights via meaningful visualizations.

DATA HANDLER:

“Data is abundant”, and all the AI discoveries ever made revolve around it. For us to generate meaningful insights out of the data, we need meaningful information coming into the system, which in turn relies on our data collection and data processing methodologies.

Meaningful data in = Meaningful insights out

Our data came from multiple sources, both batch and real-time in nature: CRMs, CSVs, flat files, streams, etc., and its format varied across all of them.
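To make that variety concrete, here is a minimal sketch of normalizing records from a few such sources into one common shape. The schema and field names (opportunity_id, stage, and so on) are invented for illustration, not our actual data model:

```python
# Illustrative normalization of heterogeneous source records into one schema.
# All field names here are assumptions made for this sketch.
import csv
import json
from datetime import datetime, timezone

COMMON_FIELDS = ["opportunity_id", "account", "stage", "amount", "updated_at"]

def from_crm(record: dict) -> dict:
    """Map a CRM API payload (Salesforce-style keys) to the common schema."""
    return {
        "opportunity_id": record["Id"],
        "account": record["AccountName"],
        "stage": record["StageName"],
        "amount": float(record.get("Amount") or 0.0),
        "updated_at": record["LastModifiedDate"],
    }

def from_csv(path: str):
    """Yield rows of a flat-file export that already uses the common names."""
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            row["amount"] = float(row.get("amount") or 0.0)
            yield {k: row.get(k) for k in COMMON_FIELDS}

def from_stream(message: bytes) -> dict:
    """Decode one JSON event from a real-time stream, stamping arrival time."""
    event = json.loads(message)
    event.setdefault("updated_at", datetime.now(timezone.utc).isoformat())
    return {k: event.get(k) for k in COMMON_FIELDS}
```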

With these sources in mind, we drew a baseline for our data model, which:

1. Should integrate seamlessly with multiple source systems.

2. Should adapt to data changes.

3. Should ensure data integrity.

4. Should be highly available.

Obviously, we didn't want to reinvent the wheel, so the Cloud was our natural choice; by using managed services, we were able to eliminate the overhead of managing the infrastructure. We went with AWS as our cloud provider due to its vast catalogue of out-of-the-box services.

Figure 1 gives a high-level view of the technology choices for our data management layer. Since we were dealing with multiple data sources, having only a structured or only an unstructured database didn't suffice, so we went hybrid (structured + unstructured).

All the information that is core to our system and static in nature is stored in a relational database, while data that is dynamic or bound to change is stored in an unstructured database, as sketched below.
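Here is a minimal sketch of that routing, assuming PostgreSQL as the relational store and MongoDB as the unstructured one; the article specifies only "relational" and "unstructured", and the table/collection names are placeholders:

```python
# Hybrid storage sketch: static reference data -> relational store,
# fast-changing opportunity state -> document store. Store choices and
# names are illustrative assumptions.
import psycopg2   # pip install psycopg2-binary
import pymongo    # pip install pymongo

pg = psycopg2.connect("dbname=bridgefunnel user=app")   # relational: static/core data
mongo = pymongo.MongoClient()["bridgefunnel"]           # document DB: dynamic data

def save_account(account_id: str, name: str) -> None:
    """Core reference data: insert-or-update in the relational store."""
    with pg, pg.cursor() as cur:
        cur.execute(
            "INSERT INTO accounts (id, name) VALUES (%s, %s) "
            "ON CONFLICT (id) DO UPDATE SET name = EXCLUDED.name",
            (account_id, name),
        )

def save_opportunity_snapshot(doc: dict) -> None:
    """Frequently changing opportunity state: upsert the whole document."""
    mongo.opportunities.replace_one({"_id": doc["_id"]}, doc, upsert=True)
```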

To ensure that the algorithms could understand the data, a good amount of data transformation was required, and we chose Talend and Python scripts for the job. The choice was based entirely on their native integration with different data sources and the strength of their communities.

Next up is data security, which was one of our major considerations while dealing with customer data. We achieved it by masking and encrypting critical information using our customized algorithms before making it available over the network via a secure server.
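Our masking and encryption algorithms are customized, so as a stand-in, here is a hedged sketch of the same idea using the off-the-shelf `cryptography` package; the sensitive field list is an assumption for illustration:

```python
# Mask-and-encrypt sketch using Fernet as a stand-in for our custom algorithms.
from cryptography.fernet import Fernet  # pip install cryptography

SENSITIVE_FIELDS = {"contact_email", "contact_phone"}  # illustrative field list
fernet = Fernet(Fernet.generate_key())  # in practice, load the key from a secrets store

def mask(value: str) -> str:
    """Keep only the last 4 characters visible, e.g. for display/logging."""
    return "*" * max(len(value) - 4, 0) + value[-4:]

def protect(record: dict) -> dict:
    """Return a copy with sensitive fields masked and encrypted."""
    out = dict(record)
    for field in SENSITIVE_FIELDS & out.keys():
        out[field + "_masked"] = mask(out[field])
        out[field] = fernet.encrypt(out[field].encode()).decode()
    return out
```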

MODEL BUILDER:

This layer is the backbone of our product: it is built on advanced AI/ML models that deliver real-time recommendations to sales reps on the go. Since we deal with a lot of data, running our algorithms needs some serious compute power, so we do a batch run of the algorithms over all historical/incremental data on a regular basis using AWS Batch. We chose AWS Batch for its ability to process large volumes of data at scale without us worrying about the infra. This process is operationalized by creating a security-hardened Docker image that feeds into the batch layer via automated scripts. For real-time requests, we use SageMaker.
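A hedged boto3 sketch of the two serving paths follows; the job queue, job definition, and endpoint names are placeholders, not our real resource names:

```python
# Batch path (AWS Batch) and real-time path (SageMaker endpoint) via boto3.
import json
import boto3

batch = boto3.client("batch")
runtime = boto3.client("sagemaker-runtime")

def run_batch_scoring(run_date: str) -> str:
    """Submit the containerized batch run over historical/incremental data."""
    resp = batch.submit_job(
        jobName=f"winloss-scoring-{run_date}",
        jobQueue="winloss-job-queue",        # placeholder
        jobDefinition="winloss-scoring:1",   # placeholder Docker-based job definition
        containerOverrides={"environment": [{"name": "RUN_DATE", "value": run_date}]},
    )
    return resp["jobId"]

def score_realtime(features: dict) -> dict:
    """Get an on-the-fly recommendation from a SageMaker endpoint."""
    resp = runtime.invoke_endpoint(
        EndpointName="winloss-recommender",  # placeholder
        ContentType="application/json",
        Body=json.dumps(features),
    )
    return json.loads(resp["Body"].read())
```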

These jobs are monitored and scheduled using CloudWatch, which acts as a gatekeeper that starts/stops batch jobs and monitors the health of the running algorithms via flow logs.
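For example, a scheduled rule can kick off the nightly batch run. This sketch uses CloudWatch Events (now EventBridge); the rule name, cron expression, and ARNs are all illustrative placeholders:

```python
# Schedule the nightly batch run with a CloudWatch Events rule (placeholders throughout).
import boto3

events = boto3.client("events")

events.put_rule(
    Name="winloss-nightly-batch",
    ScheduleExpression="cron(0 2 * * ? *)",  # 02:00 UTC daily (assumed schedule)
    State="ENABLED",
)
events.put_targets(
    Rule="winloss-nightly-batch",
    Targets=[{
        "Id": "batch-scoring",
        "Arn": "arn:aws:batch:us-east-1:123456789012:job-queue/winloss-job-queue",
        "RoleArn": "arn:aws:iam::123456789012:role/events-batch-role",
        "BatchParameters": {
            "JobDefinition": "winloss-scoring:1",
            "JobName": "winloss-nightly",
        },
    }],
)
```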

DATA DISTRIBUTOR AND VISUALIZER:

Once the insights were made available in the datastore, we needed a resilient mechanism to deliver this information and an intuitive visualization layer to represent the discovered knowledge to hundreds of sales reps across different platforms.

So our design considerations for these two layers were:

  • They should go hand in hand, providing easy integration between the source system and the client.
  • Minimal network transfer.
  • Cross-platform UI.
  • Resilient.
  • Effective Visualizations.

After a lot of research, we chose GraphQL for data distribution; it satisfied the criteria set forth above and also offered other advantages such as offline capabilities, type checking, etc. For the client, we needed a technology that could be used to develop both web and mobile apps with minimal dev effort, and React/React Native provided exactly that. It allowed us to develop a cross-platform application that can be deployed to the web and to handheld devices without a lot of code change.
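To show how GraphQL keeps network transfer minimal, here is an illustrative query from a Python client: the client asks for exactly the fields it needs and nothing more. The endpoint URL and schema (recommendations, winProbability, etc.) are invented for this sketch:

```python
# A GraphQL client sketch; URL and schema are hypothetical.
import requests  # pip install requests

QUERY = """
query RepRecommendations($repId: ID!) {
  recommendations(repId: $repId) {
    opportunityId
    action
    winProbability
  }
}
"""

resp = requests.post(
    "https://api.bridgefunnel.example/graphql",  # placeholder endpoint
    json={"query": QUERY, "variables": {"repId": "rep-42"}},
    timeout=10,
)
resp.raise_for_status()
print(resp.json()["data"]["recommendations"])
```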

Our campaign and notification layer is powered by SNS, which let us go live in a matter of hours. For visualizations, we chose D3, as it provides greater flexibility to adapt to changes and also has great documentation online.
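The notification pattern is a single publish to an SNS topic, which fans out to email/SMS/push subscribers; in this sketch the topic ARN and message contents are placeholders:

```python
# Publish one message to an SNS topic and let SNS handle the fan-out.
import boto3

sns = boto3.client("sns")

sns.publish(
    TopicArn="arn:aws:sns:us-east-1:123456789012:winloss-alerts",  # placeholder
    Subject="New recommendation available",
    Message="Opportunity ACME-1042: schedule a pricing call this week.",
)
```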

FUTURE WORK:

AI keeps evolving, and so does our product. We're constantly improving our ML algorithms and user experience to better serve our customers. As the next step in our journey, we're upgrading our conversational AI so that it can better understand the user's context and provide personalized recommendations based on it. To know more, visit www.bridgefunnel.ai!
