Unlock the Power of Machine Learning with Feature Store and Vertex AI

Table of Contents

  • Introduction
  • Background
  • Machine Learning Infrastructure
    • Overview
    • Core Function
    • Supporting Components
    • Automation of Execution
    • Version Control
    • Monitoring
  • Pros and Cons of Using Feature Store
  • Future Initiatives
  • Conclusion


Introduction

In this article, we explore the concept of machine learning infrastructure built on a feature store and Vertex AI. We discuss the background that led to the creation of this infrastructure, its implementation, and future initiatives. The platform aims to address the challenges of developing and operating multiple machine learning models cost-effectively. We also examine the benefits and drawbacks of using a feature store in this context. So, let's dive in and explore the world of machine learning infrastructure.


Background

Various machine learning models have been developed to prevent abuse of services, such as fraud detection models and chargeback models. However, the rapid evolution of fraud methods demands a fast pace of model development and updates. As a result, the number of features required for these models has grown significantly over time. The data sources for these features are disparate, including BigQuery tables, files on GCS (Google Cloud Storage), and spreadsheets. To manage the complexity of maintaining multiple models and reusing features, a robust machine learning infrastructure is essential.

Machine Learning Infrastructure


Overview

The machine learning infrastructure platform consists of three main components: the data sources, the feature store, and Vertex AI. The data sources supply information from various data stores, which is then stored and managed in the feature store. Vertex AI is used to train the models and serve inference, and the model outputs are stored for subsequent use.

Core Function

The core function of the machine learning infrastructure platform is feature acquisition, model training, and inference. The feature store serves as a bridge between the data sources and the models, providing the features needed for both training and inference, while Vertex AI runs the training and inference workloads.
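The bridge role the feature store plays can be illustrated with a minimal, purely hypothetical in-memory sketch. No real store or Vertex AI calls are made; all names here are invented for illustration — the key idea is only that training reads historical (offline) feature rows while inference reads the freshest (online) values:

```python
from datetime import datetime

# Hypothetical in-memory stand-in for the feature store: the real platform
# uses a managed store, but the retrieval pattern is the same.
OFFLINE_STORE = [  # historical feature rows, keyed by entity and timestamp
    {"user_id": "u1", "ts": datetime(2023, 1, 1), "txn_count_7d": 3, "label": 0},
    {"user_id": "u2", "ts": datetime(2023, 1, 1), "txn_count_7d": 42, "label": 1},
]
ONLINE_STORE = {  # latest feature values per entity, for low-latency inference
    "u1": {"txn_count_7d": 5},
    "u2": {"txn_count_7d": 40},
}

def get_historical_features(feature_names):
    """Offline retrieval: build a training set of (features, label) pairs."""
    return [([row[f] for f in feature_names], row["label"]) for row in OFFLINE_STORE]

def get_online_features(user_id, feature_names):
    """Online retrieval: fetch the freshest features for one entity."""
    return [ONLINE_STORE[user_id][f] for f in feature_names]

# Training reads from the offline store; inference reads from the online store.
train_set = get_historical_features(["txn_count_7d"])
features = get_online_features("u2", ["txn_count_7d"])
```

Because both paths pull from the same store, the features a model sees at inference time are defined and computed the same way as the ones it was trained on.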

Supporting Components

In addition to the core function, the platform includes several supporting components: version control, automation of execution, and monitoring. Version control manages multiple models and components by separating them into different versions. Automation of execution allows ML pipelines to run on demand or on a schedule. Monitoring keeps continuous watch over the system and the models to maintain performance.

Automation of Execution

Automation of execution is a crucial aspect of the machine learning infrastructure. It significantly reduces operational costs and allows new models to be developed and released quickly. The executor component runs ML pipelines automatically at the touch of a button, with Cloud Run handling on-demand execution and Cloud Scheduler handling periodic execution.
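A minimal sketch of such an executor, with all specifics assumed for illustration: in production this would be a Cloud Run service whose handler submits a Vertex AI pipeline job (and which Cloud Scheduler invokes on a cron schedule), but here the submission is stubbed so only the dispatch logic is shown:

```python
import json

# Hypothetical pipeline registry; the bucket path is illustrative only.
PIPELINES = {"fraud_detection": "gs://example-bucket/pipelines/fraud_detection/pipeline.json"}

def submit_pipeline_job(spec_uri, params):
    # Stub for submitting a Vertex AI pipeline job; returns a fake job id.
    return f"job-{abs(hash((spec_uri, json.dumps(params, sort_keys=True)))) % 10000}"

def handle_request(body: str) -> dict:
    """Entry point invoked on demand (manual call) or periodically (Cloud Scheduler)."""
    req = json.loads(body)
    name = req.get("pipeline")
    if name not in PIPELINES:
        return {"status": 404, "error": f"unknown pipeline {name!r}"}
    job_id = submit_pipeline_job(PIPELINES[name], req.get("params", {}))
    return {"status": 200, "job_id": job_id}

resp = handle_request('{"pipeline": "fraud_detection", "params": {"version": "v3"}}')
```

The same handler serves both triggers: a human presses the button (an ad-hoc HTTP call) or Cloud Scheduler fires the identical request on a schedule.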

Version Control

Version control is another vital component of the machine learning infrastructure platform. It allows for the management of different versions of pipelines and components, ensuring seamless updates without affecting other models. By separating versions on Google Cloud Storage (GCS), different versions of common components and pipelines can coexist, making it easier to update and modify the system.
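The version-separation idea on GCS can be sketched as follows. The bucket layout, component names, and pin table are all hypothetical; the point is only that each model resolves its components through pinned, versioned prefixes, so two models can depend on different versions of the same component side by side:

```python
# Hypothetical layout: each common component lives under a versioned GCS
# prefix, so multiple versions coexist and updates never clobber older ones.
BUCKET = "gs://ml-infra-example"  # illustrative bucket name

def component_uri(name: str, version: str) -> str:
    return f"{BUCKET}/components/{name}/{version}/component.yaml"

# Each model pins the component versions it was built against.
MODEL_PINS = {
    "fraud_detection_v2": {"preprocess": "1.3.0", "train": "2.0.1"},
    "chargeback_v1": {"preprocess": "1.1.0", "train": "2.0.1"},
}

def resolve(model: str) -> dict:
    """Resolve a model's pinned components to concrete GCS URIs."""
    return {c: component_uri(c, v) for c, v in MODEL_PINS[model].items()}

uris = resolve("fraud_detection_v2")
```

Publishing a new component version then means writing to a fresh prefix and updating only the pins of the models that opt in.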


Monitoring

The monitoring component keeps continuous watch over the system and the models. By aggregating model performance metrics from BigQuery and sending real-time notifications via Slack, potential problems or anomalies can be identified and addressed quickly. This proactive approach helps maintain the optimal performance of the machine learning infrastructure.
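The aggregate-then-alert loop can be sketched with stubs. In production the rows would come from a BigQuery query and the alert would be posted to a Slack webhook; here both are stubbed, and the metric, model names, and threshold are illustrative assumptions:

```python
# Hypothetical monitoring check: only the thresholding logic is real here.
rows = [  # (model, daily precision) as a BigQuery result might return them
    {"model": "fraud_detection_v2", "precision": 0.94},
    {"model": "chargeback_v1", "precision": 0.71},
]

THRESHOLD = 0.80  # illustrative alerting threshold

def build_alerts(rows, threshold):
    """Flag any model whose metric fell below the threshold."""
    return [
        f":warning: {r['model']} precision dropped to {r['precision']:.2f}"
        f" (threshold {threshold:.2f})"
        for r in rows
        if r["precision"] < threshold
    ]

alerts = build_alerts(rows, THRESHOLD)
for msg in alerts:
    # post_to_slack(webhook_url, msg)  # stubbed: would POST to the Slack webhook
    print(msg)
```

Running this periodically (e.g. from the same Cloud Scheduler setup used for pipelines) gives the real-time Slack notifications described above.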

Pros and Cons of Using Feature Store

The use of a feature store in machine learning infrastructure has its benefits and drawbacks. Let's explore them in detail:


Pros:

  1. Centralized Management of Features: A feature store provides a centralized repository where features for machine learning models can be stored, managed, and accessed easily. This eliminates the need for separate data processing for each data source and allows features to be reused across multiple models, reducing development costs.

  2. Division of Labor: With a feature store, ML engineers can focus on developing models, while data engineers can focus on generating and managing features. This division of labor simplifies the development and operation processes, making them more efficient.

  3. Fast Development and Deployment: By reusing existing features and leveraging the feature service, models can be developed and deployed quickly. This rapid development cycle allows for faster response to evolving fraud methods and reduces time-to-market.

  4. Streamlined Data Drift Detection: The feature store can facilitate the detection of data drift by comparing the statistical distribution of features or labels between baseline data and real-time data. This helps to identify changes in data patterns and trigger retraining processes.
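The distribution-comparison idea in the last point can be sketched with the Population Stability Index (PSI), one common drift metric. The bin edges, the synthetic data, and the rule-of-thumb 0.2 alert threshold are all illustrative assumptions, not part of the platform:

```python
import math

def psi(baseline, recent, edges):
    """Population Stability Index between two samples over given bin edges."""
    def frac(sample, lo, hi):
        n = sum(1 for x in sample if lo <= x < hi)
        return max(n / len(sample), 1e-6)  # floor avoids log(0) on empty bins
    score = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        b, r = frac(baseline, lo, hi), frac(recent, lo, hi)
        score += (r - b) * math.log(r / b)
    return score

baseline = [0.1 * i for i in range(100)]       # stable historical feature values
shifted = [0.1 * i + 5.0 for i in range(100)]  # same feature, distribution moved up
edges = [0, 2.5, 5.0, 7.5, 10.0, 15.0]

drifted = psi(baseline, shifted, edges) > 0.2  # rule-of-thumb drift threshold
stable = psi(baseline, baseline, edges) > 0.2
```

A check like this, run over feature snapshots pulled from the store, is what would trigger the retraining process mentioned above.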


Cons:

  1. Compatibility with Open Source: Open-source feature store platforms such as Feast are actively evolving, and keeping up with frequent updates can be challenging. The lack of dedicated support and the alpha status of some features can also pose operational challenges.

  2. Integration with Existing Pipelines: Integrating a feature store into existing ML pipelines may require modifications to the existing workflow. Ensuring smooth integration and compatibility can be complex, especially if the pipelines and components are diverse and decentralized.

Future Initiatives

While the machine learning infrastructure built on a feature store and Vertex AI has already proven effective, there are still areas that can be improved and future initiatives to be pursued. The following are some of them:

  1. Extension of the Feature Store: The feature store can be extended to incorporate stream ingested data, such as Kafka, to enable real-time inference and improve responsiveness to evolving fraud methods. This requires compatibility with new features and ensuring stability in production environments.

  2. Automation of Data Drift Detection and Retraining: Implementing automation for data drift detection and retraining can enhance the system's adaptability. This includes incorporating monitoring functionality to detect data drift, notifying relevant stakeholders, and automating the retraining process.

  3. Continuous A/B testing: Adding continuous A/B testing capabilities can help evaluate model performance and facilitate seamless model updates. This involves deploying multiple models to a single endpoint and gradually increasing their traffic to monitor their impact and performance.

  4. Efficient Handling of Offline and Online Data: Optimizing the management and alignment of offline and online data is crucial to ensure the accuracy and timeliness of inference results. Developing efficient strategies to handle data collisions and duplications between offline and online stores will be a focus in the future.
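The continuous A/B testing initiative can be sketched as traffic splitting between two model versions behind one endpoint. Vertex AI endpoints support weighted traffic splits natively; this stub only illustrates the routing idea with a deterministic hash (so a given request id is always assigned the same model) and an assumed ramp-up schedule:

```python
import hashlib

def assign_model(request_id: str, challenger_share: int) -> str:
    """Route request_id to 'challenger' for challenger_share% of traffic."""
    bucket = int(hashlib.sha256(request_id.encode()).hexdigest(), 16) % 100
    return "challenger" if bucket < challenger_share else "champion"

# Hypothetical ramp schedule: start small, increase the new model's share
# only while the monitored metrics stay healthy.
for share in (5, 25, 50):
    routed = [assign_model(f"req-{i}", share) for i in range(1000)]
    # e.g. compare routed.count("challenger") metrics against the champion's
```

Hash-based assignment keeps each caller's experience consistent across requests, which makes the per-model performance comparison cleaner than random per-request routing.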


Conclusion

Machine learning infrastructure, powered by a feature store and Vertex AI, offers a robust and cost-effective solution for developing and operating multiple machine learning models. Centralized feature storage, automated pipeline execution, and effective monitoring all contribute to the efficiency and effectiveness of the system. While there are challenges and areas for improvement, the benefits of using a feature store outweigh the drawbacks. With future initiatives focused on extending feature store capabilities and further automation, the machine learning infrastructure will continue to evolve and adapt to the ever-changing landscape of fraud prevention.

