Feature store repositories emerge as an MLOps linchpin for advancing AI


A battle for control over machine learning operations (MLOps) is beginning in earnest as organizations embrace feature store repositories to build AI models more efficiently.

At its core, a feature store is a data warehouse through which developers can share and reuse the features and other artifacts that make up an AI model, or even an entire model that needs to be modified or extended. In concept, a feature store plays a role similar to the one a Git repository plays in enabling developers to build applications more efficiently by sharing and reusing code.

Early pioneers of feature store repositories include Uber, which built a platform dubbed Michelangelo, and Airbnb, which created a feature store dubbed Zipline. Neither of those platforms is available as open source code, however. Leading providers of feature store repositories trying to fill that void include Tecton, Molecula, Hopsworks, Splice Machine, and, most recently, Amazon Web Services (AWS). There is also an open source feature store project, dubbed Feast, that counts Google and Tecton among its contributors.
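To make the sharing-and-reuse concept concrete, the sketch below shows roughly how a team pulls features from a feature store for both model training and online serving. It is loosely modeled on the open source Feast SDK mentioned above; exact class and method names vary by Feast version, and the repository path, entity, and feature names used here are hypothetical.

```python
# A minimal sketch of retrieving shared features from a feature store,
# loosely modeled on the Feast SDK. Exact signatures vary by Feast version,
# and the repo path, entity, and feature names here are hypothetical.
import pandas as pd
from feast import FeatureStore

# The feature repository (entity and feature view definitions, data sources)
# is assumed to already exist at this path and to have been registered
# with `feast apply`.
store = FeatureStore(repo_path="feature_repo/")

# Training: pull point-in-time-correct historical features for labeled
# examples so every team trains against identical feature definitions.
entity_df = pd.DataFrame({
    "driver_id": [1001, 1002],
    "event_timestamp": pd.to_datetime(["2021-01-01", "2021-01-02"]),
})
training_df = store.get_historical_features(
    entity_df=entity_df,
    features=[
        "driver_hourly_stats:avg_trip_rating",
        "driver_hourly_stats:trips_per_hour",
    ],
).to_df()

# Serving: fetch the same feature values at low latency for online inference.
online_features = store.get_online_features(
    features=[
        "driver_hourly_stats:avg_trip_rating",
        "driver_hourly_stats:trips_per_hour",
    ],
    entity_rows=[{"driver_id": 1001}],
).to_dict()
```

The point of the pattern is that training and serving read from the same registry of feature definitions, which is what lets teams reuse one another's work instead of rebuilding data pipelines for every model.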

It can take a data science team six months or longer to construct a single AI model, so pressure to accelerate that process is building. Organizations not only want to build more AI models faster; models already deployed in production also need to be regularly updated or replaced as business conditions change.

Less clear right now, however, is to what degree feature store repositories represent a standalone category versus being a foundational element of a larger MLOps platform. As investment capital starts to pour into the category, providers of feature store platforms are trying to have it both ways.

Splice Machine, for example, offers a SQL-based feature store platform that organizations can deploy apart from its platform for managing data science processes. “It’s important to modularize the feature store so it can be used in other environments,” said Splice Machine CEO Monte Zweben. “I think you’ll see adoption of feature stores in both manners.”

Over time, however, it will become apparent that feature stores one way or another need to be part of a larger platform to derive the most value, he added.

Fresh off raising an additional $17.6 million in funding, Molecula is also positioning its feature store as a standalone offering in addition to being a foundation around which MLOps processes will revolve. In fact, Molecula is betting that feature stores, in addition to enabling AI models to be constructed more efficiently, will also become critical to building any type of advanced analytics application, said Molecula CEO H.O. Maycotte.

To achieve that goal, Molecula built its own storage architecture to eliminate all the manual copy-and-paste processes that make building AI models and other types of advanced analytics applications so cumbersome today, he noted. “It’s not just for MLOps,” said Maycotte. “Our buyer is the data engineer.”

Tecton, meanwhile, appears to be more focused on enabling the creation of a best-of-breed MLOps ecosystem around its core feature store platform. “Feature stores will be at the center of an MLOps toolchain,” said Tecton CEO Mike Del Balso.

Casting a shadow over each of these vendors, however, are cloud service providers that will make feature store repositories available as a service. Most AI models are trained on a public cloud because of the massive amounts of data involved and the cost of the graphics processing units (GPUs) required. Adding a feature store repository to a cloud service that is already being employed to build an AI model is simply a logical extension.

However, providers of feature store platforms contend it’s only a matter of time before MLOps processes span multiple clouds, at which point many enterprise IT organizations will standardize on a feature store repository that makes it simpler to share AI models and their components across clouds.

Regardless of how MLOps evolves, the need for a centralized repository for building AI models has become apparent. The issue enterprise IT organizations need to address is which approach makes the most sense today, because whatever feature store platform they select will shape their AI strategy for years to come.
