Use Snowflake online feature store in production¶
Snowflake ML Feature Store helps manage your features throughout the process of feature engineering.
For online applications that require low-latency inference, use the online feature store to serve your features.
The following sections walk through productionizing feature retrieval within your Python application. These sections have code examples that do the following:
Load the Iris dataset into Snowflake
Define the connection to Snowflake
Create the Feature Store and Feature Views
Retrieve the features and feature values
Generate predictions from your model
The code examples are written in Python. To go through this workflow for applications written in other languages, use a Snowflake driver that’s specific to that language. For more information, see Drivers.
Prerequisites¶
To run online ML feature retrieval in Snowflake, you need the following:
Data that you’ve already loaded into Snowflake
A Snowflake feature store
Feature views
Online feature serving enabled for each feature view
You can use features from your own Snowflake feature store. If you don't already have one, you can use the following code to load the Iris dataset into Snowflake.
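The following sketch loads the Iris dataset into a Snowflake table. The table name IRIS_DATA, the renamed column names, and the write_iris_to_snowflake helper are illustrative, and an existing Snowpark session with valid credentials is assumed.

```python
import pandas as pd
from sklearn import datasets


def load_iris_df() -> pd.DataFrame:
    """Build a pandas DataFrame of the Iris dataset with an ID join key."""
    iris = datasets.load_iris(as_frame=True)
    df = iris.frame.rename(
        columns={
            "sepal length (cm)": "SEPAL_LENGTH",
            "sepal width (cm)": "SEPAL_WIDTH",
            "petal length (cm)": "PETAL_LENGTH",
            "petal width (cm)": "PETAL_WIDTH",
            "target": "TARGET",
        }
    )
    # The ID column is the join key the entity uses later.
    df.insert(0, "ID", list(range(len(df))))
    return df


def write_iris_to_snowflake(session, table_name: str = "IRIS_DATA") -> None:
    """Write the Iris data to a Snowflake table (table name is an example)."""
    session.create_dataframe(load_iris_df()).write.mode("overwrite").save_as_table(
        table_name
    )
```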
After you have the data in your environment, you create the feature store. The following code creates a feature store and the id_entity entity for the different samples from the Iris dataset.
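A minimal sketch of that setup, assuming an existing Snowpark session; the database (ML_DB), feature store name (IRIS_FEATURE_STORE), and warehouse (ML_WH) are placeholders you should replace with your own.

```python
def create_feature_store(
    session,
    database: str = "ML_DB",
    name: str = "IRIS_FEATURE_STORE",
    warehouse: str = "ML_WH",
):
    """Create (or open) a feature store and register the id_entity entity."""
    # Imported inside the function so the sketch can be inspected without
    # snowflake-ml-python installed.
    from snowflake.ml.feature_store import CreationMode, Entity, FeatureStore

    fs = FeatureStore(
        session=session,
        database=database,
        name=name,  # the feature store is a schema in the database
        default_warehouse=warehouse,
        creation_mode=CreationMode.CREATE_IF_NOT_EXIST,
    )
    # id_entity keys each Iris sample by its ID column.
    id_entity = Entity(name="id_entity", join_keys=["ID"])
    fs.register_entity(id_entity)
    return fs
```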
Note
Snowflake ML Feature Store has the concept of entities. Entities are keys that organize features between feature views. For more information about entities, see Working with entities.
After you’ve created the feature store, you define the feature views. The following code defines the sepal and petal feature views from the Iris dataset.
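A sketch of those feature view definitions, assuming the Iris data is in a table named IRIS_DATA; the feature view names, version "1", and refresh schedule are examples. Enabling online serving for each registered view is a separate step not shown here.

```python
def register_iris_feature_views(session, fs):
    """Define and register the sepal and petal feature views."""
    # Imported inside the function so the sketch can be inspected without
    # snowflake-ml-python installed.
    from snowflake.ml.feature_store import FeatureView

    id_entity = fs.get_entity("id_entity")

    # Each feature view selects the join key plus its feature columns.
    sepal_df = session.table("IRIS_DATA").select("ID", "SEPAL_LENGTH", "SEPAL_WIDTH")
    petal_df = session.table("IRIS_DATA").select("ID", "PETAL_LENGTH", "PETAL_WIDTH")

    sepal_fv = FeatureView(
        name="SEPAL_FEATURES",
        entities=[id_entity],
        feature_df=sepal_df,
        refresh_freq="1 day",  # refresh schedule is an example
    )
    petal_fv = FeatureView(
        name="PETAL_FEATURES",
        entities=[id_entity],
        feature_df=petal_df,
        refresh_freq="1 day",
    )
    return [
        fs.register_feature_view(feature_view=fv, version="1")
        for fv in (sepal_fv, petal_fv)
    ]
```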
Retrieve the feature values¶
After you’ve registered the feature views and enabled online feature serving for each feature view, you can have the feature values from each feature view served to your application.
To retrieve the feature values, you do the following:
Set up a connection to Snowflake
Create the session and Snowflake Feature Store objects that initialize when the application starts
Retrieve the features from your feature views
Create a prediction endpoint and get predictions from that endpoint
Important
You must install snowflake-ml-python>=1.18.0 into your application’s environment to use the Feature Store API.
To connect to Snowflake from your application, you must set up either a Programmatic Access Token (PAT) or key-pair authentication as an authentication method.
Configure the client¶
When you initialize your application, it must connect to the Snowflake ML Feature Store API and create the required Feature Store Python objects.
Use the following sections to configure your client’s connection to the Snowflake ML Feature Store API.
Configure a Programmatic Access Token (PAT)¶
Specify the following connection parameters to connect to Snowflake from your application:
schema - the name of the Snowflake feature store
database - the database containing the feature store schema
role - the role required to read from the feature store. For more information, see Provide access to create and serve online features.
password - your PAT
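A sketch of those parameters as a Snowpark connection dictionary; every value is a placeholder to replace with your own.

```python
# Connection parameters for PAT authentication; all values are placeholders.
CONNECTION_PARAMETERS = {
    "account": "<your_account_identifier>",
    "user": "<your_user>",
    "password": "<your_programmatic_access_token>",  # the PAT stands in for a password
    "role": "<feature_store_reader_role>",
    "database": "<database_containing_the_feature_store>",
    "schema": "<feature_store_schema>",
    "warehouse": "<your_warehouse>",
}
```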
Configure key-pair authentication¶
Specify the following connection parameters to connect to Snowflake from your application:
schema - the name of the Snowflake feature store
database - the database containing the feature store schema
role - the role required to read from the feature store. For more information, see Create and serve online features.
private_key_file - the private key file
private_key_file_pwd - the passphrase for the private key file
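A sketch of the same dictionary for key-pair authentication; every value, including the key file path, is a placeholder.

```python
# Connection parameters for key-pair authentication; all values are placeholders.
CONNECTION_PARAMETERS = {
    "account": "<your_account_identifier>",
    "user": "<your_user>",
    "private_key_file": "/path/to/rsa_key.p8",
    "private_key_file_pwd": "<private_key_passphrase>",
    "role": "<feature_store_reader_role>",
    "database": "<database_containing_the_feature_store>",
    "schema": "<feature_store_schema>",
    "warehouse": "<your_warehouse>",
}
```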
Create the session and Feature Store objects¶
After you’ve defined your connection parameters, you create the session and Feature Store objects that your application uses to connect to Snowflake.
The following code:
Creates the Snowflake Session, the client that your application uses to communicate with Snowflake.
Configures a thread pool executor to enable feature retrieval parallelism.
Lists the features to retrieve from the feature store.
Initializes the feature store reader client. This object wraps the Snowflake session. It’s the main way your application interacts with the feature store.
Initializes the feature views that you’ve defined. You can replace these with your own features.
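A sketch of that startup logic. The feature names, feature view names, versions, and worker count are examples; here the FeatureStore object plays the role of the reader client that wraps the session.

```python
from concurrent.futures import ThreadPoolExecutor

# Feature names to retrieve; replace with your own feature columns.
FEATURES = ["SEPAL_LENGTH", "SEPAL_WIDTH", "PETAL_LENGTH", "PETAL_WIDTH"]

# One worker per feature view lets the lookups run in parallel.
EXECUTOR = ThreadPoolExecutor(max_workers=2)


def init_feature_store(connection_parameters: dict):
    """Create the Session and FeatureStore objects once, at application startup."""
    # Imported inside the function so the sketch can be inspected without
    # snowflake-ml-python installed.
    from snowflake.ml.feature_store import CreationMode, FeatureStore
    from snowflake.snowpark import Session

    session = Session.builder.configs(connection_parameters).create()
    fs = FeatureStore(
        session=session,
        database=connection_parameters["database"],
        name=connection_parameters["schema"],
        default_warehouse=connection_parameters["warehouse"],
        creation_mode=CreationMode.FAIL_IF_NOT_EXIST,
    )
    # Replace the names and versions with your own feature views.
    sepal_fv = fs.get_feature_view(name="SEPAL_FEATURES", version="1")
    petal_fv = fs.get_feature_view(name="PETAL_FEATURES", version="1")
    return session, fs, [sepal_fv, petal_fv]
```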
Retrieve the online features on the serving path¶
After you’ve defined how the application initializes, you can create a prediction endpoint.
There are different ways to define how your application handles requests. The following Python code:
Defines the prediction endpoint in your application
Takes the keys from the JSON request
Uses the keys to retrieve the feature values from the feature views
Passes those feature values to the model
Gets the predictions from the model
Returns the predictions in the response
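A framework-agnostic sketch of that request flow, meant to be wired into your web framework's route handler. The "id" field name is an example, the feature_lookups callables stand in for your online feature store reads (one per feature view, which you could submit to a thread pool to run in parallel), and run_inference stands in for your model call.

```python
from typing import Any, Callable, Dict, List


def predict_handler(
    request_json: Dict[str, Any],
    feature_lookups: List[Callable[[Any], Dict[str, float]]],
    run_inference: Callable[[Dict[str, float]], Any],
) -> Dict[str, Any]:
    """Handle one prediction request.

    feature_lookups: one callable per feature view; each maps a key to that
        view's feature values (e.g., a wrapper around your online reader).
    run_inference: hypothetical model call, remote or in-process.
    """
    # Take the key from the JSON request (field name is an example).
    key = request_json["id"]

    # Retrieve the feature values from each feature view and merge them.
    features: Dict[str, float] = {}
    for lookup in feature_lookups:
        features.update(lookup(key))

    # Pass the feature values to the model and return its prediction.
    prediction = run_inference(features)
    return {"id": key, "prediction": prediction}
```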
The preceding code calls a hypothetical run_inference function. Your own inference function could get predictions from your model, regardless of whether it's hosted remotely or in application memory.
The prediction endpoint in the preceding code accepts a key and returns the prediction for that key. Your data might have multiple keys characterizing a single sample. The preceding code is meant to be an example that you can adapt to your own use case.