Serving Models with TorchServe
Introduction
This tutorial demonstrates how to use TorchServe, a flexible tool for serving PyTorch models, on Neu.ro.
Before moving forward with the tutorial, make sure you have the Neu.ro CLI installed.
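If it is not installed yet, the CLI can typically be set up with pip; the package name below is an assumption and may differ depending on your platform version:

```bash
# Install the Neu.ro command-line client and log in to the platform
pip install -U neuro-cli
neuro login
```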
Preparing Local Files
First, clone the TorchServe repository:
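For example, cloning the official repository over HTTPS:

```bash
git clone https://github.com/pytorch/serve.git
```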
You can also use the following version of this command:
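One possible variation, shown here only as an illustration, is a shallow clone, which downloads less history if you only need the example files:

```bash
# Shallow clone: fetch only the latest revision of the repository
git clone --depth 1 https://github.com/pytorch/serve.git
```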
Then, copy the model you would like to serve to your local repository. In this example, we will use DenseNet-161:
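For instance, the pre-trained DenseNet-161 weights used throughout the TorchServe examples can be downloaded into the cloned repository:

```bash
cd serve
# Download the pre-trained DenseNet-161 weights referenced by the TorchServe examples
wget https://download.pytorch.org/models/densenet161-8d451a50.pth
```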
You can now copy the files from your local repository to the platform storage:
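A sketch of the upload, assuming the repository was cloned into ./serve and a storage:torchserve folder is used as the destination (both paths are assumptions):

```bash
# Recursively upload the local repository (including the model weights) to platform storage
neuro cp -r ./serve storage:torchserve
```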
Serving the Model
Now that all the necessary files are on the platform storage, you can mount them as a volume in your jobs. Run the following command:
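A minimal sketch of such a job, assuming the files were uploaded to storage:torchserve and are mounted under /opt/serve inside an official TorchServe image (the image tag and both paths are assumptions):

```bash
# Start a job with the uploaded files mounted read-write at /opt/serve
neuro run --name torchserve \
  --volume storage:torchserve:/opt/serve:rw \
  pytorch/torchserve:latest
```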
Now, run a bash terminal from within this job:
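Assuming the job was named torchserve as in the sketch above:

```bash
# Open an interactive shell inside the running job
neuro exec torchserve bash
```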
Create the model-store folder:
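For example, inside the mounted directory (assuming the /opt/serve mount point from the job above):

```bash
cd /opt/serve
mkdir -p model-store
```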
Run Torch Model Archiver on your model:
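A sketch of the archiver call for the DenseNet-161 example, assuming the layout of the TorchServe repository and the model-store folder created above:

```bash
# Package the model weights, model definition, and handler into densenet161.mar
torch-model-archiver --model-name densenet161 \
  --version 1.0 \
  --model-file examples/image_classifier/densenet_161/model.py \
  --serialized-file densenet161-8d451a50.pth \
  --export-path model-store \
  --extra-files examples/image_classifier/index_to_name.json \
  --handler image_classifier
```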
Before serving the model, run:
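One common preparatory step, assuming the image does not already ship them, is installing TorchServe and the dependencies of the image_classifier handler (this is an assumption, not a documented requirement of the platform image):

```bash
# Install TorchServe, the model archiver, and torchvision (used by the image_classifier handler)
pip install torchserve torch-model-archiver torchvision
```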
Now, you can serve your model. Here are the commands you can use for densenet161:
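A sketch of the serving command, assuming the densenet161.mar archive produced above is located in model-store:

```bash
# Start TorchServe with the DenseNet-161 archive; --ncs disables config snapshots
torchserve --start --ncs \
  --model-store model-store \
  --models densenet161.mar
```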
Accessing the Management API
To access the serving job's management API, you first need to port-forward the serving job:
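By default, TorchServe exposes its management API on port 8081; a sketch assuming the job is named torchserve:

```bash
# Forward the management API port from the job to the local machine
neuro port-forward torchserve 8081:8081

# In another terminal, list the registered models
curl http://localhost:8081/models
```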
After that, you can copy the densenet161 folder to your local machine:
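A sketch of such a copy via the platform CLI, assuming the model files live under storage:torchserve/model-store (the source path and local folder name are assumptions):

```bash
# Download the packaged DenseNet-161 model files from platform storage to a local folder
neuro cp -r storage:torchserve/model-store ./densenet161
```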