Fetching Models

Where it All Begins
The very first step in using machine learning in your app is fetching a model. NatML supports fetching models from two sources: the NatML Hub catalog and local model files.

Fetching from Hub

NatML Hub is a platform for managing and deploying ML models.
(Figure: The NatML predictor catalog.)
NatML Hub provides a predictor catalog from which models can be fetched:
// Create an edge model
var model = await MLEdgeModel.Create("@natsuite/yolox");
You will need a NatML access key to fetch models from Hub. See this guide for how to get your access key.
When you upload your model to NatML Hub, it is automatically converted to CoreML, ONNX, and TensorFlow Lite, making it cross-platform.
Models fetched from NatML Hub are cached on-device, so your users only have to download a given model once.
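As a minimal sketch, fetching a model from Hub inside a Unity component might look like the following. The `NatML` namespace and the MonoBehaviour scaffolding are assumptions for illustration; consult the MLEdgeModel API reference for the exact surface:

```csharp
using UnityEngine;
using NatML; // assumed namespace for the NatML Unity package

public class HubModelExample : MonoBehaviour {

    async void Start () {
        // Fetch the YOLOX model from NatML Hub.
        // NatML reads your access key from the project settings.
        var model = await MLEdgeModel.Create("@natsuite/yolox");
        // On subsequent runs, the cached model is loaded from disk
        // instead of being downloaded again.
        Debug.Log($"Created model: {model}");
    }
}
```

Because `Create` is awaitable, the first call can perform the network download without blocking the main thread.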

Using Model Files

NatML supports CoreML (.mlmodel), ONNX (.onnx), and TensorFlow Lite (.tflite) model files. Simply drag and drop a model file into your Unity project; it is imported as an MLModelData instance.
(Figure: Dropping a CoreML model into Unity.)
There are restrictions on which model formats can be used on each platform. See the docs for more info.
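As a sketch, an imported model file can then be used from script. The `MLEdgeModel.Create(MLModelData)` overload shown here is an assumption based on the import behavior described above; check the MLEdgeModel documentation for the exact overloads:

```csharp
using UnityEngine;
using NatML; // assumed namespace for the NatML Unity package

public class FileModelExample : MonoBehaviour {

    // Assign the imported .mlmodel, .onnx, or .tflite asset
    // to this field in the Unity Inspector.
    [SerializeField] MLModelData modelData;

    async void Start () {
        // Create an edge model from the imported model data.
        var model = await MLEdgeModel.Create(modelData);
        Debug.Log($"Created model: {model}");
    }
}
```

Referencing the asset through a serialized field lets Unity include the model file in builds automatically.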