Publish, export and unpublish a Custom Vision model iteration
Usage
publish_model(model, name, prediction_resource)
unpublish_model(model, confirm = TRUE)
export_model(model, format, destfile = basename(httr::parse_url(dl_link)$path))
list_model_exports(model)
Arguments
model: A Custom Vision model iteration object.
name: For publish_model, the name to assign to the published model on the prediction endpoint.
prediction_resource: For publish_model, the Custom Vision prediction resource to publish to. This can either be a string containing the Azure resource ID, or an AzureRMR resource object.
confirm: For unpublish_model, whether to ask for confirmation first.
format: For export_model, the format to export to. See below for supported formats.
destfile: For export_model, the destination file for downloading. Set this to NULL to skip downloading.
Returns
export_model returns the URL of the exported file. If the file was downloaded, the URL is returned invisibly.
list_model_exports returns a data frame detailing the formats the current model has been exported to, along with their download URLs.
Details
Publishing a model makes it available to clients as a predictive service. Exporting a model serialises it to a file of the given format in Azure storage, which can then be downloaded. Each iteration of the model can be published or exported separately.
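As a rough sketch of the publishing workflow, the code below publishes the latest iteration of a project and then unpublishes it. The endpoint URL, key, project name and prediction resource ID are placeholders, not values from this page, and the iteration is assumed to have already been trained.

library(AzureVision)

# hypothetical training endpoint and project; substitute your own values
endp <- customvision_training_endpoint(
    url="https://westus2.api.cognitive.microsoft.com",
    key="training_key"
)
proj <- get_project(endp, "myproject")

# calling get_model with no iteration argument retrieves the most recently trained iteration
mod <- get_model(proj)

# publish this iteration as "mymodel" to a prediction resource, identified here by a
# placeholder Azure resource ID string
publish_model(mod, "mymodel",
    prediction_resource="/subscriptions/{sub_id}/resourceGroups/myrg/providers/Microsoft.CognitiveServices/accounts/mycustvis_pred")

# remove it from the prediction endpoint when no longer needed
unpublish_model(mod, confirm=FALSE)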
The format argument to export_model can be one of the following; a short export example follows the list. Note that exporting a model requires that the project was created with support for it.
"onnx": ONNX 1.2
"coreml": CoreML, for iOS 11 devices
"tensorflow": TensorFlow
"tensorflow lite": TensorFlow Lite for Android devices
"linux docker", "windows docker", "arm docker": A Docker image for the given platform (Raspberry Pi 3 in the case of ARM)