
Ingesting data into Azure ML model deployment - Stack Overflow


I am building an ML pipeline in Azure but it is failing when trying to invoke the endpoint with my model. The error reads:

Error Code: ScriptExecution.StreamAccess.NotFound
Native Error: error in streaming from input data sources
    StreamError(NotFound)
=> stream not found
    NotFound
Error Message: The requested stream was not found. Please make sure the request uri is correct.

The first stage of the pipeline trains the model and works fine. Three parameters are defined in the pipeline YAML and passed to the Python script: --input_folder (uri_folder: the blob storage location of the dataset), --config_file (the parameters used by the Python script, including the name of the dataset file), and --output_folder (uri_folder: where artifacts are saved in blob storage). Again, there are no problems training the model and uploading it to the registry.
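For context, the training script presumably reads those three parameters with something like the argparse sketch below. This is a hypothetical reconstruction (the argument names come from the question; the paths in the usage example are placeholders, not real mount points):

```python
import argparse

def parse_args(argv=None):
    # Hypothetical sketch of the training step's argument parsing.
    # The three names mirror the pipeline YAML described above.
    parser = argparse.ArgumentParser(description="Training step arguments")
    parser.add_argument("--input_folder", required=True,
                        help="uri_folder: blob storage location of the dataset")
    parser.add_argument("--config_file", required=True,
                        help="config asset with script parameters, incl. dataset file name")
    parser.add_argument("--output_folder", required=True,
                        help="uri_folder: where artifacts are written")
    return parser.parse_args(argv)

if __name__ == "__main__":
    # Placeholder paths for illustration only.
    args = parse_args([
        "--input_folder", "/mnt/data/my_folder",
        "--config_file", "/mnt/config/config.json",
        "--output_folder", "/mnt/outputs",
    ])
    print(args.input_folder)
```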

However, things are a little fuzzy when it comes to passing the parameters while invoking the model for testing.

The error suggests a possible permission issue, but I verified the UAMI has Storage Blob Data Contributor role for each storage account.

The CLI command takes --input as an argument, and I pass it a JSON file containing the three parameters needed for the script:

{
    "input_data": {
        "uri_folder": "azureml://subscriptions/.../resourcegroups/.../workspaces/.../datastores/my_datastore/paths/my_folder/"
    },
    "parameters": {
        "input_folder": "azureml://subscriptions/.../resourcegroups/.../workspaces/.../datastores/my_datastore/paths/my_folder/",
        "output_folder": "azureml://subscriptions/.../resourcegroups/.../workspaces/.../datastores/my_store/paths/output/",
        "config_file": "azureml:config_file@latest"
    }
}
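Before invoking, it can help to sanity-check the request file's shape locally. The sketch below is a hypothetical validator: the field names mirror the JSON above, but whether the deployed scoring script actually expects this structure depends entirely on how its run() function parses the payload, which the error does not reveal. The `azureml://` prefix check reflects the datastore-URI format used above.

```python
# Hypothetical sanity check for the request file passed via --input.
# Field names mirror the JSON shown above; the scoring script's run()
# must actually read these keys for the request to work.
REQUIRED_PARAMS = {"input_folder", "output_folder", "config_file"}

def validate_request(payload: dict) -> list:
    """Return a list of problems; an empty list means the shape looks OK."""
    problems = []
    if "input_data" not in payload:
        problems.append("missing top-level 'input_data'")
    params = payload.get("parameters", {})
    missing = REQUIRED_PARAMS - params.keys()
    if missing:
        problems.append(f"missing parameters: {sorted(missing)}")
    for key, value in params.items():
        # config_file is an asset reference (azureml:name@version),
        # the folders should be azureml:// datastore URIs.
        if key != "config_file" and not value.startswith("azureml://"):
            problems.append(f"'{key}' is not an azureml:// datastore URI")
    return problems

if __name__ == "__main__":
    request = {
        "input_data": {"uri_folder": "azureml://subscriptions/x/paths/my_folder/"},
        "parameters": {
            "input_folder": "azureml://subscriptions/x/paths/my_folder/",
            "output_folder": "azureml://subscriptions/x/paths/output/",
            "config_file": "azureml:config_file@latest",
        },
    }
    print(validate_request(request))  # [] when the shape matches
```

A check like this will not catch the NotFound itself (the path could be well-formed but point at a folder that does not exist on that datastore), but it rules out a malformed request file as the cause.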