I am building an ML pipeline in Azure, but it fails when I try to invoke the endpoint with my model. The error reads:
Error Code: ScriptExecution.StreamAccess.NotFound
Native Error: error in streaming from input data sources
StreamError(NotFound)
=> stream not found
NotFound
Error Message: The requested stream was not found. Please make sure the request uri is correct.
The first stage of the pipeline trains the model and works fine. Three parameters are defined in the pipeline YAML and passed to the Python script: --input_folder (uri_folder: blob storage location of the dataset), --config_file (parameters used by the Python script, including the name of the dataset file), and --output_folder (uri_folder: where artifacts are saved in blob storage). Again, there are no problems training the model and uploading it to the registry.
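For reference, the script reads these arguments roughly like this (a minimal sketch; only the three argument names come from my pipeline, everything else is illustrative):

# Minimal sketch of the script's argument handling; only the three
# argument names are real, the rest is illustrative.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--input_folder", type=str)   # uri_folder: blob storage location of the dataset
parser.add_argument("--config_file", type=str)    # config with parameters, including the dataset file name
parser.add_argument("--output_folder", type=str)  # uri_folder: where artifacts are saved in blob storage
args = parser.parse_args()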
However, I am less clear on how to pass these parameters when invoking the endpoint to test the model.
The error suggests a possible permissions issue, but I verified that the UAMI has the Storage Blob Data Contributor role on each storage account.
The CLI command takes --input as an argument, and I pass a JSON file containing the three parameters the script needs:
{
  "input_data": {
    "uri_folder": "azureml://subscriptions/.../resourcegroups/.../workspaces/.../datastores/my_datastore/paths/my_folder/"
  },
  "parameters": {
    "input_folder": "azureml://subscriptions/.../resourcegroups/.../workspaces/.../datastores/my_datastore/paths/my_folder/",
    "output_folder": "azureml://subscriptions/.../resourcegroups/.../workspaces/.../datastores/my_store/paths/output/",
    "config_file": "azureml:config_file@latest"
  }
}
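The invoke call itself is along these lines (resource group, workspace, endpoint, and file names are placeholders; I'm showing the batch-endpoint form here since that is the az ml invoke variant that takes --input):

az ml batch-endpoint invoke \
  --resource-group my-rg \
  --workspace-name my-workspace \
  --name my-endpoint \
  --input invoke_parameters.json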