Error "An error occurred (ModelError) when calling the InvokeEndpoint operation" #4747
Error 500 when consuming SageMaker endpoint deployed with MLflow

Description

Hello everyone,

Steps to Reproduce

```python
import sagemaker
import boto3
import numpy as np

sagemaker_session = sagemaker.Session()
endpoint_name = 'endpoint_name'

# SageMaker Runtime client
client = boto3.client('sagemaker-runtime')

payload = bytes(np.array([1.0, 2.0, 3.0, 4.0]).reshape(1, -1))

response = client.invoke_endpoint(
    EndpointName=endpoint_name,
    ContentType='application/json',
    Body=payload
)
print(response)
```

Error

```
ModelError: An error occurred (ModelError) when calling the InvokeEndpoint operation:
Received server error (500) from primary with message
"<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2 Final//EN">
<title>500 Internal Server Error</title>
<h1>Internal Server Error</h1>
<p>The server encountered an internal error and was unable to complete your request.
Either the server is overloaded or there is an error in the application.</p>"
```

Questions

How can I consume an endpoint deployed from SageMaker with MLflow in another notebook or service (e.g., Lambda)?
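One likely cause of the 500 here is the payload format: the repro sends the raw bytes of a numpy array while declaring `ContentType='application/json'`, but an MLflow pyfunc scoring container expects an actual JSON body. A minimal sketch of building such a payload, assuming an MLflow 2.x scoring server that accepts the `"inputs"` JSON format (the live `invoke_endpoint` call is shown commented out, since it needs real AWS credentials and a deployed endpoint named `endpoint_name`):

```python
import json

import numpy as np


def build_mlflow_payload(arr: np.ndarray) -> str:
    # MLflow 2.x pyfunc scoring servers accept a JSON body with an
    # "inputs" key; "dataframe_split" is another documented format
    # for DataFrame-shaped inputs.
    return json.dumps({"inputs": arr.tolist()})


payload = build_mlflow_payload(np.array([1.0, 2.0, 3.0, 4.0]).reshape(1, -1))
print(payload)  # {"inputs": [[1.0, 2.0, 3.0, 4.0]]}

# Against a real endpoint (requires AWS credentials and a deployed endpoint):
# import boto3
# client = boto3.client("sagemaker-runtime")
# response = client.invoke_endpoint(
#     EndpointName="endpoint_name",
#     ContentType="application/json",
#     Body=payload,
# )
# print(response["Body"].read())
```

Which JSON shape the container accepts depends on the MLflow version baked into the image, so checking the endpoint's CloudWatch logs for the concrete parsing error is the quickest way to confirm.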
Replies: 1 comment
@sergioCancanEs Make sure your deployed endpoint is healthy, since SageMaker runs a health check before serving traffic, and make sure the IAM role attached to your Lambda function has the `sagemaker:InvokeEndpoint` permission.
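To illustrate the Lambda side of this advice, here is a hedged sketch of a handler that invokes a SageMaker endpoint; `ENDPOINT_NAME`, the `"inputs"` payload shape, and the optional `client` parameter (added so the handler can be exercised with a stub) are all assumptions, not part of the original question. The function's IAM role would need `sagemaker:InvokeEndpoint` on the endpoint's ARN:

```python
import json

# Placeholder name; replace with the endpoint deployed from MLflow.
ENDPOINT_NAME = "endpoint_name"


def handler(event, context, client=None):
    # Accepting the client as a parameter keeps the handler testable;
    # in a real Lambda, boto3.client("sagemaker-runtime") would be
    # created once at module load.
    if client is None:
        import boto3
        client = boto3.client("sagemaker-runtime")

    # Serialize the request as JSON, matching the declared ContentType.
    payload = json.dumps({"inputs": event["inputs"]})
    response = client.invoke_endpoint(
        EndpointName=ENDPOINT_NAME,
        ContentType="application/json",
        Body=payload,
    )
    # The response Body is a stream; read and decode the JSON result.
    return json.loads(response["Body"].read())
```

If the role lacks the permission, `invoke_endpoint` fails with an `AccessDeniedException` rather than the 500 above, which helps distinguish IAM problems from model-container problems.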