TensorFlow predict via gRPC not working, but RESTful API works

Published 2020-07-27 04:30

Question:

When I try to execute the client code below I get an error, but the call succeeds via the RESTful API endpoint:
curl -d '{"signature_name":"predict_output","instances":[2.0,9.27]}' -X POST http://10.110.110.13:8501/v1/models/firstmodel:predict

Could you please correct the code below?

import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc
import numpy as np
import grpc
server = '10.110.110.13:8501'
channel = grpc.insecure_channel(server)
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)
request = predict_pb2.PredictRequest()
request.model_spec.name = 'firstmodel'
request.model_spec.signature_name = 'predict_output'
request.inputs['input_x'].CopyFrom(tf.contrib.util.make_tensor_proto([12.0], shape=[1]))
result_future = stub.Predict(request,40.)
print(result_future.outputs['output_y'])

I got the error message below:

_Rendezvous: <_Rendezvous of RPC that terminated with:
status = StatusCode.UNAVAILABLE
details = "Trying to connect an http1.x server"
debug_error_string = "{"created":"@1545248014.367000000","description":"Error received from peer",
    "file":"src/core/lib/surface/call.cc","file_line":1083,"grpc_message":"Trying to connect an http1.x server","grpc_status":14}"

Below is the composed request information for your reference

model_spec {
  name: "firstmodel"
  signature_name: "predict_output"
}
inputs {
  key: "input_x"
  value {
    dtype: DT_FLOAT
    tensor_shape {
      dim {
        size: 1
      }
    }
    float_val: 12.0
  }
}

Answer 1:

The gRPC port and the HTTP port are different. Since your HTTP (REST) service is listening on 8501, your gRPC service must use a different port. The default is 8500, but you can change it with the --port argument when you start your TensorFlow Serving server.

docker run -p 8500:8500 --mount type=bind,source=/root/serving/Ser_Model,target=/models/firstmodel -e MODEL_NAME=firstmodel -t tensorflow/serving
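
For reference, here is a minimal sketch of the corrected client. It reuses the server address, model name, signature name, and tensor keys from the question, and assumes the gRPC port is published as 8500 as in the docker command above; tf.make_tensor_proto is used in place of tf.contrib.util.make_tensor_proto, which is needed on TF 2.x.

import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# Connect to the gRPC port (8500), not the 8501 REST port.
server = '10.110.110.13:8500'
channel = grpc.insecure_channel(server)
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Build the same PredictRequest as in the question.
request = predict_pb2.PredictRequest()
request.model_spec.name = 'firstmodel'
request.model_spec.signature_name = 'predict_output'
request.inputs['input_x'].CopyFrom(tf.make_tensor_proto([12.0], shape=[1]))

# Blocking call; the second argument is the timeout in seconds.
result = stub.Predict(request, 40.0)
print(result.outputs['output_y'])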