I am writing a client to consume a REST service, and I have to read an entity out of the response. I am totally confused about which of the two methods (getEntity vs readEntity) should be used.
I have to retrieve the entity whenever I get a WebApplicationException, so my code looks more or less like this:
catch (WebApplicationException ex) {
    // Do something with ex.getResponse()
}
From what I have tested:
ex.getResponse().hasEntity() ---> true
ex.getResponse().getEntity() == null ---> true
I don't understand how this works: if the first expression is true, how can the second one also be true?
Surprisingly, readEntity worked fine for me and I was able to read the entity out of the response. Moreover, after reading the entity through readEntity, this check gives false:
ex.getResponse().getEntity() == null ---> false
Can someone help me understand what is really happening behind the scenes?
The Response class has two uses: server side and client side. On the server side, it's called the outbound response; on the client side, it's the inbound response.
The getEntity() method is meant to be used on the server side, because there you want to get the entity object. It's not much use to us as clients, but Jersey will use it to get the entity object so it can serialize it before sending the response out.
The readEntity() method is to be used on the client side, because there you are trying to read the entity stream. If you try to call it on the server side, you will get an error saying that you can't read the stream on an outbound response.
As for the behavior you're experiencing, I can't really explain why they implemented it like this.
This behaviour is documented in the API: the first call to ex.getResponse().getEntity() returns null because the entity hasn't been read yet. After calling readEntity(), the parsed entity is what getEntity() then returns.
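That sequence can be sketched as follows (a hedged illustration of the behaviour described above, wrapped in a hypothetical handler method; the comments reflect the values you reported):

```java
import javax.ws.rs.WebApplicationException;
import javax.ws.rs.core.Response;

public class EntityReadOrder {
    // Hypothetical handler: 'ex' carries an inbound client response.
    static void inspect(WebApplicationException ex) {
        Response response = ex.getResponse();

        boolean has = response.hasEntity();   // true: the response has a body
        Object before = response.getEntity(); // null here: nothing parsed yet

        // readEntity() consumes the stream and caches the parsed result.
        String body = response.readEntity(String.class);

        Object after = response.getEntity();  // now the parsed entity (body)
        System.out.println(has + " / " + (before == null) + " / " + (after == body));
    }
}
```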