In some code I'm writing for GAE, I need to periodically perform a GET on a URL on another system, in essence 'pinging' it, and I'm not terribly concerned whether the request fails, times out, or succeeds.
As I basically want to 'fire and forget' and not slow down my own code by waiting for the request, I'm using an asynchronous urlfetch, and not calling get_result().
In my log I get a warning:
Found 1 RPC request(s) without matching response (presumably due to timeouts or other errors)
Am I missing an obviously better way to do this? A Task Queue or Deferred Task seems (to me) like overkill in this instance.
Any input would be appreciated.
How long does it take for the async_url_fetch to complete and how long does it take to provide your response?
Here is a possible approach that leverages the way the API works in Python.
Some points to consider.
Many webservers and reverse proxies will not cancel a request once it has been started. So if the remote server you are pinging queues the request but takes a long time to service it, set a deadline via create_rpc(deadline=X) so that the RPC returns after X seconds due to the timeout. The ping may still succeed on the remote end. This technique works against App Engine itself as well.
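For instance (the 5-second value is only an illustration):

```python
from google.appengine.api import urlfetch

# The RPC gives up and returns after 5 seconds, even if the remote
# server is still working on (and may yet complete) the request.
rpc = urlfetch.create_rpc(deadline=5)
```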
GAE RPCs
To recap.
Prepare the long-running url_fetch with a reasonable deadline and a callback. Enqueue it using make_fetch_call. Do the work you wanted to do for the page. Return the page regardless of whether the url_fetch completed or hit its deadline, and without waiting for it.
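A minimal sketch of that flow, assuming a placeholder ping URL and a webapp2 handler:

```python
import logging

import webapp2
from google.appengine.api import urlfetch

PING_URL = 'http://other-system.example.com/ping'  # placeholder URL


def make_ping_callback(rpc):
    # The callback runs whenever the RPC machinery gets a chance to
    # dispatch this RPC's completion (e.g. a later wait() on some RPC).
    def ping_done():
        try:
            result = rpc.get_result()
            logging.info('ping returned %s', result.status_code)
        except urlfetch.Error:
            logging.info('ping failed or hit its deadline')
    return ping_done


class PageHandler(webapp2.RequestHandler):
    def get(self):
        # Fire and forget: short deadline, and we never call get_result() ourselves.
        rpc = urlfetch.create_rpc(deadline=5)
        rpc.callback = make_ping_callback(rpc)
        urlfetch.make_fetch_call(rpc, PING_URL)

        # ... do the real work for this page ...
        self.response.write('done')
```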
The underlying RPC layer in GAE is all asynchronous; a more sophisticated way to choose what you wish to wait on seems to be in the works.
These examples use sleep and a url_fetch to a second instance of the same app.
Example of wait() dispatching rpc work:
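A rough sketch of that kind of test (the second-instance URL and the 4-second sleep are placeholder values):

```python
import logging
import time

from google.appengine.api import urlfetch

rpc = urlfetch.create_rpc(deadline=10)
rpc.callback = lambda: logging.info('Async dispatched call.')
urlfetch.make_fetch_call(rpc, 'http://second-instance.example.com/slow')

time.sleep(4)  # stand-in for the real work done for the page
logging.info('Wait called after sleeping for 4 seconds')
rpc.wait()     # the pending callback is dispatched here
```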
Wait called after sleeping for 4 seconds shows dispatch of 'Async dispatched call.' in the log.
Showing how a memcache RPC's wait() can kick off the work.
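A sketch along the same lines, with the same placeholder URL; here a plain synchronous memcache.get() is what ends up waiting on an RPC and dispatching the pending url_fetch callback:

```python
import logging

from google.appengine.api import memcache, urlfetch

rpc = urlfetch.create_rpc(deadline=10)
rpc.callback = lambda: logging.info(
    'Async url fetch dispatched when memcache.get calls wait()')
urlfetch.make_fetch_call(rpc, 'http://second-instance.example.com/slow')

# memcache.get() waits on its own RPC; that wait also gives the
# completed url_fetch callback a chance to run.
value = memcache.get('some_key')
```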
Appengine Prod Log:
Async url fetch dispatched when memcache.get calls wait()
A task queue task is your best option here. The message you're seeing in the log indicates that the request is waiting for your URLFetch to complete before returning, so firing it off asynchronously and ignoring the result isn't actually helping. You say a task is 'overkill', but really, tasks are very lightweight and definitely the best way to do this. The deferred library will even let you defer the fetch call directly, rather than having to write a function for the task to call.
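For example, deferring the fetch directly is a one-liner (sketch; the URL and deadline are placeholders):

```python
from google.appengine.api import urlfetch
from google.appengine.ext import deferred

# Runs the fetch in a task queue task, completely outside the user-facing request.
deferred.defer(urlfetch.fetch, 'http://other-system.example.com/ping', deadline=5)
```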