Given rowParser of type RowParser[Photo], this is how you would parse a list of rows coming from a table photo, according to the code samples I have seen so far:
def getPhotos(album: Album): List[Photo] = DB.withConnection { implicit c =>
  SQL("select * from photo where album = {album}").on(
    'album -> album.id
  ).as(rowParser *)
}
Where the * operator creates a parser of type ResultSetParser[List[Photo]]. Now, I was wondering if it was equally possible to get a parser that yields a Stream (thinking that being more lazy is always better), but I only came up with this:
def getPhotos(album: Album): Stream[Photo] = DB.withConnection { implicit c =>
  SQL("select * from photo where album = {album}").on(
    'album -> album.id
  )() collect (rowParser(_) match { case Success(photo) => photo })
}
It works, but it seems overly complicated. I could of course just call toStream on the List I get from the first function, but my goal was to only apply rowParser on rows that are actually read. Is there an easier way to achieve this?
EDIT: I know that limit should be used in the query if the number of rows of interest is known beforehand. I am also aware that, in many cases, you are going to use the whole result anyway, so being lazy will not improve performance. But there might be a case where you save a few cycles, e.g. if for some reason you have search criteria that you cannot or do not want to express in SQL. So I thought it was odd that, given the fact that anorm provides a way to obtain a Stream of SqlRow, I didn't find a straightforward way to apply a RowParser on that.
I ran into a similar situation, but hit a stack overflow error when the built-in anorm function for converting to a Stream attempted to parse the result set.
In order to get around this I elected to abandon the anorm ResultSetParser paradigm and fall back to the java.sql.ResultSet object.
I wanted to use anorm's internal classes for parsing result set rows, but, ever since version 2.4, they have made all of the pertinent classes and methods private to their package, and have deprecated several other methods that would have been more straightforward to use.
I used a combination of Promises and Futures to work around the ManagedResource that anorm now returns. I avoided all deprecated functions.
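A stripped-down sketch of that kind of helper (leaving out the Promise/Future plumbing and the ManagedResource handling, and using only the plain JDBC API) might look something like this:

import java.sql.ResultSet

// Sketch only: lazily expose a JDBC ResultSet as a Stream by applying a
// caller-supplied row mapper. Rows are pulled from the ResultSet only as the
// Stream is consumed, so the connection must stay open until you are done.
def resultSetToStream[T](rs: ResultSet)(rowMapper: ResultSet => T): Stream[T] =
  if (rs.next()) rowMapper(rs) #:: resultSetToStream(rs)(rowMapper)
  else Stream.empty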
A rather trivial usage of this function would be something like this:
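For example, assuming album.id is a Long and a Photo constructor with these (made-up) columns, the stream has to be consumed before the connection is closed:

DB.withConnection { c =>
  val stmt = c.prepareStatement("select * from photo where album = ?")
  stmt.setLong(1, album.id)
  val rs = stmt.executeQuery()
  val photos: Stream[Photo] = resultSetToStream(rs) { r =>
    Photo(r.getLong("id"), r.getString("filename")) // hypothetical columns
  }
  // Only the rows that are actually forced here get read from the ResultSet.
  photos.take(10).toList
}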
There are, of course, drawbacks to this approach; however, it got around my problem and did not require the inclusion of any other libraries.
I ended up creating my own stream method which corresponds to the list method:
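A rough sketch of such a stream method, written here against the Stream of rows that SQL(...)() yields rather than as a full ResultSetParser, could look like this:

import anorm.{ Error, Row, RowParser, Success }

// Sketch: parse a lazy Stream of rows with a RowParser. Only the first row is
// checked for a parse error; the remaining rows are assumed to parse the same
// way and are converted lazily as the Stream is consumed.
def stream[A](rows: Stream[Row])(parser: RowParser[A]): Stream[A] =
  rows.headOption.map(parser(_)) match {
    case None                 => Stream.empty            // empty result set
    case Some(Error(err))     => sys.error(err.toString) // first row did not parse
    case Some(Success(first)) =>
      first #:: rows.tail.map { r =>
        parser(r) match {
          case Success(a) => a
          case Error(err) => sys.error(err.toString)
        }
      }
  }

With the question's example this would be used as:

def getPhotos(album: Album): Stream[Photo] = DB.withConnection { implicit c =>
  stream(SQL("select * from photo where album = {album}").on('album -> album.id)())(rowParser)
}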
Note that the Play SqlResult can only be either Success/Error, while each row can also be Success/Error. I handle this for the first row only, assuming the rest will be the same. This may or may not work for you.

You're better off making smaller (paged) queries using limit and offset.
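For instance, staying with the question's example, a paged variant could look like this (the ordering column, the page/size parameters, and the limit/offset syntax are assumptions and vary by database):

// Sketch of a paged query in the style of the question's getPhotos.
def getPhotosPage(album: Album, page: Int, pageSize: Int): List[Photo] =
  DB.withConnection { implicit c =>
    SQL("""
      select * from photo
      where album = {album}
      order by id
      limit {pageSize} offset {offset}
    """).on(
      'album    -> album.id,
      'pageSize -> pageSize,
      'offset   -> page * pageSize
    ).as(rowParser *)
  }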
Anorm would need some modification if you're going to keep your (large) result around in memory and stream it from there. Then the other concern would be the new memory requirements for your JVM. And how would you deal with caching on the service level? See, previously you could easily cache something like photos?page=1&size=10, but now you just have photos, and the caching technology would have no idea what to do with the stream.

Even worse, and possibly at the JDBC level, would be wrapping Stream around limit-ed and offset-ed execute statements and just making multiple calls to the database behind the scenes; but this sounds like it would need a fair bit of work to port the Stream code that Scala generates to Java land (to work with Groovy, JRuby, etc.), and then get it approved for the JDBC 5 or 6 roadmap. This idea will probably be shunned as being too complicated, which it is.
You could wrap Stream around your entire DAO (where the limit and offset trickery would happen), but this almost sounds like more trouble than it's worth :-)
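As a sketch of that last idea, a Stream that lazily pulls pages through a DAO method like the hypothetical getPhotosPage above could be built along these lines:

// Sketch: a lazy Stream of photos backed by paged queries. Each page is only
// fetched once the Stream has been consumed that far.
def photoStream(album: Album, pageSize: Int = 50): Stream[Photo] = {
  def from(page: Int): Stream[Photo] = {
    val batch = getPhotosPage(album, page, pageSize)
    if (batch.size < pageSize) batch.toStream   // last (possibly partial) page
    else batch.toStream #::: from(page + 1)     // next page fetched lazily
  }
  from(0)
}

Each page runs in its own short-lived connection, so nothing has to stay open while the stream is being consumed.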