I've never tried this - so I don't know if I'd run into memory issues.
But can a SqlDataReader read a trillion records? It's all streamed, correct? I'm a little green on what the SQL/TDS protocol is doing under the covers.
UPDATE
Read "trillion" as "a very large number". I probably should have said something like 1 billion or 100 million.
Yes, that will stream... but I don't think you should actually try to do it.
If you could read a million records per second (which sounds unlikely to me), you'd still need about 12 days to read a trillion records... that's a lot of work to risk losing halfway through.
Now I realise you probably don't literally want to read a trillion records, but my point is that if you can split your "large amount" of work into logical batches anyway, that's probably a good idea, something like the sketch below.
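One way to do that (just a rough sketch; the connection string, the table/column names `dbo.HugeTable` / `Id` / `Payload` and the batch size are all made up for illustration) is keyset paging on an indexed, increasing key, so that if the job dies you can resume from the last key you processed instead of starting over:

```csharp
using System.Data.SqlClient;

class BatchedReadExample
{
    // Placeholder connection string - swap in your own.
    const string ConnectionString = "Server=.;Database=MyDb;Integrated Security=true;";

    static void Main()
    {
        long lastId = 0;        // persist this somewhere durable between batches
        bool moreRows = true;

        while (moreRows)
        {
            moreRows = false;

            using (var connection = new SqlConnection(ConnectionString))
            using (var command = new SqlCommand(
                "SELECT TOP (100000) Id, Payload FROM dbo.HugeTable WHERE Id > @lastId ORDER BY Id",
                connection))
            {
                command.Parameters.AddWithValue("@lastId", lastId);
                connection.Open();

                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        lastId = reader.GetInt64(0);
                        moreRows = true;
                        // process the row here; checkpoint lastId once the batch is done
                    }
                }
            }
        }
    }
}
```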
Yes, it might take a while (as long as your SQL isn't doing anything silly, like trying to take a snapshot), but if your server can stream it out, the SqlDataReader shouldn't have a memory usage problem: it only ever holds the current row.
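To show what I mean by "stream", here's a minimal sketch of a plain forward-only read (again, the connection string and table/column names are placeholders): the reader hands you rows one at a time off the wire, so memory use stays flat no matter how many rows the query returns.

```csharp
using System.Data.SqlClient;

class StreamingReadExample
{
    static void Main()
    {
        // Placeholder connection string - swap in your own.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=true;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT Id, Payload FROM dbo.HugeTable", connection))
        {
            connection.Open();
            command.CommandTimeout = 0; // no timeout; this query will run for a long time

            // SequentialAccess also streams large columns within a row
            // instead of buffering each whole row before handing it to you.
            using (var reader = command.ExecuteReader(System.Data.CommandBehavior.SequentialAccess))
            {
                while (reader.Read())
                {
                    long id = reader.GetInt64(0);
                    // process the row, then let it go; nothing accumulates in memory
                }
            }
        }
    }
}
```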