I am using Lucene to show search results in a web application, and I am also implementing custom paging for those results. Result sets can range from 5,000 to 10,000 documents or more. Can someone please tell me the best strategy for paging and caching the search results?
I would recommend you don't cache the results, at least not at the application level. Running Lucene on a box with lots of memory that the operating system can use for its file cache will help though.
Just repeat the search with a different offset for each page. Caching introduces statefulness that, in the end, undermines performance. We have hundreds of concurrent users searching an index of over 40 million documents. Searches complete in much less than one second without using explicit caching.
Using the Hits object returned from search, you can access the documents for a page like this:
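A minimal sketch, assuming the classic Hits API (Lucene 2.x) and a hypothetical stored field named "title":

```java
import java.io.IOException;
import org.apache.lucene.document.Document;
import org.apache.lucene.search.Hits;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.Query;

public class Pager {
    // Print one page of results; page is zero-based.
    static void showPage(IndexSearcher searcher, Query query,
                         int page, int pageSize) throws IOException {
        // Re-run the search for every page request; the OS file cache
        // keeps repeated searches fast, so no application-level cache is needed.
        Hits hits = searcher.search(query);

        int start = page * pageSize;
        int end = Math.min(start + pageSize, hits.length());

        for (int i = start; i < end; i++) {
            // Hits loads Document objects lazily, so only the
            // documents on this page are actually fetched.
            Document doc = hits.doc(i);
            System.out.println(doc.get("title"));
        }
    }
}
```

Note that Hits was deprecated in Lucene 2.9 and removed in 3.0; in later versions the equivalent is searcher.search(query, end), which returns a TopDocs whose scoreDocs array you slice the same way.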