When to definitely use Solr over Lucene in a Sitecore build

Published 2019-03-17 12:15

Question:

My client does not have the budget to set up and maintain a Solr server in their production environment. If I understand the Sitecore 7 Content Search API correctly, it is not a big deal to configure things to use Lucene instead. For the most part the configuration will be similar, the code will be the same, and a Solr server can be swapped in later.

The site build has

  • a faceted search page
  • listing components on landing pages and other pages that will leverage the Content Search API
  • buckets with custom facets

The site has around 5,000 pages and components not including media library items. Are there any concerns about simply using Lucene?

The main question is: when, during the architecture or design phase, do you know that you should definitely choose Solr over Lucene? What are the major signs that lead you to recommend it?

Answer 1:

I think if you are dealing with a customer on a limited budget then Lucene will work perfectly well, and will perform excellently at the scale you describe. All the things you mention are fully supported by the Lucene implementation.

In a Sitecore scenario I would begin to consider Solr if:

  • You need to index a large number of items - I'd say 50,000 upwards. Lucene copes with numbers of this order, but Solr has improved query caching and is designed for very large numbers of items.
  • The resilience of the search tier is of maximum business importance (i.e. the site is purely driven by search) - Solr provides a more robust replication/sharding and failover system with SolrCloud.
  • Re-purposing the search tier in other (non-Sitecore) applications is important - Solr is a standalone search application, so it can be accessed over HTTP with XML/JSON etc., which makes integration with external systems easier.
  • You need some specific additional feature of Solr that Lucene doesn't have.
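On the re-purposing point, here is a minimal sketch of what that HTTP access looks like from any external system, using only the Python standard library. The host and core name are assumptions for illustration, not something from the question:

```python
from urllib.parse import urlencode

# Hypothetical host and core name - adjust for your own Solr deployment.
SOLR_CORE = "http://localhost:8983/solr/sitecore_web_index"

def build_select_url(query, rows=10):
    """Build a standard Solr /select URL that returns JSON."""
    params = urlencode({"q": query, "rows": rows, "wt": "json"})
    return f"{SOLR_CORE}/select?{params}"

url = build_select_url("title:sitecore")
# Any system that can issue an HTTP GET (curl, a scheduled job, another
# application) can consume the JSON response - no Sitecore API required.
```

With Lucene embedded in the Sitecore worker process there is no equivalent endpoint you can point an external system at.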

...but, as you say, if you want to swap out Lucene for Solr at a later phase, we have worked hard to make that process as simple as possible. A few points worth noting here:

  • While your LINQ queries will stay the same, your configuration will be slightly different and will need attention to port across.
  • Understanding how Solr works as an application, and how its schema works, is important, but there are some great books and a wealth of knowledge out there.
  • Solr has slightly different (newer) analyzers and scoring mechanisms, so your search results may be slightly different (sometimes customers can get alarmed by this :P)
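On the configuration point: Sitecore only loads files whose names end in `.config` from `App_Config/Include`, so flipping between providers usually comes down to renaming patch files. A rough sketch of that rename rule - the file names here are illustrative, check what your Sitecore version actually ships:

```python
# Sketch of the rename rule used when flipping search providers.
# File names are illustrative; real ones vary by Sitecore version.
def toggle_for_solr(filename):
    """Return the new name for a config patch file when switching to Solr:
    Lucene patches get parked with a .disabled suffix, Solr patches lose it."""
    if "ContentSearch.Lucene" in filename and filename.endswith(".config"):
        return filename + ".disabled"          # stop Sitecore loading this patch
    if "ContentSearch.Solr" in filename and filename.endswith(".config.disabled"):
        return filename[: -len(".disabled")]   # start loading this patch
    return filename                            # everything else is untouched

new_name = toggle_for_solr("Sitecore.ContentSearch.Lucene.Index.Web.config")
```

The actual porting work is then reviewing the Solr patch files and the managed schema, not rewriting queries.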

...but I think these are things you can build up to over time and assess with the customer. I'm sure there are more points here and others can chime in if they think of them. Hope this helps :)



Answer 2:

Stephen pretty much covered the question - but I just wanted to add another scenario. You need to take the server setup of your production environment into account. If you are going to be using multiple content delivery servers behind a load balancer, I would consider Solr from the start, as trying to keep the Lucene index on each delivery server synchronized 100% of the time can be painful.



Answer 3:

I would recommend planning an escape route from Lucene as soon as you start thinking about multiple CDs, and here is why:

A) Each server has to maintain its own index copy:

  1. Any unexpected restart might cause a few documents not to be added to the index on one box, making the indexes differ from server to server. That can lead to the same page rendering differently on different CDs.
  2. Each server must perform its own index updates, consuming CPU and disk space; response rates drop while indexing runs after a publish =/
  3. According to the security guide, CDs should have the Sitecore Shell UI removed, so indexes cannot be easily rebuilt from the Control Panel =\

B) Lucene is not designed for large volumes of content. Each search operation does roughly the following:

  1. Create an array whose size equals the total number of documents in the index
  2. If a document matches the search, set its flag in the array

While this works like a charm for small indexes (~10K documents), performance degrades badly once the volume of content grows.
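A toy model of that matching pass (plain Python, not Lucene's actual code) makes the cost visible: the allocation is proportional to the whole index, not to the number of hits:

```python
def search(index_docs, predicate):
    """Toy model of the pass described above: allocate one flag per
    document in the index, then mark the documents that match."""
    flags = bytearray(len(index_docs))   # sized by the WHOLE index
    for i, doc in enumerate(index_docs):
        if predicate(doc):
            flags[i] = 1
    return [index_docs[i] for i, f in enumerate(flags) if f]

docs = [f"page-{n}" for n in range(100_000)]
hits = search(docs, lambda d: d.endswith("42"))
# flags is 100,000 bytes even though only 1,000 documents match.
```

Every search pays for the full index size up front, regardless of how selective the query is.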

The allocated array ends up in the Large Object Heap, which is not compacted by default and therefore fragments quickly.

Scenario:

  1. Perform search for 100K documents -> huge array created in memory

  2. Perform one more search in another thread -> one more huge array created

  3. Update index -> now 100K + 10 documents

  4. The first search completes; the LOH now has a free hole the size of a 100K array

  5. Search is triggered again -> a 100K+10 array must be created; the freed memory 'hole' is not large enough, so more RAM is requested.

  6. w3wp.exe process keeps on consuming more and more RAM

This is a common case for Analytics aggregation, as the index is populated by multiple threads at once; after a while you will see the processing instance consuming a lot of RAM.
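The scenario above can be sketched with a crude first-fit model of a non-compacting heap. This is a deliberate simplification of the real .NET LOH, purely to illustrate why a growing index defeats hole reuse:

```python
def simulate(allocation_sizes):
    """Crude model of a non-compacting heap: a freed hole is reused only
    if the next allocation fits inside it (first fit); otherwise the heap
    grows. The previous search's array is freed after the next one starts,
    mimicking overlapping searches."""
    holes, live, heap_growth = [], [], 0
    for size in allocation_sizes:
        for i, hole in enumerate(holes):
            if size <= hole:
                holes.pop(i)           # reuse an old hole
                break
        else:
            heap_growth += size        # nothing fits: ask the OS for more
        live.append(size)
        if len(live) > 1:
            holes.append(live.pop(0))  # previous array becomes a hole
    return heap_growth

# Index grows by 10 docs between searches, so every new match array is
# just too big for the hole the previous one left behind.
growing = simulate([100_000, 100_010, 100_020, 100_030])  # heap grows every time
stable  = simulate([100_000, 100_000, 100_000, 100_000])  # holes get reused
```

With constant-sized arrays the holes get reused and memory plateaus; while the index keeps growing, every search adds fresh heap growth, which matches the w3wp.exe behaviour described above.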

C) The last Lucene.NET release was five years ago, whereas Solr is actively developed.

The sooner you make the switch to Solr, the easier it will be.