We are looking to have a Solr 4.9 setup where we have a very simple crawler wipe out and load up a "crawler" core, then trigger a copy of the data over to a "search" core when the crawl is done. The reason is that our crawler is VERY simple and doesn't track documents in a way that would be conducive to doing updates and deletes. Basically, the crawler will wipe out the entire "crawler" core, rip through about 50k documents (committing every 1,000 or so), and then trigger something to copy the data over to the other "search" core.
Assuming we would have to restart the Search core, how could this be made possible from the command line or from code?
Create a third core as a copy of the `search` core. Then use the `mergeindexes` command in CoreAdmin to merge the `crawler` core's index into that third core. After the merge finishes, swap the third core with the old `search` core. Then `UNLOAD` the swapped-out core (with `deleteInstanceDir=true` if you're confident you can permanently remove the old data). Something like: