I have a 150 MB XML file that is used as the database in my project. Currently I'm using XmlReader to read content from it. I want to know whether it is better to use XmlReader or LINQ to XML for this scenario.
Note that I'm searching for an item in this XML and displaying the search result, so a query can take a long time or just a moment.
If you want performance, use XmlReader. It doesn't read the whole file and build a DOM tree in memory; instead, it streams the file from disk and hands you each node it finds along the way.
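For example, a forward-only search with XmlReader might look like this (a minimal sketch; the file name, the <item> element, and the id attribute are placeholder assumptions about your schema):

    using System;
    using System.Xml;

    class Search
    {
        static void Main()
        {
            // Forward-only, read-only cursor: only the current node is kept in memory.
            using (XmlReader reader = XmlReader.Create("data.xml"))
            {
                while (reader.Read())
                {
                    // Hypothetical schema: repeated <item> elements with an "id" attribute.
                    if (reader.NodeType == XmlNodeType.Element && reader.Name == "item")
                    {
                        if (reader.GetAttribute("id") == "42")
                        {
                            Console.WriteLine(reader.ReadOuterXml());
                            break; // stop as soon as the match is found
                        }
                    }
                }
            }
        }
    }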
A quick Google search turned up a performance comparison of XmlReader, LINQ to XML, and XDocument.Load:
https://web.archive.org/web/20130517114458/http://www.nearinfinity.com/blogs/joe_ferner/performance_linq_to_sql_vs.html
I would personally look at using LINQ to XML with the streaming techniques outlined in the Microsoft documentation: http://msdn.microsoft.com/en-us/library/system.xml.linq.xstreamingelement.aspx#Y1392
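One streaming pattern used there wraps an XmlReader in an iterator that yields one XElement at a time, so you can run LINQ queries without loading the whole document. A rough sketch, assuming a data.xml file with repeated <item> elements (placeholder names, not your actual schema):

    using System;
    using System.Collections.Generic;
    using System.Linq;
    using System.Xml;
    using System.Xml.Linq;

    static class StreamingSearch
    {
        // Yields each <item> as a fully built XElement, but keeps only one in memory at a time.
        static IEnumerable<XElement> StreamItems(string path)
        {
            using (XmlReader reader = XmlReader.Create(path))
            {
                reader.MoveToContent();
                while (!reader.EOF)
                {
                    if (reader.NodeType == XmlNodeType.Element && reader.Name == "item")
                    {
                        // ReadFrom materializes the element and leaves the reader positioned
                        // on the node that follows it, so don't call Read() again here.
                        yield return (XElement)XNode.ReadFrom(reader);
                    }
                    else
                    {
                        reader.Read();
                    }
                }
            }
        }

        static void Main()
        {
            // LINQ query over the streamed elements.
            XElement match = StreamItems("data.xml")
                .FirstOrDefault(e => (string)e.Attribute("id") == "42");

            Console.WriteLine(match != null ? match.ToString() : "not found");
        }
    }

This gives you XElement/LINQ convenience per item while keeping memory usage close to the plain XmlReader approach.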
Here's a quick benchmark test reading from a 200 MB XML file with a simple filter:
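A minimal sketch of that kind of comparison, assuming a data.xml file with repeated <item> elements and a filter on a category attribute (placeholder names and filter, not the original test data):

    using System;
    using System.Diagnostics;
    using System.Linq;
    using System.Xml;
    using System.Xml.Linq;

    class Benchmark
    {
        static void Main()
        {
            Measure("XDocument.Load + LINQ", () =>
            {
                // Loads the entire document into memory, then queries it.
                XDocument doc = XDocument.Load("data.xml");
                int count = doc.Descendants("item")
                               .Count(e => (string)e.Attribute("category") == "books");
                Console.WriteLine("  matches: {0}", count);
            });

            Measure("XmlReader streaming scan", () =>
            {
                // Streams the file node by node; memory stays roughly constant.
                int count = 0;
                using (XmlReader reader = XmlReader.Create("data.xml"))
                {
                    while (reader.Read())
                    {
                        if (reader.NodeType == XmlNodeType.Element &&
                            reader.Name == "item" &&
                            reader.GetAttribute("category") == "books")
                        {
                            count++;
                        }
                    }
                }
                Console.WriteLine("  matches: {0}", count);
            });
        }

        static void Measure(string label, Action action)
        {
            long before = GC.GetTotalMemory(true);
            Stopwatch sw = Stopwatch.StartNew();
            action();
            sw.Stop();
            long after = GC.GetTotalMemory(false);

            // Rough, indicative numbers only; a memory profiler gives a much more accurate picture.
            Console.WriteLine("{0}: {1} ms, ~{2:N0} bytes retained", label, sw.ElapsedMilliseconds, after - before);
        }
    }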
Then compare the processing time and memory usage on your machine for each approach.
Write a few benchmark tests to establish exactly what the situation is for you, and take it from there... LINQ to XML introduces a lot of flexibility...