Producer-consumer model using TPL Tasks in .NET 4

Published 2019-04-17 00:14

Question:

I have a fairly large XML file (around 1–2 GB).

The requirement is to persist the XML data into a database. Currently this is achieved in three steps:

  1. Read the large file with as small a memory footprint as possible
  2. Create entities from the XML data (see the mapping sketch after this list)
  3. Store the data from the created entities into the database using SqlBulkCopy
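
For step 2, a minimal sketch of the mapping, assuming a hypothetical TitleEntity and that each <title> element carries id and name children (the real schema will differ):

    public class TitleEntity
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Map one streamed <title> element to an entity; the element shape is an assumption.
    private static TitleEntity ToEntity(XElement e)
    {
        return new TitleEntity
        {
            Id = (int)e.Element("id"),
            Name = (string)e.Element("name")
        };
    }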

To achieve better performance, I want to create a producer-consumer model where the producer builds batches of entities (say 10K each) and adds them to a queue, and the consumer takes batches off the queue and persists them to the database using SqlBulkCopy (a sketch of this shape follows the code below).

Thanks, Gokul

using System;
using System.Collections.Generic;
using System.IO;
using System.Xml;
using System.Xml.Linq;

void Main()
{
    int iCount = 0;
    string fileName = @"C:\Data\CatalogIndex.xml";

    DateTime startTime = DateTime.Now;
    Console.WriteLine("Start Time: {0}", startTime);
    FileInfo fi = new FileInfo(fileName);
    Console.WriteLine("File Size:{0} MB", fi.Length / 1048576.0);

    /* I want to change this loop into a producer-consumer pattern
       so that the data is processed in parallel. */
    foreach (var element in StreamElements(fileName, "title"))
    {
        iCount++;
    }

    Console.WriteLine("Count: {0}", iCount);
    Console.WriteLine("End Time: {0}, Time Taken:{1}", DateTime.Now, DateTime.Now - startTime);
}

// Streams matching elements one at a time so the whole document is never held in memory.
private static IEnumerable<XElement> StreamElements(string fileName, string elementName)
{
    using (var rdr = XmlReader.Create(fileName))
    {
        rdr.MoveToContent();
        while (!rdr.EOF)
        {
            if ((rdr.NodeType == XmlNodeType.Element) && (rdr.Name == elementName))
            {
                // ReadFrom materializes just this element and advances the reader past it.
                var e = XElement.ReadFrom(rdr) as XElement;
                yield return e;
            }
            else
            {
                rdr.Read();
            }
        }
    }
}
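
A minimal sketch of the batched producer-consumer shape described above, reusing StreamElements, assuming .NET 4 (BlockingCollection, Task.Factory.StartNew) and a hypothetical PersistBatch helper that performs the SqlBulkCopy insert:

    const int batchSize = 10000;

    // Bounded queue: the producer blocks instead of outrunning the database writer.
    var queue = new BlockingCollection<List<XElement>>(boundedCapacity: 4);

    Task producer = Task.Factory.StartNew(() =>
    {
        var batch = new List<XElement>(batchSize);
        foreach (var element in StreamElements(fileName, "title"))
        {
            batch.Add(element);
            if (batch.Count == batchSize)
            {
                queue.Add(batch);                 // blocks while the queue is full
                batch = new List<XElement>(batchSize);
            }
        }
        if (batch.Count > 0)
            queue.Add(batch);                     // flush the final partial batch
        queue.CompleteAdding();
    });

    Task consumer = Task.Factory.StartNew(() =>
    {
        // Completes once CompleteAdding has been called and the queue drains.
        foreach (var batch in queue.GetConsumingEnumerable())
            PersistBatch(batch);                  // hypothetical SqlBulkCopy step
    });

    Task.WaitAll(producer, consumer);

(This requires using System.Collections.Concurrent and System.Threading.Tasks.)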

Answer 1:

Is this what you are trying to do?

    void Main()
    {
        const int inputCollectionBufferSize = 1024;
        const int bulkInsertBufferCapacity = 100;
        const int bulkInsertConcurrency = 4;

        // bounded queue: the producer blocks instead of outrunning the consumers
        BlockingCollection<object> inputCollection = new BlockingCollection<object>(inputCollectionBufferSize);

        Task loadTask = Task.Factory.StartNew(() =>
        {
            foreach (object nextItem in ReadAllElements(...))
            {
                // this will potentially block if there are already enough items
                inputCollection.Add(nextItem);
            }

            // mark this collection as done
            inputCollection.CompleteAdding();
        });

        Action parseAction = () =>
        {
            List<object> bulkInsertBuffer = new List<object>(bulkInsertBufferCapacity);

            // GetConsumingEnumerable blocks until items arrive and
            // completes once CompleteAdding has been called
            foreach (object nextItem in inputCollection.GetConsumingEnumerable())
            {
                if (bulkInsertBuffer.Count == bulkInsertBufferCapacity)
                {
                    CommitBuffer(bulkInsertBuffer);
                    bulkInsertBuffer.Clear();
                }

                bulkInsertBuffer.Add(nextItem);
            }

            // flush whatever is left once the producer has finished
            if (bulkInsertBuffer.Count > 0)
            {
                CommitBuffer(bulkInsertBuffer);
            }
        };

        List<Task> parseTasks = new List<Task>(bulkInsertConcurrency);

        for (int i = 0; i < bulkInsertConcurrency; i++)
        {
            parseTasks.Add(Task.Factory.StartNew(parseAction));
        }

        // wait before exiting
        loadTask.Wait();
        Task.WaitAll(parseTasks.ToArray());
    }
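
CommitBuffer is left undefined above; here is a minimal sketch using SqlBulkCopy, assuming the hypothetical TitleEntity from the question (and typing the buffer to the entity instead of object), a dbo.Titles table with matching columns, and a placeholder connection string:

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    static void CommitBuffer(List<TitleEntity> buffer)
    {
        // Shape the batch as a DataTable so SqlBulkCopy can stream it to the server.
        var table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        foreach (var entity in buffer)
        {
            table.Rows.Add(entity.Id, entity.Name);
        }

        using (var connection = new SqlConnection("...connection string..."))
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            connection.Open();
            bulkCopy.DestinationTableName = "dbo.Titles";  // assumed table name
            bulkCopy.WriteToServer(table);
        }
    }

Note that this answer queues individual items and batches them on the consumer side; queueing ready-made batches (as the question proposes, and as sketched after the question's code) works just as well and touches the shared collection far less often.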