I'm working on a Python app that needs to scale to about 150 writes per second, spread across about 50 different sources.
Is MongoDB a good candidate for this? I'm torn between writing to a database and just keeping a log file per source and parsing them separately.
Any other suggestions for logging a lot of data?
I would say that MongoDB is a very good fit for collecting logs, because:

- **Fast inserts.** 150 writes per second is well within MongoDB's capacity, and you can relax the write concern for log data to make inserts even cheaper.
- **Capped collections.** These are fixed-size collections with automatic FIFO rollover; they were designed with logging in mind and keep insert performance high while bounding disk usage.
- **Schemaless documents.** Each of your 50 sources can log slightly different fields without any migrations.
- **Querying in one place.** You can filter and aggregate across all sources with a single query instead of parsing 50 separate files.
So, my opinion is that MongoDB fits this use case well: you don't need to manage a lot of log files in the file system, because MongoDB handles that for you.
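As a minimal sketch of what this could look like with `pymongo`: the database/collection names (`logs`, `events`), the document fields, and the capped-collection size below are all illustrative choices, not anything MongoDB requires, and the snippet assumes a `mongod` running locally.

```python
# Sketch: writing per-source log entries to MongoDB with pymongo.
# The names "logs"/"events" and the document fields are illustrative only.
import datetime


def make_log_entry(source, message, level="INFO"):
    """Build one log document; MongoDB stores it as-is (schemaless)."""
    return {
        "source": source,      # one of the ~50 writers
        "level": level,
        "message": message,
        "ts": datetime.datetime.now(datetime.timezone.utc),
    }


if __name__ == "__main__":
    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    db = client.logs
    # A capped collection bounds disk use and rolls over the oldest
    # entries automatically -- convenient for logs.
    if "events" not in db.list_collection_names():
        db.create_collection("events", capped=True, size=512 * 1024 * 1024)
    db.events.insert_one(make_log_entry("web-01", "request handled"))
```

Each source just calls `insert_one` with its own `source` field, and you can later query or aggregate by source, level, or time range.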