As a preface, I am brand new to using SQL Server 2005; I know how to use the SELECT, UPDATE, DELETE, and INSERT commands, and that's about it. I am also using Express Edition on my local PC (E8400 processor, 8GB of DDR2-800, 2 x 640GB SATA-II HDDs in RAID 1).
I have a table that I set up with 8 columns, all are NVARCHAR(Max) and I allow Null. I know in concept what a primary key is, but I don't have one (nor do I know how to set one up).
The VB.NET program I'm working on downloads a historical stock price chart from Yahoo for every single ticker symbol in existence. The first 50,000 rows or so that I added went in super fast. Then I went to bed, and when I woke up it was still running - but the rate of row additions had slowed way down; I noticed this around row 300,000. I always expected the rate of row addition to be constant over time, but obviously this is not so!
From browsing other Stack Overflow questions, I suspect my slowdown is related to my piss-poor table setup. If this is the case, where should I begin to fix it, and are there any good resources I could read to get started? I'm hoping this is something simple I can fix :)
In case it matters, this is how I'm adding rows:
cmdtext = "IF NOT EXISTS(SELECT DateStamp FROM DailyPrice WHERE (DateStamp = '" + datestamp + "' AND Ticker = '" + ticker + "')) INSERT INTO DailyPrice (Ticker,OpenPrice,ClosePrice,HighPrice,LowPrice,AdjustedClose,Volume,DateStamp) VALUES('" + ticker + "','" + openprice + "','" + closeprice + "','" + highprice + "','" + lowprice + "','" + adjustedclose + "','" + volume + "','" + datestamp + "')"
cmd = New SqlCommand(cmdtext, conn)
howmanygotinserted = cmd.ExecuteNonQuery
I iterate through that for every stinking row of the CSV file, which is around 30,000 rows per CSV file (and I have over 5000 of them).
You have at least two problems:
You should have an index on (Ticker,DateStamp) and the check will be much faster. That being said, I wouldn't even do that check. If the INSERT fails, it fails. No biggie.
If you're running inside a transaction and not committing or doing save points then the temporary storage will get huge as you insert rows (because the database needs to be able to rollback any changes). Commit every 1000 rows or so. Either that or don't run inside a transaction.
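If you do use an explicit transaction, a minimal sketch of committing in batches might look like this (csvLines and rowsInBatch are made-up names; conn, cmd and cmdtext are from the question):

Dim tran As SqlTransaction = conn.BeginTransaction()
Dim rowsInBatch As Integer = 0
For Each line As String In csvLines            ' stand-in for the existing CSV loop
    ' ... build cmdtext for this row as before ...
    cmd = New SqlCommand(cmdtext, conn, tran)  ' enlist the command in the open transaction
    cmd.ExecuteNonQuery()
    rowsInBatch += 1
    If rowsInBatch >= 1000 Then                ' commit every 1000 rows so rollback data stays small
        tran.Commit()
        tran = conn.BeginTransaction()
        rowsInBatch = 0
    End If
Next
tran.Commit()                                  ' commit the final partial batch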
Now, the next issue is how you're constructing the INSERT statement. You don't want to build it with string concatenation; you want to use parameters. Concatenation is a bad practice to get into (in Web applications it is a huge cause of SQL injection vulnerabilities). Take a look at Insert command with parameters and use something like:
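For example (a sketch; the parameter names simply mirror the variables already used in the question):

cmdtext = "IF NOT EXISTS(SELECT 1 FROM DailyPrice WHERE DateStamp = @DateStamp AND Ticker = @Ticker) " & _
          "INSERT INTO DailyPrice (Ticker, OpenPrice, ClosePrice, HighPrice, LowPrice, AdjustedClose, Volume, DateStamp) " & _
          "VALUES (@Ticker, @OpenPrice, @ClosePrice, @HighPrice, @LowPrice, @AdjustedClose, @Volume, @DateStamp)"
cmd = New SqlCommand(cmdtext, conn)
cmd.Parameters.AddWithValue("@Ticker", ticker)
cmd.Parameters.AddWithValue("@OpenPrice", openprice)
cmd.Parameters.AddWithValue("@ClosePrice", closeprice)
cmd.Parameters.AddWithValue("@HighPrice", highprice)
cmd.Parameters.AddWithValue("@LowPrice", lowprice)
cmd.Parameters.AddWithValue("@AdjustedClose", adjustedclose)
cmd.Parameters.AddWithValue("@Volume", volume)
cmd.Parameters.AddWithValue("@DateStamp", datestamp)
howmanygotinserted = cmd.ExecuteNonQuery()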
Lastly, you should add a primary key. You could use (Ticker, DateStamp) for this, but I personally favour technical primary keys, meaning a primary key that has no external meaning; an auto-increment integer field is the most common example. Adding (Ticker, DateStamp) as a primary key will add the index I was referring to above. It's the index that really makes the difference.
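A rough T-SQL sketch of that layout (the constraint and index names are made up, and the Ticker/DateStamp columns would first have to be changed from NVARCHAR(MAX) to something smaller, because MAX columns cannot be index key columns):

ALTER TABLE DailyPrice ADD Id INT IDENTITY(1,1) CONSTRAINT PK_DailyPrice PRIMARY KEY   -- technical (surrogate) key
CREATE UNIQUE INDEX IX_DailyPrice_Ticker_DateStamp ON DailyPrice (Ticker, DateStamp)   -- the index that speeds up the duplicate check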
In terms of database problems:
- Since you check DateStamp and Ticker before every insert, you could add a unique index on those columns. That would force the RDBMS to check for an existing record before inserting anything, and it will probably be a lot faster than your current version, because it will use an index and is optimized for this.
- NVARCHAR(MAX) everywhere: find out the appropriate data type for every column and reduce the max length of varchar columns to about what you actually expect to be in the database. This should make SELECTs and the built-in index management work faster.
- You build the SQL string by concatenation, so any value containing a ' character would blow your SQL query; use parameters instead.
However, I would also double check your CSV reader code. What if the real problem is not even the database but your application needing too much memory and causing swapping?
As you already discovered, the rate of row addition is not constant over time - and it might even get "worse" if you apply a lot of indexes to that table, since those kinda slow down INSERT statements.
In addition to what others have said, you also have a problem in that you are inserting rows one at a time. It is far better to use bulk insert or an SSIS package. Never insert (or update or delete) large amounts of data by iterating through the rows; databases are designed to work with sets of data and perform far better when doing so.
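One set-based option from the .NET side is SqlBulkCopy; a minimal sketch, assuming the CSV has already been parsed into a DataTable (every name other than DailyPrice and the question's column and variable names is made up):

Dim table As New DataTable()
For Each colName As String In New String() {"Ticker", "OpenPrice", "ClosePrice", "HighPrice", "LowPrice", "AdjustedClose", "Volume", "DateStamp"}
    table.Columns.Add(colName, GetType(String))
Next
' one Rows.Add per CSV line, e.g.:
' table.Rows.Add(ticker, openprice, closeprice, highprice, lowprice, adjustedclose, volume, datestamp)

Using bulk As New SqlBulkCopy(conn)
    bulk.DestinationTableName = "DailyPrice"
    bulk.BatchSize = 1000                                         ' send the rows to the server in chunks
    For Each col As DataColumn In table.Columns
        bulk.ColumnMappings.Add(col.ColumnName, col.ColumnName)   ' map by name, not by ordinal
    Next
    bulk.WriteToServer(table)
End Using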
The first thing you should do is to create an index on your table. You're causing a complete table scan every time you want to add a new row (because of the SELECT DateStamp FROM DailyPrice ... WHERE ... statement), looking for an existing value.
Anyway, since you're checking for a record with a specific DateStamp and Ticker value for every insert, you could make DateStamp and Ticker the primary key of your table; that way, the database will index these two columns, and you won't need to do the IF NOT EXISTS part... the insert will fail if there is already a row with that key in the table. Keep in mind, however, that this will cause an exception in your VB.NET program, so you'll have to handle it.
Another thing you can do is to change the datatypes of the columns, especially the ones that are indexed. You could change Ticker to nvarchar(X) with X being some fixed value (nvarchar(250), for example), if you can estimate the length of the strings to be stored there. Also, you could change DateStamp to DateTime.
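For instance (a sketch; the lengths are only guesses, and the ALTER COLUMN statements will only succeed if every existing value fits or converts to the new type):

ALTER TABLE DailyPrice ALTER COLUMN Ticker NVARCHAR(250)    -- or whatever maximum length tickers actually need
ALTER TABLE DailyPrice ALTER COLUMN DateStamp DATETIME      -- assumes the stored date strings convert cleanly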
Eight columns, all NVARCHAR(MAX) and all nullable - there's your first problem. Databases work best if you tell them what type of data you have, and choose the smallest datatype that works for your data; NVARCHAR(MAX) is about the most inefficient choice you could have made.
The IF NOT EXISTS check is your second problem. On every insert you are checking to see whether you have already inserted a row that has the same values as another row in certain columns. Because you haven't told the database to index those columns, it has to check the entire table each time, so your query gets slower and slower as the table grows. To add a primary key to an existing table you can use this:
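A sketch of that, keyed on the (Ticker, DateStamp) pair the duplicate check already uses:

-- hypothetical constraint name; assumes Ticker and DateStamp have already been changed
-- to NOT NULL, non-MAX types (MAX columns cannot be key columns)
ALTER TABLE DailyPrice ADD CONSTRAINT PK_DailyPrice PRIMARY KEY (Ticker, DateStamp)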
See here for more info.