I have to update a big table, setting over 270 fields per update.
I am relatively new to .NET and need advice on what is better to use in this case: SqlCommand, some kind of memory-mapped table, a DataSet, or maybe some kind of auto-generated objects built from the DB metadata? Please help.
Reason: I have an old, big Delphi 7 application, part of which listens on a socket for packets that are marshaled into big structures and, as a final step, stored in the DB. I am now porting this part to a new C# service, and for now at least I have to preserve the same logic. The problem is that the structure is BIG (over 220 fields) and the tables where it is stored have nearly 300 fields. From my structure of 220 fields another ~50 fields are deduced/calculated, and all of them have to be updated in the DB. The actual Delphi code is ugly, and it grew over several years just like the table itself; it looks something like this:
'UPDATE TABLE_NAME SET ' +
'  MSG_TYPE = ' + IntToStr(integer(RecvSruct.MSG_TYPE)) + ' ' +
', SomeFLOATfield = ' + FloatToStr(RecvSruct.SomeFLOATfield) + ' ' +
... // and over 270 other fields here
' WHERE ID = ' + IntToStr(obj.ID)
There is no dynamic SQL or anything like that. I cannot change the DB structure, so I can only play with the code, and I am not sure whether a straight translation is appropriate. The table is used for some reports and statistics. Some of the calculated/deduced fields depend on constants defined in the source code.
Dev tools used: MS SQL Server 2000, C# .NET 2.0, VS2008
No special rabbit to pull out of the .net hat for this one I'm afraid.
Short of the complexity of "knowing" that only some completely independent fields have changed and constructing an update statement for just those, you are stuffed.
Even knowing this would have been better stored as a blob doesn't really help you, and it is likely not true in any case.
A parameterised query or stored procedure would look a little neater in the code, but that could have been done in Delphi anyway.
No way to say from here how it should be done, but one idea that may have mileage is hiding the current table from everything but a small set of functions.
For instance, you could rename the table and then create a view with the current name. Nothing that reads from it (and possibly a good bit of the code that writes to it) would notice. Once the raw table is only accessed via the view and some stored procedures, you can start hacking away at the structure.
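A minimal sketch of that idea, assuming the table is called TABLE_NAME (all names here are placeholders, not from the question):

```sql
-- Move the physical table out of the way (SQL Server 2000 syntax)
EXEC sp_rename 'TABLE_NAME', 'TABLE_NAME_RAW'
GO

-- Re-expose it under the old name so existing readers keep working.
-- SELECT * keeps the sketch short; listing the columns explicitly
-- is safer once you start changing the underlying structure.
CREATE VIEW TABLE_NAME
AS
    SELECT * FROM TABLE_NAME_RAW
GO
```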
A code-only (no SQL) option would be to insert an ORM-style access layer between the application(s) and the table. That decision should be based on your skill set and the mix of applications more than anything else.
Unless you can, and are prepared to, decouple all the applications from the specific implementation that is this table, you are simply polishing faeces. There's no point in spending valuable resources on that.
The simplest solution applies here, because the way OLE DB works is with strings. So to pass 270, 500, or 1000 parameters, all you really send is a single string, and a string containing 270 parameters is probably well under 2 KB... which in modern computing... carry over the 1... has no meaningful performance penalty. There's an XML solution here too, but that's apples and oranges: you're still passing a string, and it would require extra code to handle the XML. So your architecture should look like:
A stored procedure on SQL Server with 270 input parameters:
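A heavily truncated sketch; the procedure name is made up, the two columns are the ones from the question's snippet, and the remaining ~267 parameters follow the same pattern:

```sql
CREATE PROCEDURE usp_UpdateBigTable
    @ID             int,
    @MSG_TYPE       int,
    @SomeFLOATfield float
    -- ... the remaining ~267 parameters go here
AS
UPDATE TABLE_NAME
SET MSG_TYPE       = @MSG_TYPE,
    SomeFLOATfield = @SomeFLOATfield
    -- ... and the remaining columns
WHERE ID = @ID
GO
```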
A command object with 270 parameters:
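And a matching command object on the C# side; a sketch only (the connection string, procedure name, and the `recv`/`obj` variables are assumptions standing in for your unmarshaled struct), using nothing newer than .NET 2.0 ADO.NET:

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: call the stored procedure with typed parameters.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("usp_UpdateBigTable", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;

    cmd.Parameters.Add("@ID", SqlDbType.Int).Value = obj.ID;
    cmd.Parameters.Add("@MSG_TYPE", SqlDbType.Int).Value = (int)recv.MSG_TYPE;
    cmd.Parameters.Add("@SomeFLOATfield", SqlDbType.Float).Value = recv.SomeFLOATfield;
    // ... the remaining ~267 parameters follow the same pattern

    conn.Open();
    cmd.ExecuteNonQuery();
}
```

Tedious, but each parameter is typed and there is no string concatenation, so no injection and no locale-dependent float formatting.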
Remember, you're still doing a pretty intensive operation, but your benchmark should be the old application. If it's a little bit worse, I wouldn't worry about it, since the framework carries more computing overhead.
Using Simple.Data could simplify your code and logic, though it requires .NET 4.0.
OK. Since you can add a new stored procedure, I would suggest packaging up all the values and shipping them down as XML to your stored procedure.
You can find a rough example here: http://granadacoder.wordpress.com/2009/01/27/bulk-insert-example-using-an-idatareader-to-strong-dataset-to-sql-server-xml/
The good news is that the example I have is older and coded against SQL Server 2000 (with OPENXML).
This would be better than sending down 300 parameters to a stored procedure, IMHO.
The other advantage: if you have more than one row of data, you can ship that down as well.
The "gist" of it:
First, you can get the 2000 "pubs" database here:
http://www.microsoft.com/en-us/download/details.aspx?id=23654
Now add this stored procedure:
/* USP */
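A sketch of what such a USP could look like against the pubs database, assuming the XML carries a list of authors to update (the procedure name and XML shape are illustrative, not taken from the linked post; the authors columns are real pubs columns):

```sql
CREATE PROCEDURE uspUpdateAuthorsFromXml
    @xmlDoc ntext
AS
DECLARE @docHandle int

-- Parse the XML once into an internal representation
EXEC sp_xml_preparedocument @docHandle OUTPUT, @xmlDoc

-- Flag 1 = attribute-centric mapping
UPDATE authors
SET au_fname = x.au_fname,
    au_lname = x.au_lname
FROM OPENXML(@docHandle, '/root/author', 1)
     WITH (au_id    varchar(11),
           au_fname varchar(20),
           au_lname varchar(40)) AS x
WHERE authors.au_id = x.au_id

-- Always release the document handle
EXEC sp_xml_removedocument @docHandle
GO
```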
/* Example usage */
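Example usage against the sketched procedure above (the two au_id values exist in the stock pubs data; the names here are just illustrative update values):

```sql
EXEC uspUpdateAuthorsFromXml
    N'<root>
        <author au_id="172-32-1176" au_fname="Johnson" au_lname="White" />
        <author au_id="213-46-8915" au_fname="Marjorie" au_lname="Green" />
      </root>'
```

On the C# side you would build that XML string from your 270 values and pass it as a single parameter.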
You could split the table into new tables, then create views with the same names as the old tables that join, switch, and cast as needed to present the new tables in the old structure for the reports.
If you do build commands (like in the Delphi code you posted), use parameters to prevent SQL injection.
With the current DB structure, using an out-of-the-box ORM could be tiring, since you have loads of columns to map. You could create POCO classes as a type-safe model, then use data annotations or custom attributes to make the mapping simpler, and generate the SQL commands on the fly from the attributes.
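A minimal C# 2.0-compatible sketch of that idea; the `ColumnAttribute`, `Message` class, and column names are all hypothetical, standing in for your 220-field struct:

```csharp
using System;
using System.Reflection;
using System.Text;

// Hypothetical attribute marking which DB column a member maps to.
[AttributeUsage(AttributeTargets.Field)]
public class ColumnAttribute : Attribute
{
    public readonly string Name;
    public ColumnAttribute(string name) { Name = name; }
}

public class Message // stands in for the big received/calculated structure
{
    [Column("MSG_TYPE")]       public int MsgType;
    [Column("SomeFLOATfield")] public double SomeFloat;
    // ... one attributed member per mapped column
}

public static class UpdateBuilder
{
    // Reflects over the attributed fields and emits
    // "UPDATE <table> SET col = @col, ... WHERE ID = @ID"
    public static string Build(string table, Type t)
    {
        StringBuilder sb = new StringBuilder("UPDATE " + table + " SET ");
        bool first = true;
        foreach (FieldInfo f in t.GetFields())
        {
            object[] attrs = f.GetCustomAttributes(typeof(ColumnAttribute), false);
            if (attrs.Length == 0) continue;
            string col = ((ColumnAttribute)attrs[0]).Name;
            if (!first) sb.Append(", ");
            sb.Append(col).Append(" = @").Append(col);
            first = false;
        }
        sb.Append(" WHERE ID = @ID");
        return sb.ToString();
    }
}
```

The parameter values would be filled the same way: walk the attributed fields via reflection and add one SqlParameter per `[Column]` member, so adding a column means adding one attributed field rather than editing a 270-line SQL string.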