My current project is to take information from an OleDb database and .CSV files and place it all into a larger OleDb database.

I have currently read in all the information I need from both the .CSV files and the OleDb database into DataTables. Where it is getting hairy is writing all of that information back out to another OleDb database.
Right now my current method is to do something like this:
OleDbTransaction myTransaction = null;
try
{
    OleDbConnection conn = new OleDbConnection("PROVIDER=Microsoft.Jet.OLEDB.4.0;" +
        "Data Source=" + Database);
    conn.Open();
    // The transaction has to be started on the open connection
    // before it can be assigned to a command.
    myTransaction = conn.BeginTransaction();
    OleDbCommand command = conn.CreateCommand();
    command.Transaction = myTransaction;
    string strSQL = "Insert into TABLE " +
        "(FirstName, LastName) values ('" +
        FirstName + "', '" + LastName + "')";
    command.CommandType = CommandType.Text;
    command.CommandText = strSQL;
    command.ExecuteNonQuery();
    myTransaction.Commit();
    conn.Close();
}
catch (Exception)
{
    // If invalid data is entered, roll back the transaction
    if (myTransaction != null)
        myTransaction.Rollback();
}
Of course, this is very basic and I'm using a SQL command to commit my transactions over a connection. My problem is that I could do this, but I have about 200 fields that need to be inserted across several tables. I'm willing to do the leg work if that's the only way to go, but I feel like there is an easier method. Is there anything in LINQ that could help me out with this?
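As a side note on the pattern above: the same insert can be written with OleDb parameters, which sidesteps the quoting and escaping problems of string concatenation and makes the 200-field statements easier to generate. A minimal, untested sketch reusing the question's names (TABLE, FirstName, LastName, Database are placeholders from the question):

```csharp
using System.Data;
using System.Data.OleDb;

using (OleDbConnection conn = new OleDbConnection(
    "PROVIDER=Microsoft.Jet.OLEDB.4.0;Data Source=" + Database))
{
    conn.Open();
    using (OleDbTransaction tx = conn.BeginTransaction())
    {
        try
        {
            OleDbCommand command = conn.CreateCommand();
            command.Transaction = tx;
            // OleDb uses positional (?) placeholders; parameters are
            // bound in the order they are added, not by name.
            command.CommandText =
                "INSERT INTO [TABLE] (FirstName, LastName) VALUES (?, ?)";
            command.Parameters.AddWithValue("@p1", FirstName);
            command.Parameters.AddWithValue("@p2", LastName);
            command.ExecuteNonQuery();
            tx.Commit();
        }
        catch
        {
            tx.Rollback();
            throw;
        }
    }
}
```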
It sounds like you have many .mdb and .csv files that you need to merge into a single .mdb. This answer is running with that assumption, and that you have SQL Server available to you. If you don't, then consider downloading SQL Server Express.
Use SQL Server to act as the broker between your multiple datasources and your target datastore. Script each datasource as an insert into a SQL Server holding table. When all data is loaded into the holding table, perform a final push into your target Access datastore.
Consider these steps:

1. Create a holding table in SQL Server:

        CREATE TABLE CsvImport (
            CustomerID smallint,
            LastName varchar(40),
            BirthDate smalldatetime
        )

2. Create a stored procedure that bulk-loads a CSV file into the holding table:

        CREATE PROC ReadFromCSV @CsvFilePath varchar(1000) AS
        BULK INSERT CsvImport FROM @CsvFilePath --'c:\some.csv'
        WITH (
            FIELDTERMINATOR = ',', -- your own specific terminators should go here
            ROWTERMINATOR = '\n'
        )
        GO

3. Call the procedure once per CSV file. dir-piped commands can help you create these statements:

        exec ReadFromCSV 'c:\1.csv'

4. For each source .mdb, add a linked server and SELECT its data into the holding table:

        DECLARE @MdbFilePath varchar(1000);
        SELECT @MdbFilePath = 'C:\MyMdb1.mdb';
        EXEC master.dbo.sp_addlinkedserver
            @server = N'MY_ACCESS_DB_',
            @srvproduct = N'Access',
            @provider = N'Microsoft.Jet.OLEDB.4.0',
            @datasrc = @MdbFilePath

5. When the holding table is fully loaded, SELECT the data from SQL Server into Access through a linked server on the target:

        EXEC master.dbo.sp_addlinkedserver
            @server = N'MY_ACCESS_TARGET',
            @srvproduct = N'Access',
            @provider = N'Microsoft.Jet.OLEDB.4.0',
            @datasrc = 'C:\Target.mdb'
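The per-file exec ReadFromCSV statements mentioned above can also be generated mechanically with a few lines of C#; a small sketch, where the csvDir path is an assumption:

```csharp
using System;
using System.IO;

// Emit one "exec ReadFromCSV '...'" line per CSV file in a folder,
// ready to paste into a SQL Server query window.
string csvDir = @"C:\csv";  // hypothetical folder holding the .csv files
foreach (string path in Directory.GetFiles(csvDir, "*.csv"))
{
    Console.WriteLine("exec ReadFromCSV '" + path + "'");
}
```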
If the column names in the DataTable match the column names in the destination table exactly, then you might be able to use an OleDbCommandBuilder (warning: I haven't tested this yet). One area where you may run into problems is if the data types of the source DataTable do not match those of the destination table (e.g. if the source column data types are all strings).

EDIT: I revised my original code in a number of ways. First, I switched to using the Merge method on a DataTable. This allowed me to skip using LoadDataRow in a loop.

ADDITION: An alternate solution would be to use a framework like FileHelpers to read the CSV file and post it into your database. It does have an OleDbStorage DataLink for posting into OleDb sources. See the SqlServerStorage InsertRecord example to see how (in the end, substitute OleDbStorage for SqlServerStorage).
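The OleDbCommandBuilder-plus-Merge idea mentioned above might be sketched like this (untested; the WriteTable name, connection string, and parameters are illustrative assumptions, not code from the answer):

```csharp
using System.Data;
using System.Data.OleDb;

// Push a filled DataTable into an Access table whose column names match,
// letting OleDbCommandBuilder generate the INSERT command. This assumes
// the rows in sourceTable are still in the Added state (e.g. built with
// DataTable.Rows.Add); Merge preserves row state, and Update only writes
// Added/Modified/Deleted rows.
void WriteTable(DataTable sourceTable, string targetMdb, string targetTable)
{
    string connStr = "PROVIDER=Microsoft.Jet.OLEDB.4.0;Data Source=" + targetMdb;
    using (OleDbConnection conn = new OleDbConnection(connStr))
    using (OleDbDataAdapter adapter = new OleDbDataAdapter(
        "SELECT * FROM [" + targetTable + "]", conn))
    using (OleDbCommandBuilder builder = new OleDbCommandBuilder(adapter))
    {
        // Load the destination schema, merge the source rows into it,
        // then let the adapter insert every pending row.
        DataTable destTable = new DataTable();
        adapter.FillSchema(destTable, SchemaType.Source);
        destTable.Merge(sourceTable);
        adapter.Update(destTable);
    }
}
```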