What I want to do is build a dynamic data pull from different SQL source servers (Server1, Server2, Server3, etc.) down to dynamic locations on my SQL Server (Dev, Prod), into databases (database1, database2, etc.). The tables will be dropped and recreated each time the package runs, so that I always match the source servers if they change anything on the source (field names, datatypes, lengths, etc.) and I still get the data to extract. I want to pull this down using a single data flow inside a Foreach Loop.
I have a control table that holds all the server, database, and table names, and I want to loop through it and pull every table it lists down to my server (server1.database1.table_x, server5.database3.table_y, etc.), so that I don't have to build a new data flow for each table.
In order to do this I have already built the Foreach Loop, with an Execute SQL Task that dumps its results into an Object variable. The Foreach Loop then takes that object, which has seven fields (Source_Server_Name, Source_Server_Type_Driver, Source_Database, Source_Table, Source_Where_Clause, Source_Connection_String, and then the destination equivalents), and puts each field into a separate String variable for use inside the loop.
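For what it's worth, here is a minimal sketch of how a Script Task inside the loop could assemble the per-row source query from those variables. The variable names are assumptions based on the fields above, and User::Source_Query is a hypothetical extra variable (it would have to be listed under the task's ReadWriteVariables):

    // Inside the Script Task's Main() method (C#).
    // Assumes the source tables live in the dbo schema and that
    // Source_Where_Clause holds the full "WHERE ..." text or is empty.
    string db    = Dts.Variables["User::Source_Database"].Value.ToString();
    string table = Dts.Variables["User::Source_Table"].Value.ToString();
    string where = Dts.Variables["User::Source_Where_Clause"].Value.ToString();

    string sourceQuery = string.Format(
        "SELECT * FROM [{0}].dbo.[{1}] {2}", db, table, where).Trim();

    // Hand the query back to the package, e.g. for a source component
    // whose data access mode is "SQL command from variable".
    Dts.Variables["User::Source_Query"].Value = sourceQuery;
    Dts.TaskResult = (int)ScriptResults.Success;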
I can change the connections dynamically using the variables, but I can't figure out how to get the column mappings in the data flow to work. Is there some kind of Script Task I can use to edit the backend XML so that it creates the column mappings for me and the metadata doesn't error out? Any help would be greatly appreciated :-)
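In case it helps frame the question, here is a rough, untested sketch of what I imagine the Script Task alternative would look like if I skipped the data flow entirely and did the drop/recreate and copy myself with ADO.NET and SqlBulkCopy. Dest_Connection_String, Dest_Table, and Source_Query are placeholder variable names (the destination fields from my control table), and the type handling below only covers common types:

    // Rough sketch of a Script Task that copies one table without a
    // data flow, so there are no fixed column mappings to break.
    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Text;

    public void Main()
    {
        string srcConnStr  = Dts.Variables["User::Source_Connection_String"].Value.ToString();
        string destConnStr = Dts.Variables["User::Dest_Connection_String"].Value.ToString();
        string sourceQuery = Dts.Variables["User::Source_Query"].Value.ToString();
        string destTable   = Dts.Variables["User::Dest_Table"].Value.ToString();

        using (var src = new SqlConnection(srcConnStr))
        using (var dest = new SqlConnection(destConnStr))
        {
            src.Open();
            dest.Open();

            using (var reader = new SqlCommand(sourceQuery, src).ExecuteReader())
            {
                // Rebuild the destination from the source's current shape so
                // renamed/retyped columns on the source can't break the load.
                var ddl = new StringBuilder();
                ddl.AppendFormat("IF OBJECT_ID(N'{0}') IS NOT NULL DROP TABLE {0}; ", destTable);
                ddl.AppendFormat("CREATE TABLE {0} (", destTable);

                foreach (DataRow col in reader.GetSchemaTable().Rows)
                {
                    string typeName = (string)col["DataTypeName"];
                    int size = (int)col["ColumnSize"];
                    string typeSpec;

                    // Simplified type handling: common types only.
                    if (typeName == "varchar" || typeName == "nvarchar" ||
                        typeName == "char"    || typeName == "nchar"    ||
                        typeName == "binary"  || typeName == "varbinary")
                        typeSpec = string.Format("{0}({1})", typeName,
                            size > 8000 ? "max" : size.ToString());
                    else if (typeName == "decimal" || typeName == "numeric")
                        typeSpec = string.Format("{0}({1},{2})", typeName,
                            col["NumericPrecision"], col["NumericScale"]);
                    else
                        typeSpec = typeName; // int, bigint, datetime, bit, ...

                    ddl.AppendFormat("[{0}] {1},", col["ColumnName"], typeSpec);
                }
                ddl.Length--;               // drop the trailing comma
                ddl.Append(");");

                using (var cmd = new SqlCommand(ddl.ToString(), dest))
                    cmd.ExecuteNonQuery();

                // Stream the rows across; columns line up by ordinal because
                // the table was just created from this reader's schema.
                using (var bulk = new SqlBulkCopy(dest))
                {
                    bulk.DestinationTableName = destTable;
                    bulk.BulkCopyTimeout = 0;
                    bulk.WriteToServer(reader);
                }
            }
        }

        Dts.TaskResult = (int)ScriptResults.Success;
    }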
This is the best illustrated example I could find of what I am doing; just remember that I need a different metadata setup for each table I pull down to my server:
http://sql-bi-dev.blogspot.com/2010/07/dynamic-database-connection-using-ssis.html