Question:
I am looking for help importing a .csv file into SQL Server using BULK INSERT, and I have a few basic questions.
Issues:
The CSV file data may contain commas within a field (e.g. the Description column), so how can the import handle this data?
If the client creates the CSV from Excel, then the fields that contain commas are enclosed in " " (double quotes) [as in the example below], so how can the import handle this?
How do we track which rows have bad data that the import skips? (Does the import skip rows that are not importable?)
Here is the sample CSV with header:
Name,Class,Subject,ExamDate,Mark,Description
Prabhat,4,Math,2/10/2013,25,Test data for prabhat.
Murari,5,Science,2/11/2013,24,"Test data for his's test, where we can test 2nd ROW, Test."
sanjay,4,Science,,25,Test Only.
And the SQL statement to import:
BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',  --CSV field delimiter
    ROWTERMINATOR = '\n',   --Use to shift the control to next row
    TABLOCK
)
Answer 1:
Based on SQL Server CSV Import:
1) The CSV file data may contain commas within a field (e.g. the Description column), so how can the import handle this data?
Solution
If you're using , (comma) as the delimiter, then there is no way to differentiate between a comma as a field terminator and a comma in your data. I would use a different FIELDTERMINATOR such as ||. The code would look like the sketch below, and this will handle commas and single slashes in the data perfectly.
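A minimal sketch, assuming the file has been re-exported with || as the field delimiter (table name and path taken from the question):

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '||',  --double-pipe delimiter instead of comma
    ROWTERMINATOR = '\n',
    TABLOCK
)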
2) If the client creates the CSV from Excel, then the fields that contain commas are enclosed in " ... " (double quotes) [as in the example below], so how can the import handle this?
Solution
If you're using BULK INSERT, then there is no way to handle the double quotes; the data will be inserted into the rows with the double quotes still attached. After inserting the data into the table, you can replace those double quotes with '' (an empty string):
update table
set columnhavingdoublequotes = replace(columnhavingdoublequotes,'"','')
3) How do we track which rows have bad data that the import skips? (Does the import skip rows that are not importable?)
Solution
Rows which aren't loaded into the table because of invalid data or format can be handled using the ERRORFILE property: specify the error file name, and the rows that fail will be written to that error file. The code should look like this:
BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',  --CSV field delimiter
    ROWTERMINATOR = '\n',   --Use to shift the control to next row
    ERRORFILE = 'C:\CSVDATA\SchoolsErrorRows.csv',
    TABLOCK
)
Answer 2:
You first need to create a table in your database into which you will import the CSV file.
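A minimal sketch of such a staging table, with column names taken from the sample CSV in the question (the VARCHAR-only types are an assumption to keep the first load simple; tighten them once the data is clean):

CREATE TABLE SchoolsTemp
(
    Name        VARCHAR(50),
    Class       VARCHAR(10),
    Subject     VARCHAR(50),
    ExamDate    VARCHAR(20),   --staged as text; convert to DATE later
    Mark        VARCHAR(10),
    Description VARCHAR(200)
)

After the table is created, follow the steps below.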
• Log into your database using SQL Server Management Studio.
• Right click on your database and select Tasks -> Import Data...
• Click the Next > button.
• For the Data Source, select Flat File Source. Then use the Browse button to select the CSV file. Spend some time configuring how you want the data to be imported before clicking on the Next > button.
• For the Destination, select the correct database provider (e.g. for SQL Server 2012, you can use SQL Server Native Client 11.0). Enter the Server name. Check the Use SQL Server Authentication radio button. Enter the User name, Password, and Database before clicking on the Next > button.
• On the Select Source Tables and Views window, you can Edit Mappings before clicking on the Next > button.
• Check the Run immediately check box and click on the Next > button.
• Click on the Finish button to run the package.
The above was found on this website (I have used and tested it):
Answer 3:
2) If the client creates the CSV from Excel, then the fields that contain commas are enclosed in " ... " (double quotes) [as in the example below], so how can the import handle this?
You should use the FORMAT = 'CSV', FIELDQUOTE = '"' options (available in SQL Server 2017 and later):
BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.csv'
WITH
(
    FORMAT = 'CSV',
    FIELDQUOTE = '"',
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',  --CSV field delimiter
    ROWTERMINATOR = '\n',   --Use to shift the control to next row
    TABLOCK
)
Answer 4:
The best, quickest and easiest way to resolve the comma-in-data issue is to use Excel to save a comma separated file after having set Windows' list separator setting to something other than a comma (such as a pipe). This will generate a pipe (or whatever) separated file that you can then import, for example as sketched below. This is described here.
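A minimal sketch of the corresponding import, assuming the file was re-saved from Excel as C:\CSVData\Schools.txt after setting | as the list separator (the file name and delimiter are assumptions):

BULK INSERT SchoolsTemp
FROM 'C:\CSVData\Schools.txt'
WITH
(
    FIRSTROW = 2,
    FIELDTERMINATOR = '|',  --pipe produced by the changed Windows list separator
    ROWTERMINATOR = '\n',
    TABLOCK
)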
Answer 5:
Import a CSV file into a database using SQL Server Management Studio.
- First, create a table in your database into which you will import the CSV file.
- Log in to your database using SQL Server Management Studio.
- Right click the database and select Tasks -> Import Data...
- Click the Next > button.
- For Data Source, select Flat File Source. Then use the Browse button to select the CSV file. Spend some time configuring the data import before clicking the Next > button.
- For Destination, select the correct database provider (e.g. for SQL Server 2012, you can use SQL Server Native Client 11.0). Enter the Server name; check Use SQL Server Authentication, enter the User name, Password, and Database before clicking the Next > button.
- In the Select Source Tables and Views window, you can Edit Mappings before clicking the Next > button.
- Check Run immediately and click the Next > button.
- Click the Finish button to run the package.
Reference
Answer 6:
Here's how I would solve it:
Just save your CSV file as an XLS sheet in Excel (by doing so, you won't have to worry about delimiters; Excel's spreadsheet format will be read as a table and imported directly into a SQL table, as sketched after this list).
Import the file using SSIS.
Write a custom script in the import manager to omit/modify the data you're looking for (or run a master script to scrutinize the data you're looking to remove).
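As an aside, the same "spreadsheet read as a table" idea can also be reached straight from T-SQL via OPENROWSET with the ACE OLE DB provider. A rough sketch, assuming the workbook was saved as C:\CSVData\Schools.xlsx with a Sheet1 tab, and that the provider and 'Ad Hoc Distributed Queries' are enabled on the server (path, sheet and table names are assumptions):

SELECT *
INTO SchoolsTempFromExcel   --hypothetical new table created from the sheet
FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                'Excel 12.0;Database=C:\CSVData\Schools.xlsx;HDR=YES',
                'SELECT * FROM [Sheet1$]')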
Good Luck.
Answer 7:
First you need to import the CSV file into a DataTable.
Then you can insert the rows in bulk using SqlBulkCopy. (The sample below populates the DataTable by hand; in practice you would fill it from your CSV file.)
using System;
using System.Data;
using System.Data.SqlClient;

namespace SqlBulkInsertExample
{
    class Program
    {
        static void Main(string[] args)
        {
            DataTable prodSalesData = new DataTable("ProductSalesData");

            // Create Column 1: SaleDate
            DataColumn dateColumn = new DataColumn();
            dateColumn.DataType = Type.GetType("System.DateTime");
            dateColumn.ColumnName = "SaleDate";

            // Create Column 2: ProductName
            DataColumn productNameColumn = new DataColumn();
            productNameColumn.ColumnName = "ProductName";

            // Create Column 3: TotalSales
            DataColumn totalSalesColumn = new DataColumn();
            totalSalesColumn.DataType = Type.GetType("System.Int32");
            totalSalesColumn.ColumnName = "TotalSales";

            // Add the columns to the ProductSalesData DataTable
            prodSalesData.Columns.Add(dateColumn);
            prodSalesData.Columns.Add(productNameColumn);
            prodSalesData.Columns.Add(totalSalesColumn);

            // Let's populate the datatable with our stats.
            // You can add as many rows as you want here!

            // Create a new row
            DataRow dailyProductSalesRow = prodSalesData.NewRow();
            dailyProductSalesRow["SaleDate"] = DateTime.Now.Date;
            dailyProductSalesRow["ProductName"] = "Nike";
            dailyProductSalesRow["TotalSales"] = 10;

            // Add the row to the ProductSalesData DataTable
            prodSalesData.Rows.Add(dailyProductSalesRow);

            // Copy the DataTable to SQL Server using SqlBulkCopy
            using (SqlConnection dbConnection = new SqlConnection("Data Source=ProductHost;Initial Catalog=dbProduct;Integrated Security=SSPI;Connection Timeout=60;Min Pool Size=2;Max Pool Size=20;"))
            {
                dbConnection.Open();
                using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
                {
                    s.DestinationTableName = prodSalesData.TableName;
                    foreach (DataColumn column in prodSalesData.Columns)
                        s.ColumnMappings.Add(column.ColumnName, column.ColumnName);
                    s.WriteToServer(prodSalesData);
                }
            }
        }
    }
}
Answer 8:
Because the other answers do not use the SQL Server import wizard, the steps would be as follows:
- Right click on the database and, under Tasks, select Import Data.
- Once the wizard is open, select the type of data source to import. In this case it would be the Flat File Source.
- Select the CSV file; you can configure the data types of the columns here, but it is best to take them from the CSV itself.
- Click Next and, for the destination, select the SQL client.
- Select your type of authentication; once this is done, a very important option comes up.
- You can define the id of the table from the CSV (it is recommended that the columns of the CSV be named the same as the fields in the table). In the Edit Mappings option you can preview each table against the columns of the spreadsheet. If you want the wizard to insert the id by default, leave the Enable identity insert option unchecked (the generated ids usually do not start from 1); if instead the CSV has a column with the id, check Enable identity insert. The next step is to finish the wizard, where you can review the changes.
- In the following window, alerts or warnings may appear; it is usually fine to ignore these, and only if an error is shown do you need to pay attention.
This link has images.