I have a bulk insert inside a try-catch block:
BEGIN TRY
    BULK INSERT dbo.EQUIP_STATUS_CODE
    FROM 'filepath\filename.csv'
    WITH (MAXERRORS = 1, FIELDTERMINATOR = ',');
END TRY
BEGIN CATCH
    EXECUTE dbo.ERROR_LOG_CSV;
END CATCH
I would like to be able to capture the following error when it occurs:
Bulk load data conversion error (truncation)
But it seems that I can't, even though the severity level is 16, which falls within the range that TRY...CATCH handles. I was wondering if there is a way to capture this error when it occurs.
Before I set MAXERRORS to 1, I got this error:
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
Since the former error is much more descriptive of the problem, that is the one I'd like to record.
Though my expertise is more Oracle than SQL Server, I'll try to help with this issue. I found that your situation is already in the SQL Server bug tracker (bug id: 592960) with the status "Won't fix" since 2010. You can read the corresponding discussion on connect.microsoft.com yourself (at the moment the host is unreachable, so I used the Google cache).
The Connect item is titled "Q: TRY doesn't CATCH error in BULK INSERT", and the workaround discussed there will probably catch this error, because it does catch error Msg 4860.
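As a rough sketch of that kind of workaround (not something I have verified on every SQL Server version): run the BULK INSERT through dynamic SQL so the conversion error is raised in a child batch, where the outer TRY...CATCH can see it. The table and file names below are simply the placeholders from the question:

BEGIN TRY
    -- Run the bulk load in a child batch via dynamic SQL; errors raised there
    -- are reported back to this scope, where the CATCH block can react to them.
    DECLARE @sql NVARCHAR(MAX) = N'
        BULK INSERT dbo.EQUIP_STATUS_CODE
        FROM ''filepath\filename.csv''
        WITH (MAXERRORS = 1, FIELDTERMINATOR = '','');';

    EXEC sp_executesql @sql;
END TRY
BEGIN CATCH
    -- ERROR_MESSAGE() should now hold the conversion/truncation text,
    -- which can be logged before calling the existing procedure.
    PRINT ERROR_MESSAGE();
    EXECUTE dbo.ERROR_LOG_CSV;
END CATCH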
Alexander has given you the answer, but you have to read the bug log very carefully and consider what might be going on (SQL Server bug id: 592960).
You are trying to bulk insert directly from a data file into a table?
From the bug report, there is a mismatch in data types or truncation, and the SQL engine has a bug that keeps it from reporting this as a catchable error.
Quote from first person reporting the bug - "Inspite of the severity level being 16 I don't see the error being caught by TRY / CATCH construct. The code doesn't break and proceeds smoothly as if no error has occurred."
Have you investigated which fields may contain bad data?
Here are some suggestions.
1 - COMMA-DELIMITED FILES ARE PROBLEMATIC - I always dislike the comma-delimited format, since commas can appear in the data itself. Try a character such as the tilde (~) as the delimiter, which occurs far less often. Could the problem be that a text field has a comma in it, thus adding an extra field to the data stream?
2 - USE A STAGING TABLE - It is sometimes better to import the data from the file into a staging table whose columns are all defined as varchar(x). This allows the data to get into a table even when values are malformed (see the sketch after this list).
Then write a stored procedure to validate the data in the columns before transferring it to the production table, and mark any bad rows as suspect.
Insert the data from the staging table into production, leaving behind any bad rows.
Send an email so someone can look at the bad data. If this is a recurring data file transfer, you will want to fix it at the source.
3 - REWRITE THE PROCESS WITH AN ETL TOOL - Skip writing this stuff in the engine. SQL Server Integration Services (SSIS) is a great Extract, Transform, Load (ETL) tool.
There are options in the flat-file connection to state that text is quoted with "", which eliminates the extra-comma issue above. You can also send rows that fail to load into the production table to a hospital (error) table for review.
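To make suggestions 1 and 2 concrete, here is a minimal sketch of the staging-table approach with a tilde-delimited file. The staging table name, the column names (STATUS_CODE, STATUS_DESC), the 50-character production limit, and the file path are all assumptions for illustration; adjust them to the real schema:

-- Staging table: columns are generous varchars so the load itself never truncates.
CREATE TABLE dbo.EQUIP_STATUS_CODE_STAGE
(
    STATUS_CODE VARCHAR(255) NULL,
    STATUS_DESC VARCHAR(255) NULL
);

-- Load the tilde-delimited file into the staging table.
BULK INSERT dbo.EQUIP_STATUS_CODE_STAGE
FROM 'filepath\filename.csv'
WITH (FIELDTERMINATOR = '~', ROWTERMINATOR = '\n');

-- Move only rows that fit the production column (assumed to be 50 characters here);
-- anything longer stays behind in staging so someone can review it.
INSERT INTO dbo.EQUIP_STATUS_CODE (STATUS_CODE, STATUS_DESC)
SELECT STATUS_CODE, STATUS_DESC
FROM dbo.EQUIP_STATUS_CODE_STAGE
WHERE LEN(STATUS_DESC) <= 50;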
In summary, there is a bug in the engine.
However, I would definitely consider changing to a tilde-delimited file and/or using a staging table. Better yet, if you have the time, rewrite the process with an SSIS package!
Sincerely
J
PS: I am giving Alexander points since he did find the bug on Microsoft Connect. However, I think the format of the file is the root cause.