I am trying to import data into Azure. I created a text file in Management Studio 2005 and have tried both comma-delimited and tab-delimited versions of the file.
When I run bcp in with -c -t, -r\n (plus -S, -U, and -P for the connection; values omitted here), I get the error: [SQL Server Native Client 11.0] Unexpected EOF encountered in BCP data file
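Spelled out with placeholders for the values I can't post (following the same <...> convention as the script below), the full command has this shape:
bcp <database>.WS.ARFinancialRecord in <datafile>.txt -c -t, -r\n -S <server> -U <user> -P <password>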
Here is the script I used to create the file:
SELECT top 10 [Id]
,[RecordId]
,[PracticeId]
,[MonthEndId]
,ISNULL(CAST(InvoiceItemId AS VARCHAR(50)),'') AS InvoiceItemId
,[Date]
,[Number]
,[RecordTypeId]
,[LedgerTypeId]
,[TargetLedgerTypeId]
,ISNULL(CAST([Tax1Id] AS VARCHAR(50)),'') AS Tax1Id
,[Tax1Exempt]
,[Tax1Total]
,[Tax1Exemption]
,ISNULL(CAST([Tax2Id] AS VARCHAR(50)),'') AS Tax2Id
,[Tax2Exempt]
,[Tax2Total]
,[Tax2Exemption]
,[TotalTaxable]
,[TotalTax]
,[TotalWithTax]
,[Unassigned]
,ISNULL(CAST([ReversingTypeId] AS VARCHAR(50)),'') AS ReversingTypeId
,[IncludeAccrualDoctor]
,12 AS InstanceId
FROM <table>
Here is the table the data is inserted into:
CREATE TABLE [WS].[ARFinancialRecord](
[Id] [uniqueidentifier] NOT NULL,
[RecordId] [uniqueidentifier] NOT NULL,
[PracticeId] [uniqueidentifier] NOT NULL,
[MonthEndId] [uniqueidentifier] NOT NULL,
[InvoiceItemId] [uniqueidentifier] NULL,
[Date] [smalldatetime] NOT NULL,
[Number] [varchar](17) NOT NULL,
[RecordTypeId] [tinyint] NOT NULL,
[LedgerTypeId] [tinyint] NOT NULL,
[TargetLedgerTypeId] [tinyint] NOT NULL,
[Tax1Id] [uniqueidentifier] NULL,
[Tax1Exempt] [bit] NOT NULL,
[Tax1Total] [decimal](30, 8) NOT NULL,
[Tax1Exemption] [decimal](30, 8) NOT NULL,
[Tax2Id] [uniqueidentifier] NULL,
[Tax2Exempt] [bit] NOT NULL,
[Tax2Total] [decimal](30, 8) NOT NULL,
[Tax2Exemption] [decimal](30, 8) NOT NULL,
[TotalTaxable] [decimal](30, 8) NOT NULL,
[TotalTax] [decimal](30, 8) NOT NULL,
[TotalWithTax] [decimal](30, 8) NOT NULL,
[Unassigned] [decimal](30, 8) NOT NULL,
[ReversingTypeId] [tinyint] NULL,
[IncludeAccrualDoctor] [bit] NOT NULL,
[InstanceId] [tinyint] NOT NULL,
CONSTRAINT [PK_ARFinancialRecord] PRIMARY KEY CLUSTERED
(
[Id] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON)
)
There are actually several hundred thousand records, and I have done this successfully from a different server; the only difference is the version of Management Studio.
I think most of us prefer real-world examples to syntax hints, so here's what I did:
bcp LoadDB.dbo.test in C:\temp\test.txt -S 123.66.108.207 -U testuser -P testpass -c -r 0x0a
My data was an extract from a Unix-based Oracle DB that was tab delimited and had an LF end-of-line character.
Because my data was tab delimited, I did not specify a -t parameter; the bcp default is tab.
Because my row terminator was a linefeed (LF) character, I used -r 0x0a; bcp treats \n as CRLF on import, so a bare LF has to be given in hexadecimal notation.
Because my data was all being loaded into char fields, I used the -c parameter.
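For completeness, the same command with both terminators spelled out explicitly (connection details are the same dummy values as above) would be:
bcp LoadDB.dbo.test in C:\temp\test.txt -S 123.66.108.207 -U testuser -P testpass -c -t\t -r 0x0a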
I will share my experience with this issue. My users were sending me files in UTF-8 encoding and everything was working fine. My load started to fail when they changed the encoding to UCS-2 LE BOM; use Notepad++ to check this setting (the Encoding menu).
Reverting to UTF-8 fixed my problem.
This link helped me resolve my issue.
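If the file has to stay UTF-16 (which is effectively what UCS-2 LE with a BOM is), another option worth trying is bcp's wide-character mode, -w, in place of -c; every value here is a placeholder:
bcp <database>.<schema>.<table> in <datafile>.txt -S <server> -U <user> -P <password> -w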
In every case that I have encountered this error, it has turned out that the number of columns in the table does not match the number of delimited columns in the text file. The easy way to confirm this is to load the text file into Excel and compare its column count to that of the table.
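The table-side count can also be read straight from the catalog views instead of counting by hand; the table name here is the one from the question:
SELECT COUNT(*) AS ColumnCount
FROM sys.columns
WHERE [object_id] = OBJECT_ID('WS.ARFinancialRecord');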
Open the CSV file in Excel and "Save As" a new CSV file; re-saving can normalize stray delimiters and line endings.
Just an FYI: I encountered this same exact error, and it turned out that my destination table contained one more column than the DAT file!
If the file is tab-delimited, then the command-line flag for the column separator should be -t\t rather than -t,.
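In context, with the same placeholder values as earlier, that gives:
bcp <database>.WS.ARFinancialRecord in <datafile>.txt -c -t\t -r\n -S <server> -U <user> -P <password>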