Hi,
For testing purposes I'm inserting a flat file into a SQL Server table using BULK INSERT with the following code:
BULK INSERT rsk_staging
FROM 'c:\temp\bulk\rsk.txt'
WITH (
    FIELDTERMINATOR = '\n',
    ROWTERMINATOR = '\r\n',
    CODEPAGE = 'RAW',
    DATAFILETYPE = 'char',
    BATCHSIZE = 100000,
    ROWS_PER_BATCH = 1925604,
    TABLOCK
)
I have two versions of "rsk.txt": one with 1.9 million rows and one with only the first 2,000 rows. The files have a single column of 115 characters that I'll split into several columns later using SUBSTRING. The 2,000-row file fires into the database with no problems whatsoever using this exact code; the other one throws the following error:
Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.
How can I resolve this problem?
EDIT: I tried several different row and field terminators, but this exact combination works for the small data file, so I assume it should also work for the large one. The large file, however, was copied directly from a Unix file system using binary FTP, while the small one was manually copied into a new .txt file using UltraEdit.
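In hindsight, that difference is probably the whole story: a binary transfer keeps the Unix LF-only line endings, so a ROWTERMINATOR of '\r\n' never matches and BULK INSERT reads the whole file as one enormous first row, which would explain Msg 4866. If converting the file is not an option, it should also be possible to point BULK INSERT at the bare LF instead. A sketch, assuming a SQL Server version that accepts hexadecimal terminator notation (recent versions do), with everything else as above:

BULK INSERT rsk_staging
FROM 'c:\temp\bulk\rsk.txt'
WITH (
    ROWTERMINATOR = '0x0a',      -- a bare LF, as left behind by binary FTP from Unix
    CODEPAGE = 'RAW',
    DATAFILETYPE = 'char',
    BATCHSIZE = 100000,
    ROWS_PER_BATCH = 1925604,
    TABLOCK
)

Note that ROWTERMINATOR = '\n' would not do the trick here: the documentation says that on import BULK INSERT treats '\n' as CRLF.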
Problem solved! When I used ASCII FTP instead of binary, everything worked: ASCII mode converts the Unix line endings to the CRLF that my ROWTERMINATOR expects. 1.9 million records from a flat file on a network share into my local db, WITH the SUBSTRING conversion, in 2 minutes. Needless to say, I'm a happy camper at the moment.
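For reference, the SUBSTRING conversion is along these lines. This is just a sketch: the target table rsk, the staging column name line, and the offsets are made up for illustration; the only thing taken from the real layout is the 115-character row width.

INSERT INTO rsk (field1, field2, field3)
SELECT
    SUBSTRING(line, 1, 10),      -- positions 1-10
    SUBSTRING(line, 11, 25),     -- positions 11-35
    SUBSTRING(line, 36, 80)      -- positions 36-115
FROM rsk_staging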