Thursday, March 8, 2012

BULK INSERT flat file with only one column

Hi,

I have a text file with a single column that I need to bulk insert into a table with 2 columns - an ID (with identity turned on) and col2.

My text file looks like:

row1
row2
row3
...
row10

So my bulk insert looks like this:
BULK INSERT test FROM 'd:\testBig.txt'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)

But I get this error:

Server: Msg 4866, Level 17, State 66, Line 1
Bulk Insert fails. Column is too long in the data file for row 1, column 1. Make sure the field terminator and row terminator are specified correctly.

However, as you can see from the text file, there is only one column, so I don't have any field terminators.

Any ideas how to make this work?

Thanks.

|||

You'll probably need to use a format file to be able to skip columns.

|||

I don't think I want to skip any columns?

The above method works if I have, say, 2 columns in my text file and 3 columns in my table, since the first column in the table is an identity column.

My problem is that the bulk insert doesn't seem to be able to handle a text file with just one field per row. It seems to be looking for a field delimiter, when in fact there is only one field followed by the row delimiter.

|||

Create a table with one column which is used to store the data from your text file and then do the BULK INSERT.
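For example, the one-column table could be created like this (a rough sketch; the column name and length are assumptions the thread doesn't give, and it is named test only to match the code below - a separate staging table would work just as well):

CREATE TABLE test (col2 VARCHAR(100))

With that in place, the BULK INSERT below needs no field terminator at all: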

BULK INSERT test FROM 'd:\testBig.txt'
WITH (
DATAFILETYPE = 'char',
--FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n'
)

At last, you can add a column for the ID.

|||

Of course you do: you want to skip columns in the target table by not populating them (I am talking about your identity field). With "FORMATFILE = 'your_format_file'" you can define exactly what you want or don't want to populate.
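If you go the format-file route, here is a rough sketch of what a non-XML format file could look like for this case. It is an assumption-heavy example, not taken from the thread: the version number 8.0, the field length 100, the "\n" terminator (use "\r\n" for Windows line endings) and the collation all need to be adjusted to your server and data file. The single data field is mapped to table column 2, so the identity column 1 is left for SQL Server to populate:

8.0
1
1   SQLCHAR   0   100   "\n"   2   col2   SQL_Latin1_General_CP1_CI_AS

Save that as something like d:\testBig.fmt (the path is just an example) and point BULK INSERT at it:

BULK INSERT test FROM 'd:\testBig.txt'
WITH (FORMATFILE = 'd:\testBig.fmt')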
