Thursday, March 8, 2012

BULK Insert Errors

I'm having a little trouble with a Bulk Insert. This is my first time
working with this, so I'm not sure what I'm missing here. I've read
everything I can find about Bulk Insert in BOL and from querying the
archive, but I'm not finding the answer.
This is the error I'm getting when attempting to Bulk Insert the data into a
table in a local MSDE DB named GACT:
Server: Msg 4865, Level 16, State 1, Line 1
Could not bulk insert because the maximum number of errors (10) was
exceeded.
Server: Msg 7399, Level 16, State 1, Line 1
OLE DB provider 'STREAM' reported an error. The provider did not give any
information about the error.
OLE DB error trace [OLE/DB Provider 'STREAM' IRowset::GetNextRows returned
0x80004005: The provider did not give any information about the error.].
The statement has been terminated.
This is the Table def:
CREATE TABLE [GACT2] (
[id] [uniqueidentifier] NOT NULL ,
[Prefix] [varchar] (3) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[portcode] [varchar] (6) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[TranType] [varchar] (10) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[Buy_Sell] [varchar] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[SourceCode] [varchar] (4) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[CUSIP] [varchar] (12) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[ProcDate] [varchar] (6) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[TradeDate] [varchar] (6) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[SettlementDate] [varchar] (6) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[TranAmt] [varchar] (20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[TranAmtSign] [varchar] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[Quantity] [varchar] (20) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[QuantitySign] [varchar] (1) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
[Description] [varchar] (255) COLLATE SQL_Latin1_General_CP1_CI_AS NULL
) ON [PRIMARY]
GO
These are a couple of lines from the Import file:
14|3J6|897000|117236710|B|||20070103|20070103|20070104|000000000080000000|+|000000000044800000|+|CITIGROUP
INC COM SOLICITED ORDER OTC OR NASDAQ EXCH DETAILS ON REQUEST
AVERAGE UNIT PRICE TRANSACTION
15|3J6|897000|117236710|B|||20070103|20070103|20070104|000000000080000000|+|000000000044800000|+|CITIGROUP
INC COM SOLICITED ORDER OTC OR NASDAQ EXCH DETAILS ON REQUEST
AVERAGE UNIT PRICE TRANSACTION
And this is my Bulk Insert Query:
Delete from GACT2
GO
BULK INSERT GACT.dbo.GACT2
FROM 'E:\Import GACT\Files\Exports\NEW\GACT.DML'
WITH
(
FIELDTERMINATOR = '|',
ROWTERMINATOR = '\n',
KEEPNULLS,
KEEPIDENTITY
)
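The default error cap is what produces Msg 4865 above: BULK INSERT gives up once it has counted 10 bad rows. A diagnostic variant of the statement can raise that cap and load only a slice of the file to locate the failing rows (the MAXERRORS and FIRSTROW/LASTROW values below are arbitrary examples, not part of the original query):

```sql
-- Diagnostic sketch: raise the error cap and load a small slice of
-- the file, so individual bad rows can be located instead of the
-- whole load aborting at the default limit of 10 errors.
BULK INSERT GACT.dbo.GACT2
FROM 'E:\Import GACT\Files\Exports\NEW\GACT.DML'
WITH
(
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\n',
    MAXERRORS = 1000,  -- default is 10, which triggers Msg 4865
    FIRSTROW = 1,
    LASTROW = 100,     -- narrow this range to home in on bad rows
    KEEPNULLS
)
```

Later versions of SQL Server add an ERRORFILE option for this, but on MSDE (SQL Server 2000) narrowing FIRSTROW/LASTROW is the practical way to find which rows fail.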
TIA
Matt
Your problem is with the first field - you are trying to push what appears
to be an int value (14) into a unique identifier column. So your data is
incompatible with the table. You have to get valid data or change your
schema or pipe it to a holding table.
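The holding-table route might be sketched like this (the staging table name and the all-varchar widths are assumptions, not part of the original schema):

```sql
-- Staging table mirrors the file: one wide varchar column per
-- pipe-delimited field, including the leading row number, so the
-- bulk load itself cannot hit a conversion error.
CREATE TABLE GACT2_Staging (
    [RowNum]         varchar(255) NULL,  -- the 14/15 field in the file
    [Prefix]         varchar(255) NULL,
    [portcode]       varchar(255) NULL,
    [TranType]       varchar(255) NULL,
    [Buy_Sell]       varchar(255) NULL,
    [SourceCode]     varchar(255) NULL,
    [CUSIP]          varchar(255) NULL,
    [ProcDate]       varchar(255) NULL,
    [TradeDate]      varchar(255) NULL,
    [SettlementDate] varchar(255) NULL,
    [TranAmt]        varchar(255) NULL,
    [TranAmtSign]    varchar(255) NULL,
    [Quantity]       varchar(255) NULL,
    [QuantitySign]   varchar(255) NULL,
    [Description]    varchar(255) NULL
)
GO
BULK INSERT GACT.dbo.GACT2_Staging
FROM 'E:\Import GACT\Files\Exports\NEW\GACT.DML'
WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n')
GO
-- Copy across, generating the uniqueidentifier on the way in.
INSERT INTO GACT2 (id, Prefix, portcode, TranType, Buy_Sell, SourceCode,
                   CUSIP, ProcDate, TradeDate, SettlementDate, TranAmt,
                   TranAmtSign, Quantity, QuantitySign, [Description])
SELECT NEWID(), Prefix, portcode, TranType, Buy_Sell, SourceCode,
       CUSIP, ProcDate, TradeDate, SettlementDate, TranAmt,
       TranAmtSign, Quantity, QuantitySign, [Description]
FROM GACT2_Staging
```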
Hilary Cotter
Looking for a SQL Server replication book?
http://www.nwsu.com/0974973602.html
Looking for a FAQ on Indexing Services/SQL FTS
http://www.indexserverfaq.com
"Matt Williamson" <ih8spam@.spamsux.org> wrote in message
news:OlnRNS0MHHA.3268@.TK2MSFTNGP04.phx.gbl...
> [original message snipped]
|||> Your problem is with the first field - you are trying to push what appears
> to be an int value (14) into a unique identifier column. So your data is
> incompatible with the table. You have to get valid data or change your
> schema or pipe it to a holding table.
That's not an issue. I'm creating the output file myself, so I can remove
that field if I need to. What would be the best way to have a unique ID
column? Do I need to leave it null in my import file and let SQL generate
it? Or should I just change the table def for that column to INT(14)
instead?
TIA
Matt
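On the unique-ID question: SQL Server's int type takes no length, so there is no INT(14); the usual choices are an auto-generated uniqueidentifier or an IDENTITY integer. A sketch of both (the constraint name is made up for illustration):

```sql
-- Option 1: keep the uniqueidentifier column and let SQL Server
-- generate the value.
ALTER TABLE GACT2
    ADD CONSTRAINT DF_GACT2_id DEFAULT NEWID() FOR [id]
GO
-- Caveat: for the default to fire during BULK INSERT, the id field
-- must not be supplied by the file (dropped from the layout, or
-- skipped via a format file), and KEEPNULLS must be removed --
-- KEEPNULLS keeps nulls instead of applying column defaults.

-- Option 2: an auto-numbering integer instead (int takes no length
-- in T-SQL):
-- [id] int IDENTITY(1,1) NOT NULL
```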
|||>> Your problem is with the first field - you are trying to push what
> [previous reply snipped]
I did some testing and ended up changing that field in my table to INT and
now I get past that error. Now I'm getting a new error that says
Server: Msg 4863, Level 16, State 1, Line 1
Bulk insert data conversion error (truncation) for row 1, column 15
(Description).
I have the description field defined as [Description] [varchar] (240)
COLLATE SQL_Latin1_General_CP1_CI_AS NULL
The description field in my file is always 240 chars. It CAN'T be any bigger
because that's the number of bytes I'm extracting for that field and writing
to the file. I don't get it. Changing it to [TEXT] helps, but it only
imports up to that field for one row and stops.
|||I found it odd that you're using \n alone as a rowterminator. It *looks* to
me like your data may naturally contain carriage returns and/or line feeds,
unless that was a bad copy & paste, so I'd make sure that you use a proper
delimiter that can't appear in your data.
A
"Matt Williamson" <ih8spam@.spamsux.org> wrote in message
news:OY3ovA2MHHA.992@.TK2MSFTNGP04.phx.gbl...
> [previous message snipped]
|||>I found it odd that you're using \n alone as a rowterminator. It *looks*
>to me like your data may naturally contain carriage returns and/or line
>feeds, unless that was a bad copy & paste, so I'd make sure that you use a
>proper delimiter that can't appear in your data.
The lines are pretty long, so the copy of it wrapped. Each row of data is on
one line in my file and is terminated with 0D 0A Hex. Is that \n or \r\n?
I've tried both. I also tried adding just a ! at the end of the row and
using "|!" as the terminator, but it didn't change anything either.
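For reference, hex 0D 0A is carriage return plus line feed, i.e. \r\n, so the explicit form of the statement would be (same query as above, only the terminator changed):

```sql
-- 0x0D0A = \r\n. Naming it explicitly avoids a stray \r being read
-- into the last column, which typically surfaces as a truncation
-- error on that column (as with Description above).
BULK INSERT GACT.dbo.GACT2
FROM 'E:\Import GACT\Files\Exports\NEW\GACT.DML'
WITH
(
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '\r\n'
)
```

As the next post shows, the real culprit here turned out to be a stale file path, but a row terminator matching the file's actual line endings is still the safe choice.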
|||> >I found it odd that you're using \n alone as a rowterminator. It *looks*
> [previous reply snipped]
Never mind. It was a standard 1d10t error: I had changed the path where I was
generating the output file in my code, but forgot to change the Bulk Insert
query in QA to match. It works perfectly now.
Thanks to everyone who posted.
