Hi guys,
I have a text file containing about 1 million rows. I tried to import it into a
table but received data conversion errors for some rows. The row number reported
is something like 504500, and I want to troubleshoot that row, but the only way I
know is to open the text file (300 MB), which takes forever in Notepad. Is there
an easier way to troubleshoot this kind of problem? One other thing: I set
BATCHSIZE=10000, so I assume that if I received errors for 20 rows, it means
200,000 rows (20 * 10000) didn't get inserted into the table. Is that correct?
Thanks

Kevin,
Get a copy of TextPad or one of the many shareware or freeware text editors
that can handle large files with ease.
Another alternative is to import the file into a one-column staging table on
your SQL Server, and then scrub the data before inserting it into the target
table.
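Something along these lines would also let you pull out just the suspect row without opening the file at all. This is only a sketch: the file path, terminators, and row numbers are placeholders you'd adjust to match your file, and FIRSTROW/LASTROW counting can be off by one if the file has a header row.

CREATE TABLE dbo.ImportStage
(
    RawLine varchar(8000) NULL
);

-- Load only the suspect row (plus a neighbor on each side) from the 300 MB file.
-- With a one-column target and a field terminator that never occurs in the data,
-- each whole line lands in RawLine and no conversion can fail at load time.
BULK INSERT dbo.ImportStage
FROM 'C:\import\bigfile.txt'        -- placeholder path
WITH
(
    FIRSTROW        = 504499,
    LASTROW         = 504501,
    ROWTERMINATOR   = '\n',
    FIELDTERMINATOR = '\0'          -- null terminator, so the line stays in one piece
);

SELECT RawLine
FROM dbo.ImportStage;

Once the raw lines are in the staging table you can eyeball them, or convert them column by column to see exactly which value breaks.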
Steve Kass
Drew University