I am trying to learn BULK INSERT with a format file, using a simple .txt import file and a simple target SQL table. I keep getting a collation error and cannot figure out what I am doing wrong to cause this.
SQL Table
Activity_C Int
Act_Date SmallDateTime
Act_Unique Char(6)
Format File:
8.0
3
1 SQLCHAR 0 1 "," 1 Activity_C ""
2 SQLCHAR 0 10 "," 2 Act_Date ""
3 SQLCHAR 0 6 "\r\n" 3 Act_Unique "SQL_Latin1_General_CP1_CI_AS"
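For reference, the load itself is kicked off with a BULK INSERT along these lines (the database name, table name, and paths here are placeholders, not my real ones):
BULK INSERT MyDB.dbo.Activity -- MyDB and Activity are placeholder names
FROM 'C:\import\activity.txt' -- the data file shown below
WITH (FORMATFILE = 'C:\import\activity.fmt')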
Data Example:
2,1900/01/01,1
2,1900/01/01,2
2,1900/01/01,3|||On the surface everything looks okay. What is the collation order for your server and table? Also, can you post your BCP command?
When I need to set up BCP format files I start by BCPing data out and letting BCP create the format file for me, then make any modifications needed.|||Thanks for the reply.
The Collation for the database is SQL_Latin1_General_CP1_CI_AS
The Collation for the table is the same.
I have looked at both of these 1000 times.
I have tried building a format file using BCP, doing both BCP...OUT and BCP...IN. As I begin to modify it I get intermittent good results. My biggest problem is that I usually end up with the collation error.
I have searched on BULK INSERT and collation errors to the end of the Internet. I am quite sure I was looking over the edge as if it were the Grand Canyon, all to no avail. I see a lot of questions being asked about this problem but no real concrete answers.
I do know that several different problems can give the same error, which is just wrong on MS's part. Misspelled SQL target field names...come on.
I have 75 more tables to go, and the table I am mentioning has more fields. I am about to conclude that BULK COPY and BCP are not cooked all the way in SQL Server.|||I would look no further than Books Online. There is an excellent discussion of this topic.
Yup, Microsoft is notorious for poor error messages, buggy code, and more security holes than you can imagine.
BCP works very well and I have found it to be quite stable. Having said that, I have also found that when you go past a simple BCP OUT/IN there is little good documentation on how to do things. Again, the best I have found is Books Online.
I reproduced your error and found the problem.
First I created the table and populated it with data...
create table rond9910(Activity_C int, Act_Date smalldatetime, Act_Unique char(6))
insert into rond9910 values(2,'1900/01/01','1')
insert into rond9910 values(2,'1900/01/01','2')
insert into rond9910 values(2,'1900/01/01','3')
Next I extracted a bcp format file using default info and made some modifications to it...
bcp xxxxxxx.dbo.rond9910 format rond9910.bcp -f rond9910.fmt -S xxxxxxx -T
producing the following format (.fmt) file
8.0
3
1 SQLINT 1 4 "" 1 Activity_C ""
2 SQLDATETIM4 1 4 "" 2 Act_Date ""
3 SQLCHAR 0 6 "" 3 Act_Unique SQL_Latin1_General_CP1_CI_AS
which I changed to
8.0
3
1 SQLCHAR 0 25 "," 1 Activity_C ""
2 SQLCHAR 0 25 "," 2 Act_Date ""
3 SQLCHAR 0 6 "\r\n" 3 Act_Unique SQL_Latin1_General_CP1_CI_AS
Notice that when specifying the collation you omit the quotes! When I used your .fmt file I got the following error...
Starting copy...
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC SQL Server Driver][SQL Server]Collation name ["SQL_Latin1_General_CP1_CI_AS"] could not be resolved for column [Act_Unique] by target server
SQLState = S1000, NativeError = 0
Error = [Microsoft][ODBC SQL Server Driver]Unable to resolve column level collations
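By the way, rather than hand-editing a native format file, you can ask bcp for a character-mode one up front. With -c every field comes out as SQLCHAR and the collation is already unquoted, so the only edits left are the terminators (same placeholder server and database names as above):
bcp xxxxxxx.dbo.rond9910 format nul -c -f rond9910.fmt -S xxxxxxx -T
(nul is the usual dummy data-file argument when you only want the format file generated.)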
hope this helps|||Could you please e-mail me your text data file? I want to run it with the format file you showed in your previous post, and also look at it with a hex editor to see if there are any differences from my text file.
I am exporting VFP data from within VFP, and I now suspect that my problem is row terminators. I want to compare.
E-mail: rond@ixsys.com
Thanks again|||A HEX dump of the data file shows:
322C3139 30302D30 312D3031 2C310D0A 2,1900-01-01,1
322C3139 30302D30 312D3031 2C320D0A 2,1900-01-01,2
322C3139 30302D30 312D3031 2C330D0A 2,1900-01-01,3
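So each row ends in 0D 0A, which is the \r\n pair the format file asks for; that part checks out. If a file ever turns up with different terminators, one way to cope, assuming you don't need the format file at all, is to set them directly on BULK INSERT (a sketch; database and path are placeholders):
BULK INSERT MyDB.dbo.rond9910 -- MyDB is a placeholder
FROM 'C:\import\rond9910.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n') -- on import, '\n' is treated as the CRLF pair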
Row terminators can be a bugger to work out. The good thing is that the format file can help you sort this out!|||As we say here in Texas, "No Wano".
I get the same error with your format file and your text file, as well as with your format file and my text file.
FYI: I just reinstalled SP2 for SQL2K on both the server and my workstation (client).
==========================
I am so tired of this approach...I have been working on it for several days, and if it is this hard (several days of attempts) then something is broke. Let me change directions here and ask you this question. A little preface: I am migrating a VFP -> VFP tables app to a VFP -> SQL tables app.
The BULK INSERT problems I am dealing with are due to the fact that I am first writing the data migration tool, which is basically an app that we will run on the weekend prior to going live on Monday with the conversion. In other words, we run this app and it creates the SQL DB, moves data, creates indexes, defaults, etc., and during all of this takes care of all of the data problems that are there: right-aligned or left-space-filled key fields, tables named with SQL reserved words, fields named with SQL reserved words, dates as text in some tables, etc. We will also use this tool to build development and staging servers to do further development with.
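For what it's worth, the individual fixes are mostly one-liners in T-SQL; a few sketches with made-up table and column names:
UPDATE dbo.Customer SET Cust_Key = LTRIM(Cust_Key) -- strip the left space fill on key fields
SELECT * FROM dbo.[Order] -- bracket-quote a table named with a reserved word
INSERT INTO dbo.Hist (Act_Date)
SELECT CONVERT(smalldatetime, Act_Date_Txt, 111) FROM dbo.Hist_Staging -- dates stored as yyyy/mm/dd text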
The point is the DB, tables, data, etc. are fresh to this SQL install at the time we run. The server is dedicated to this app only, so.....
With two 36-gig drives on this server, a transaction log capable of growing to 36 gig (size of the drive + SQL 2000) and only 4 gig of data to move...why don't I just do the table creation work in VFP, then scan the Fox tables and do inserts to the server, or even an APPEND into a view tied to the associated SQL table? Then do the indexing and other SQL work that needs to be done for production. I already have the table creation and indexing part working; I am working on the part that actually moves the data.
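The statement the VFP scan loop would send per row is nothing fancy, just a plain insert like the ones in the repro earlier in this thread (using the sample table's names):
INSERT INTO dbo.rond9910 (Activity_C, Act_Date, Act_Unique) VALUES (2, '1900/01/01', '1')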
That is not much different from exporting to a server file: roughly equivalent to a SQL log write, then asking SQL to move that text into a SQL table.
After all the work is done, shrink the log file back down.
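A sketch of that shrink step on SQL 2000, assuming a logical log file name (check yours with sp_helpfile first):
BACKUP LOG MyDB WITH TRUNCATE_ONLY -- throw the log away; fine here since the DB is rebuilt from scratch anyway
DBCC SHRINKFILE (MyDB_Log, 100) -- MyDB_Log is a placeholder logical name; target size in MB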
I am not really worried about the time frame being longer and I suspect it will only be an hour or two longer at the most.
========================
Is there something I am missing here...|||I would work on getting things from VFP to your production SQL Server and consider everything throwaway. Moving data from one SQL Server DB to another is so simple compared to what you are doing.
IMHO, since you have code to create all the tables, I would get that done. Once the tables are created I would then work on getting the data moved. Use BCP, DTS or VFP, whatever you are most comfortable working with. I once had to migrate a FoxPro database to ADABAS. I was not allowed to create the tables, so I gave all the info to our mainframe DBAs. Once the tables were created I used FoxPro to push everything. Worked very nicely. Once you move the data you can run the rest of your app to create indexes and referential integrity.
BTW, DTS (Import and Export Data on your Start menu) has a driver for VFP and the ability to change data on the fly. You can also use DTS to read CSV files! You might take 15 minutes and look it over.