Howdy, all. We have a (log-type) file that's constantly being written
to. We need to grab the latest rows out of this file, insert them into
a table, and then process them. We've found out the hard way that BCP
locks the file while it's inserting, so rows can't be written to the
file while the BCP is running.
Our current workaround is to make a copy of the file (using ftp), but
we're running into other problems.
I'm trying to find a better way to do this. We've narrowed down a
couple.
1) use TAIL or something similar to grab the records since we last ran.
2) use BULK INSERT, provided it doesn't lock the file.
3) get suggestions off of USENET.
Any suggestions or comments?
Thanks,
Michael|||Why not BCP into a "different" table and then run a stored proc to process
the records into the main table from the one you BCP'd into?
1) Step 2, where any locks would be created on the main table, will be
faster.
2) If there's still a concurrency issue, you can rewrite the Step 2 code to
process the records from the staging table into the main table in smaller
batches, reducing the batch size until the concurrency issue goes away.
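The batched hand-off in step 2 could be sketched roughly like this (Python pseudologic rather than T-SQL; `insert_batch` and the exception-based contention signal are illustrative stand-ins for whatever stored-proc call actually moves a batch):

```python
def process_in_batches(staging_rows, insert_batch, batch_size=1000, min_batch=50):
    """Move rows from a staging table into the main table in small batches.

    insert_batch is whatever actually writes one batch (e.g. a stored-proc
    call). If it fails because of a concurrency conflict, halve the batch
    size and retry, so locks on the main table stay short-lived.
    """
    i = 0
    while i < len(staging_rows):
        batch = staging_rows[i:i + batch_size]
        try:
            insert_batch(batch)
            i += len(batch)  # advance only on success
        except RuntimeError:
            # contention: shrink the batch and try the same rows again
            if batch_size <= min_batch:
                raise
            batch_size = max(min_batch, batch_size // 2)
```

The point of the shrinking batch size is that each successful call holds its locks for a shorter time, so the main table stays available to other readers and writers.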
"M Bourgon" wrote:
> [original message quoted in full; snipped]
>|||"M Bourgon" <bourgon@.gmail.com> wrote in message
news:1110928274.048339.317490@.f14g2000cwb.googlegroups.com...
> [original message quoted in full; snipped]
Would a half-baked suggestion do?
Write an ActiveX script to open the file shared, read-only,
find your starting point, and loop to the end, doing either
1. a write-out of the lines as-is to a scratch file that you can then
bulk copy in, or
2. a parse of the log-file lines and a direct update of the DB.|||The problem is that the file's still locked during that time, and I
can't afford to lose any rows.|||Not a bad idea, though I may just use tail for simplicity, assuming it
doesn't lock the file as well. Thanks|||Answering my own question - yes, BULK INSERT DOES lock the file -
couldn't append to the file; got "The process cannot access the file
because it is being used by another process." Looks like I'll
tail/etc. the file. Thanks, guys.
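For anyone wanting to do the "tail since we last ran" part without a separate tail utility, here's a minimal sketch of the idea: remember the byte offset you stopped at, open the log for plain read access (which doesn't take an exclusive lock, so the writer can keep appending), and pick up only complete new lines. The file names and the state-file scheme are just illustrative.

```python
import os

def read_new_lines(log_path, state_path):
    """Return complete lines appended to log_path since the last run.

    The last-read byte offset is persisted in state_path, so each run
    resumes where the previous one stopped - a user-space "tail since
    last run" that never locks the log for writing.
    """
    offset = 0
    if os.path.exists(state_path):
        with open(state_path) as f:
            offset = int(f.read().strip() or 0)
    # If the log was rotated or truncated, start over from the top.
    if offset > os.path.getsize(log_path):
        offset = 0

    lines = []
    with open(log_path, "rb") as f:  # plain read: no exclusive lock taken
        f.seek(offset)
        for raw in f:
            if not raw.endswith(b"\n"):
                break  # partial line still being written; leave for next run
            lines.append(raw.decode("utf-8").rstrip("\n"))
            offset += len(raw)

    with open(state_path, "w") as f:
        f.write(str(offset))
    return lines
```

The skip-the-partial-last-line check matters with a log that's being appended to continuously: you only ever hand finished rows to the BCP/BULK INSERT step, and the half-written one is picked up on the next run.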
Saturday, February 25, 2012