Hi, I ran the query 'BULK INSERT myTable FROM myFile WITH (FIRE_TRIGGERS)'.
The problems are:
1. It does nothing if duplicate key records exist; there is no option to skip the duplicates. Is there any way to skip a duplicate row and continue with the next record during a bulk insert?
2. The triggers are not fired even when the bulk insert succeeds.
My goal is to load very large data at once and process it with a trigger.
The trigger now fires after the bulk insert executes, but some problems remain.
The result of the trigger reflects only the last record of the source data file. It seems the trigger definition has a logical or semantic problem. Any help is appreciated.
Here is my trigger definition.
Can I handle all of the inserted records in the trigger definition?
create table myTable
(
    id int,
    value1 int,
    value2 int,
    primary key(id)
);
create table myTempTable
(
    id int,
    value1 int,
    value2 int
);
CREATE TRIGGER mytrigger ON myTempTable
INSTEAD OF INSERT
AS
BEGIN
    PRINT 'trigger mytrigger'
    DECLARE @id AS int, @value1 AS int, @value2 AS int
    -- NOTE: with a multi-row insert this assigns from only one row of INSERTED
    SELECT @id = INSERTED.id,
           @value1 = INSERTED.value1,
           @value2 = INSERTED.value2
    FROM INSERTED
    PRINT @id
    PRINT @value1
    PRINT @value2
    IF EXISTS(SELECT * FROM myTable WHERE id = @id)
    BEGIN
        PRINT 'trigger EXISTS'
        PRINT 'UPDATE'
        UPDATE myTable
        SET value1 = value1 + @value1,
            value2 = value2 + @value2
        WHERE id = @id
    END
    ELSE
    BEGIN
        PRINT 'trigger NOT EXISTS'
        PRINT 'trigger INSERT'
        INSERT INTO myTable VALUES(@id, @value1, @value2)
    END
END
GO
BULK INSERT myTempTable FROM 'c:\bulk_myTable.dat' WITH (FIRE_TRIGGERS);
-- bulk_myTable.dat file's contents --
1 100 200
1 100 200
1 100 200
1 100 200
1 100 200
From Books Online, topic: "Using bcp and BULK INSERT"
Triggers are fired once for each batch in the bulk copy operation. The inserted table passed to the trigger contains all of the rows inserted by the batch. Specify FIRE_TRIGGERS only when bulk copying into a table with INSERT and INSTEAD OF triggers that support multiple row inserts.
Your trigger fires only once for the entire batch, not once for each row in the batch.
Your trigger assumes there is only one row in the inserted table, and is designed to handle only that case, so it will NOT do what you want. It could not possibly work for a bulk insert of more than one row.
You should redesign the trigger to handle multiple rows at a time. If you provide, in a new posting, the table DDL, some sample data in the form of INSERT statements, and a statement of what you are attempting to accomplish, perhaps someone here can help you create a trigger that will work for you. A set-based version is sketched below as a starting point.
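As an illustration, here is a sketch (based only on the myTable and myTempTable definitions you posted) of a set-based INSTEAD OF trigger that handles every row of the inserted table in one pass:

CREATE TRIGGER mytrigger ON myTempTable
INSTEAD OF INSERT
AS
BEGIN
    -- Add the batch totals to rows that already exist in myTable.
    -- Aggregate first so an id that appears several times in one
    -- bulk insert collapses to a single row.
    UPDATE t
    SET t.value1 = t.value1 + i.value1,
        t.value2 = t.value2 + i.value2
    FROM myTable t
    JOIN (SELECT id, SUM(value1) AS value1, SUM(value2) AS value2
          FROM inserted
          GROUP BY id) i ON t.id = i.id

    -- Insert ids that are not in myTable yet
    INSERT INTO myTable (id, value1, value2)
    SELECT id, SUM(value1), SUM(value2)
    FROM inserted
    WHERE id NOT IN (SELECT id FROM myTable)
    GROUP BY id
END
GO

Notice that this also answers your first question: because the batch is grouped by id, duplicate keys in the data file are combined instead of raising a primary key violation.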
Hopefully, this answered your question.
I solved problem 2 from my question in this thread.
I used an AFTER trigger instead of an INSTEAD OF trigger to get at the records produced by the bulk insert.
In the trigger definition, I use a cursor to get the recordset I want.
Finally, I insert or update the target table record by record.
I got the hint from your comment. Thank you very much.
Here is the sample.
CREATE TRIGGER TRG_MY_TABLE ON MyTempTable
AFTER INSERT
AS
BEGIN
    DECLARE @id AS INT,
            @value1 AS INT,
            @value2 AS INT

    -- Read the rows of this batch from the inserted pseudo-table,
    -- not from MyTempTable, so previously loaded rows are not reprocessed
    DECLARE CursorMyTable CURSOR FOR
        SELECT id, value1, value2
        FROM inserted

    OPEN CursorMyTable
    FETCH NEXT FROM CursorMyTable INTO @id, @value1, @value2

    WHILE @@FETCH_STATUS = 0
    BEGIN
        IF EXISTS(SELECT * FROM MyTable WHERE id = @id)
        BEGIN
            UPDATE MyTable
            SET value1 = value1 + @value1,
                value2 = value2 + @value2
            WHERE id = @id
        END
        ELSE
        BEGIN
            INSERT INTO MyTable VALUES(@id, @value1, @value2)
        END
        -- Fetch at the end of the loop so the last row is not processed twice
        FETCH NEXT FROM CursorMyTable INTO @id, @value1, @value2
    END

    CLOSE CursorMyTable
    DEALLOCATE CursorMyTable
END
I would highly recommend NOT using a CURSOR in a TRIGGER.
It would be far better to load the data into a 'Staging' table, and then process the data as needed. Stepping through a CURSOR in the context of a TRIGGER (meaning during a TRANSACTION) holds a lot of locks and resources, most likely unnecessarily.
Just not a good idea.
Thank you, but could you explain more about the 'Staging' table?

A 'Staging' table most likely has the same columns as the final production table, but the datatypes may be different. Data is loaded into the staging table, and then cleansed, conformed, and moved to the production table.
This is a very common ETL operation.
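As a rough sketch (reusing your myTempTable as the staging table and the same data file as above, so every name here comes from your earlier postings), the whole load becomes a bulk insert into staging without FIRE_TRIGGERS, followed by one set-based merge into the production table:

-- 1. Load the raw file into the staging table; no trigger involved
BULK INSERT myTempTable FROM 'c:\bulk_myTable.dat';

-- 2. Cleanse/conform here as needed, then merge into production.
--    Aggregate first so duplicate ids in the file collapse to one row.
UPDATE t
SET t.value1 = t.value1 + s.value1,
    t.value2 = t.value2 + s.value2
FROM myTable t
JOIN (SELECT id, SUM(value1) AS value1, SUM(value2) AS value2
      FROM myTempTable
      GROUP BY id) s ON t.id = s.id;

INSERT INTO myTable (id, value1, value2)
SELECT id, SUM(value1), SUM(value2)
FROM myTempTable
WHERE id NOT IN (SELECT id FROM myTable)
GROUP BY id;

-- 3. Empty the staging table for the next load
TRUNCATE TABLE myTempTable;

Because the merge runs as ordinary statements after the load, the locks are held only for the duration of each statement instead of for the whole bulk insert transaction.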