Hi, hope this is the right place for me to post this, but I couldn't find a
more appropriate group.
In a C# app, I have a large set of structured data that will map to multiple
tables linked by foreign keys, similar in structure to the following (the
xxxID columns are identity columns):
TABLE Item (ItemID )
TABLE Exchange ( ExchangeID )
TABLE Batch( BatchID, fkExchange )
TABLE Group( GroupID, fkBatch )
TABLE Transaction ( TransactionID, fkGroup )
TABLE ItemToTransactionVector ( fkItem, fkTransaction )
'Item' rows already exist, and I'm creating the rest in my app to be added
to the database en masse. Exchanges have a single Batch; Batches have multiple
Groups; Groups have multiple Transactions, etc.
For a single table I would use the ADO.NET SqlBulkCopy class; unfortunately,
I need to set the foreign keys between the tables.
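For what it's worth, the single-table load I have in mind looks roughly like
this (the connection string, staging row values, and column mappings are
placeholders, not my real schema):

```csharp
// Rough sketch of a single-table bulk load with SqlBulkCopy.
// 'connectionString' and the table/column names are placeholders.
using System.Data;
using System.Data.SqlClient;

DataTable batches = new DataTable();
batches.Columns.Add("fkExchange", typeof(int));
batches.Rows.Add(42); // one staged row; value is hypothetical

using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "Batch";
        bulk.ColumnMappings.Add("fkExchange", "fkExchange");
        bulk.WriteToServer(batches); // streams all rows in one operation
    }
}
```

That works fine for one table, but it gives me no way to learn the identity
values the server assigned, which is exactly the problem with the child tables.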
At the moment, I have an initial implementation that uses individual
inserts, but this is way too slow, even wrapping them inside a transaction.
My thoughts after a little research were:
- Create matching temporary tables, or permanent staging tables, for each of
the tables to be imported.
- Preset the row data with 'fake' foreign keys in my application, to maintain
the relationships between the rows in different tables.
- Process the data somehow from the temporary tables into the real tables.
Is this a ridiculous scheme? How else can I insert all that data, maintain
all the relationships, and do it fast?
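To make the scheme concrete, here is a rough sketch of what I mean for the
Group/Transaction level. The staging tables and column names are made up, and
it assumes the load has exclusive access to the tables while it runs (I haven't
verified this end to end):

```sql
-- Hypothetical staging tables carrying the app-generated 'fake' keys
CREATE TABLE Staging_Group (appGroupKey INT NOT NULL, fkBatch INT NOT NULL);
CREATE TABLE Staging_Transaction (appGroupKey INT NOT NULL);

-- After bulk-copying into the staging tables from the app:
BEGIN TRAN;
DECLARE @offset INT;
-- Lock the table and find where the existing identities end
SELECT @offset = COALESCE(MAX(GroupID), 0) FROM [Group] WITH (TABLOCKX);

-- Shift the fake keys past the existing identities and insert them
-- as the real key values
SET IDENTITY_INSERT [Group] ON;
INSERT INTO [Group] (GroupID, fkBatch)
SELECT appGroupKey + @offset, fkBatch
FROM Staging_Group;
SET IDENTITY_INSERT [Group] OFF;

-- Children can now compute their parents' real keys the same way
INSERT INTO [Transaction] (fkGroup)
SELECT appGroupKey + @offset
FROM Staging_Transaction;
COMMIT;
```

The idea is that the fake key plus a fixed offset *becomes* the real key, so
no join back to the parent table is needed; but I don't know if there's a
better-trodden path.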
Thanks! Ross

"Ross" <Ross@discussions.microsoft.com> wrote in message
news:DE44B64C-5790-4D15-A35F-E82ECA15FF9A@microsoft.com...
> ...
I can't help you with the ADO bulk load part but here's an example of how
you can load the data into a table variable and then populate the related
tables. Depending on the size of the tables you may prefer to try a temp
table or a permanent one.
/* Sample tables */
CREATE TABLE T1 (x INT IDENTITY PRIMARY KEY, z VARCHAR(10) NOT NULL UNIQUE);
CREATE TABLE T2 (x INT NOT NULL REFERENCES T1 (x), z1 VARCHAR(10) NOT NULL,
PRIMARY KEY (x,z1));
/* Temp table variable */
DECLARE @t TABLE (z VARCHAR(10) NOT NULL, z1 VARCHAR(10) NOT NULL,
PRIMARY KEY (z, z1));
/* Insert the raw data */
INSERT INTO @t (z, z1)
SELECT 'XX', 'ABC' UNION ALL
SELECT 'XXX', 'DEF' UNION ALL
SELECT 'XXX', 'GHI';
/* Populate the parent table */
INSERT INTO T1 (z)
SELECT DISTINCT z
FROM @t;
/* Join in the foreign key and populate the referencing table */
INSERT INTO T2 (x, z1)
SELECT T1.x, T.z1
FROM @t AS T
JOIN T1
ON T.z = T1.z;
/* Result */
SELECT * FROM T1;
SELECT * FROM T2;
David Portas, SQL Server MVP
Whenever possible please post enough code to reproduce your problem.
Including CREATE TABLE and INSERT statements usually helps.
State what version of SQL Server you are using and specify the content
of any error messages.
SQL Server Books Online:
http://msdn2.microsoft.com/library/ms130214(en-US,SQL.90).aspx
Thanks for your time, David. I'll look at adapting that and see how well it
works for my case.
Regards,
Ross
"David Portas" wrote:
> ...