Here's what I'm currently doing:
I have a C# stored procedure, using a context connection, that updates all the records in a table (typically a million records) by repeatedly:
- sequentially reading a group of records (1000 at a time) into a SqlDataAdapter (using a 'select top row_number'-type statement that avoids re-querying already-processed records);
- performing complex processing on each record and writing the results back to the adapter (several fields of each record are updated);
- updating the table when each group has been processed.
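For concreteness, here's a minimal sketch of that loop as I understand it, assuming an integer primary key Id and a hypothetical ProcessRow() standing in for the complex per-record work (the table and column names are illustrative, not from the real schema):

```csharp
using System.Data;
using System.Data.SqlClient;
using Microsoft.SqlServer.Server;

public static class Procedures
{
    [SqlProcedure]
    public static void ProcessBigTable()
    {
        using (var conn = new SqlConnection("context connection=true"))
        {
            conn.Open();
            int lastId = 0;
            while (true)
            {
                var adapter = new SqlDataAdapter(
                    "SELECT TOP 1000 Id, FieldX, FieldY, FieldZ, CalcField " +
                    "FROM BigTable WHERE Id > @lastId ORDER BY Id", conn);
                adapter.SelectCommand.Parameters.AddWithValue("@lastId", lastId);
                var builder = new SqlCommandBuilder(adapter); // generates the UPDATE command

                var batch = new DataTable();
                adapter.Fill(batch);
                if (batch.Rows.Count == 0)
                    break; // all records processed

                foreach (DataRow row in batch.Rows)
                {
                    ProcessRow(row); // hypothetical: the complex per-record work
                    lastId = (int)row["Id"];
                }
                adapter.Update(batch); // one UPDATE round trip per changed row
            }
        }
    }

    static void ProcessRow(DataRow row)
    {
        // Placeholder for the real calculation; several fields get updated.
        row["CalcField"] = 0.0;
    }
}
```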
This works, but it's not quite as fast as I'd hoped, and running the same logic over a non-context connection from an external application is only slightly slower. I'd like to enable batch updating to increase performance (an external application that batch updates over a non-context connection is actually much faster, around 20%!)... but of course that isn't supported on the context connection in SQL CLR.
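For comparison, this is the kind of batching that works from an external application: setting SqlDataAdapter.UpdateBatchSize groups the generated UPDATE statements into fewer round trips. A hedged sketch, reusing the illustrative schema above; the explicit UpdateCommand is there because batching requires the command's UpdatedRowSource to be None (or OutputParameters), and the same chunked reading as above would still apply:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class ExternalUpdater
{
    public static void BatchUpdate(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            var adapter = new SqlDataAdapter(
                "SELECT Id, FieldX, FieldY, FieldZ, CalcField FROM BigTable", conn);

            // Explicit UPDATE command so batching is permitted.
            var update = new SqlCommand(
                "UPDATE BigTable SET CalcField = @calc WHERE Id = @id", conn);
            update.Parameters.Add("@calc", SqlDbType.Float, 0, "CalcField");
            update.Parameters.Add("@id", SqlDbType.Int, 0, "Id");
            update.UpdatedRowSource = UpdateRowSource.None;
            adapter.UpdateCommand = update;

            adapter.UpdateBatchSize = 1000; // send 1000 UPDATEs per round trip

            var table = new DataTable();
            adapter.Fill(table);
            foreach (DataRow row in table.Rows)
                row["CalcField"] = 0.0; // placeholder for the per-record work
            adapter.Update(table);
        }
    }
}
```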
Any ideas on how I can improve performance of my updating?
Thanks in advance,
Graham
If I understand correctly, you're doing a record-by-record calculation, so all the required information is present within each record. If I were trying to solve this, I would start with a CLR function rather than a stored procedure, so my statement would look something like:
UPDATE BigTable SET CalcField = CLRFunc.ToughCalc(fieldX, fieldY, fieldZ)
I'm not sure whether this would be faster, but it would be set-based (not every calculation can be expressed as a set-based solution).
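To make that concrete, a minimal sketch of what such a CLR scalar function might look like; only the name ToughCalc comes from the statement above, the SqlDouble types and the body are assumptions:

```csharp
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public static class CLRFunc
{
    // Marking the function deterministic helps the optimizer and allows
    // use in persisted computed columns.
    [SqlFunction(IsDeterministic = true, IsPrecise = false)]
    public static SqlDouble ToughCalc(SqlDouble fieldX, SqlDouble fieldY, SqlDouble fieldZ)
    {
        if (fieldX.IsNull || fieldY.IsNull || fieldZ.IsNull)
            return SqlDouble.Null;

        // Placeholder for the complex calculation.
        return fieldX * fieldY + fieldZ;
    }
}
```

Once bound with CREATE FUNCTION ... EXTERNAL NAME, the UPDATE above runs as a single set-based statement, with SQL Server calling the function once per row instead of your code round-tripping the data through a DataAdapter.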