Monday, February 13, 2012

Batch performance degradation - SQL Server 2000

I am running a VB.net console (batch) application that performs 50,000 sets
of reads, inserts, and updates to numerous SQL Server 2000 database tables.
Each individual set is mutually exclusive from the previous or subsequent
sets. I am monitoring the average processing time per set. When I start
the program, the average processing time is very small. However, the
average processing time slowly grows by about 67% by the end of the 50,000
sets. If I start another group of 50,000 right away, the processing time at
the beginning is again very fast. But, again, the performance degrades as
the program runs.
I've added more RAM to the server, increased the minimum buffer pool size,
and removed all of the constraints from the tables being inserted or
updated. None of these changes has eliminated or even reduced the degradation. I've
generated traces using Profiler, but have not been able to identify a stored
procedure that takes longer and longer to execute as the program progresses.
Can anyone suggest any other places to investigate?
Thanks,
Dan

Are you updating all 50K records as a single transaction? If so, you could
be slowing down as the transaction log grows. Try breaking the updates into
smaller logical batches (say 1,000 inserts/updates at a time) and issue a
commit after each batch. This may solve your problem.
Brad Feaker
Database Administrator
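
Below is a minimal sketch of the batched-commit pattern Brad describes, written in VB.NET with ADO.NET. The connection string, the table name WorkTable, and the batch size of 1,000 are illustrative assumptions, not details from the original post.

    Imports System.Data.SqlClient

    Module BatchedCommits
        Sub Main()
            ' Hypothetical connection string -- substitute your own server/database.
            Dim connStr As String = "Server=MYSERVER;Database=MYDB;Integrated Security=SSPI;"
            Dim batchSize As Integer = 1000 ' commit every 1,000 sets, per the suggestion above

            Using conn As New SqlConnection(connStr)
                conn.Open()
                Dim tx As SqlTransaction = conn.BeginTransaction()
                Dim inBatch As Integer = 0

                For i As Integer = 1 To 50000
                    ' Hypothetical work item: one insert standing in for a read/insert/update set.
                    Using cmd As New SqlCommand("INSERT INTO WorkTable (SetId) VALUES (@id)", conn, tx)
                        cmd.Parameters.AddWithValue("@id", i)
                        cmd.ExecuteNonQuery()
                    End Using

                    inBatch += 1
                    If inBatch = batchSize Then
                        ' Commit and start a fresh transaction instead of holding one open.
                        tx.Commit()
                        tx = conn.BeginTransaction()
                        inBatch = 0
                    End If
                Next

                ' Commit whatever is left in the final, possibly partial, batch.
                tx.Commit()
            End Using
        End Sub
    End Module

The batching matters because an open transaction pins the active portion of the transaction log; committing periodically lets SQL Server truncate the log at the next checkpoint (under the simple recovery model) rather than letting it grow for the entire 50,000-set run.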
