I have a large volume of records I need to insert into a table, but am
running into issues with my transaction log filling up. I want to
create a commit point at every 50-100k records, but don't know how to
do that in SQL Server. I was thinking it has something to do with the
@@TRANCOUNT or @@ROWCOUNT system variables, but again, not sure how to
use them.
Does anyone have any code or suggestions on an approach?
Many thanks,
Dan

Your transaction log will fill up when doing large inserts, regardless of
whether you do it in small batches or all at once.
You could use ALTER DATABASE to put your db into bulk-logged recovery
mode, then do the insert:

ALTER DATABASE [YourDb] SET RECOVERY BULK_LOGGED
-- do the large insert, then set the recovery model back to full
ALTER DATABASE [YourDb] SET RECOVERY FULL
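
One side note: point-in-time restore isn't possible across a minimally
logged operation, so once the recovery model is back to FULL it's worth
taking a log backup straight away (the backup path below is just a
placeholder):

-- resume the log backup chain after switching back to FULL
BACKUP LOG [YourDb] TO DISK = N'C:\Backups\YourDb_log.trn'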
"Dan" wrote:
> I have a large volume of records I need to insert into a table, but am
> running into issues with my transaction log filling up. I want to
> create a commit point at every 50-100k records, but don't know how to
> do that in SQL Server. I was thinking it has something to do with the
> @@TRANCOUNT or @@ROWCOUNT system variables, but again, not sure how to
> use them.
> Does anyone have any code or suggestions on an approach?
> Many thanks,
> Dan
I should also note that the data is already in the database and I'm
trying to perform an insert from a staging/temp table into another
table:
insert into final_table (field1, field2,...fieldN)
select field1, field2,...fieldN
from staging_table
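
For what it's worth, one way to get a commit point every 50k rows or so is
a loop along the lines of the sketch below. This is only a sketch: it
assumes staging_table has a unique, increasing key column (called id here,
which is made up for illustration) and SQL Server 2005 or later so TOP can
take a variable. @@ROWCOUNT right after the INSERT gives you the size of
each batch if you want to track progress.

DECLARE @batch INT, @lastId INT, @maxId INT
SET @batch = 50000
SET @lastId = 0

WHILE 1 = 1
BEGIN
    -- find the key value that closes out the next batch of @batch rows
    SELECT @maxId = MAX(id)
    FROM (SELECT TOP (@batch) id
          FROM staging_table
          WHERE id > @lastId
          ORDER BY id) AS b

    IF @maxId IS NULL BREAK  -- nothing left in the staging table

    BEGIN TRANSACTION

    INSERT INTO final_table (field1, field2) -- ...fieldN
    SELECT field1, field2 -- ...fieldN
    FROM staging_table
    WHERE id > @lastId AND id <= @maxId

    -- @@ROWCOUNT here is the number of rows just inserted in this batch

    COMMIT TRANSACTION  -- commit point every @batch rows

    SET @lastId = @maxId
    SET @maxId = NULL
END

Even with the per-batch commits, in full recovery the log space isn't
reused until a log backup runs, so you'd still want log backups going
between batches (or the bulk-logged route above).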