Saturday, February 18, 2012

bcp - Transaction Log File Size Issue

Hi Experts,
I am having a new issue again.
My task is to copy data from one table to another table residing in a
different database. The table happens to be extremely large (around 15
million rows).
I tried several approaches (SQL query, SSIS packages, etc.) and found
that the BCP utility suits my requirement, so I planned to use BCP.
I am trying the following two steps:
1. bcp <MyFirstDB.TableName> out <MyFlatFilePath> -n -T
2. bcp <MySecondDB.TableName> in <MyFlatFilePath> -n -T
(Basically, copying data from the source to a flat file, and from
there to the destination table.)
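Concretely, the two commands look something like this (the server,
schema, table, and path names are placeholders for illustration, not
my real ones):

    bcp MyFirstDB.dbo.SourceTable out C:\temp\SourceTable.dat -n -T -S MyServer
    bcp MySecondDB.dbo.DestTable in C:\temp\SourceTable.dat -n -T -S MyServer

(-n keeps the native data format, -T uses a trusted connection, and -S
names the server.)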
The problem is that my transaction log file (MySecondDB_log.ldf) grows
enormously during the second command. It grows up to 8 GB, the maximum
free space I have on disk, so the data transfer fails for lack of
space.
Please let me know where I am going wrong, whether there is a better
method, and how I can optimize the data transfer. (My log file grows
by a fixed 2 MB, not by a percentage.)
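For reference, this is how the log file's current growth setting can
be checked (a sketch assuming SQL Server 2005 or later; the database
name is a placeholder):

    -- List size and growth settings for every file in the database;
    -- is_percent_growth = 0 with growth in 8-KB pages means fixed-size growth
    USE MySecondDB;
    SELECT name, size, growth, is_percent_growth
    FROM sys.database_files;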
Thank you in advance,
Sriharsha Karagodu.
|||Are you changing the second database's recovery model to bulk-logged?
|||On Mar 17, 6:59 pm, Sean <ColdFusion...@gmail.com> wrote:
> Are you changing the second database's recovery model to bulk-logged?
Actually, I read about changing the recovery property, but where do I
get that option?
When I go to the database's Properties --> Options --> Recovery, it
only offers three options:
TornPageDetection,
Checksum,
None.
So I am not sure where I will get the option to change the recovery
model to bulk-logged. Please guide me.
|||You can do it visually, but the syntax goes like this:
ALTER DATABASE [database name]
SET RECOVERY { FULL | BULK_LOGGED | SIMPLE }
Also, to see what the current recovery model is, run: sp_helpdb
[database name]
1. Run sp_helpdb [database name], and note the recovery model in use.
2. ALTER DATABASE [database name] SET RECOVERY BULK_LOGGED
3. Run sp_helpdb [database name] to double-check the setting.
4. Run the bcp import.
5. ALTER DATABASE [database name] SET RECOVERY back to the model noted
in step 1.
6. Run sp_helpdb [database name] to double-check.
A worked script is sketched below.
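For example, the whole sequence as one script (MySecondDB is a
placeholder for your database name, and I am assuming the model noted
in step 1 was FULL):

    -- 1. Note the current recovery model (shown in the status column)
    EXEC sp_helpdb 'MySecondDB';
    -- 2. Switch to bulk-logged so the import is minimally logged
    ALTER DATABASE MySecondDB SET RECOVERY BULK_LOGGED;
    -- 3. Double-check, then (4.) run the bcp import from the command line
    EXEC sp_helpdb 'MySecondDB';
    -- 5. Switch back to the model from step 1 (FULL in this example)
    ALTER DATABASE MySecondDB SET RECOVERY FULL;
    -- 6. Double-check again
    EXEC sp_helpdb 'MySecondDB';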
|||Shriharsha,
When BCPing in so much data, it is also good to use the batch size
option to control the transaction size, such as:
bcp ... -b 50000
This will break up your bcp into about 300 batches, which will speed
it up and give you more transaction log control. You could then (if
necessary) run extra BACKUP LOGs during the bcp in.
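For example (the table, server, and file names here are only
illustrative):

    bcp MySecondDB.dbo.DestTable in C:\temp\SourceTable.dat -n -T -S MyServer -b 50000

Each 50,000-row batch commits separately, so if the log still fills up
you could back it up while the import runs (the path is a placeholder):

    BACKUP LOG MySecondDB TO DISK = 'C:\temp\MySecondDB_log.bak';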
RLF
"Sean" <ColdFusion244@.gmail.com> wrote in message
news:ba587a67-ef18-452e-88a2-d6132234fc5c@.t54g2000hsg.googlegroups.com...
> You can do it visually, but the syntax goes like this:
> ALTER DATABASE [database name]
> SET RECOVERY [either: FULL | BULK_LOGGED | SIMPLE]
> Also, to see what the current recovery model is run: SP_HELPDB
> [database name]
> 1. Run SP_HELPDB [database name], and note the model used.
> 2. ALTER DATABASE [database name] SET RECOVERY BULK_LOGGED
> 3. Run SP_HELPDB [database name] to double check the settings
> 4. Run bcp import
> 5. ALTER DATABASE [databasename] SET RECOVERY (output of step 2)
> 6. SP_HELPDB [database name] to double check
>
|||Thanks Russell and Sean,
I have changed the recovery model to BULK_LOGGED and used the -b
option in the bcp import.
The smaller the batch size, the faster the data transfer.
"Russell Fields" wrote:

> Shriharsha,
> When BCPing in so much data, it is also good to use the batch size operator
> to control the transaction size. Such as:
> bcp ... -b 50000
> This will break up your bcp into about 300 batches, which will speed it up
> and gives you more transaction log control. You could then (if necessary)
> run extra BACKUP LOGs during the bcp in.
> RLF
> "Sean" <ColdFusion244@.gmail.com> wrote in message
> news:ba587a67-ef18-452e-88a2-d6132234fc5c@.t54g2000hsg.googlegroups.com...
>
>
