Is it possible that there is something in your data file
that is causing bcp to think that it has reached an EOF
marker?
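One quick way to check for that, if the file is reachable from a scripting environment: scan it for a Ctrl-Z byte (0x1A), which character-mode tools have historically treated as an end-of-file marker. This is just a sketch; the file path is a placeholder, not anything from your setup.

```python
# Sketch: scan a bcp data file for Ctrl-Z (0x1A) bytes, which some
# character-mode tools treat as an EOF marker. Reads in chunks so a
# multi-gigabyte file does not have to fit in memory.
def find_eof_markers(path, marker=b"\x1a", chunk_size=1 << 20):
    """Return the byte offsets of every occurrence of `marker` in the file."""
    offsets = []
    base = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            pos = chunk.find(marker)
            while pos != -1:
                offsets.append(base + pos)
                pos = chunk.find(marker, pos + 1)
            base += len(chunk)
    return offsets
```

If this reports any offsets, you can inspect the rows at those positions to see whether a stray byte is cutting the load short.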
Something else you can try is the -F option, which
specifies the first row to start importing from. Try
setting it to 1 million, or even 2 million, and see
what happens.
This should help rule out a problem in the data file
at the point where the load seems to stop.
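For reference, a rough sketch of what that invocation might look like; the database, table, file, and server names below are placeholders, not anything from the original post:

```shell
# Sketch only: start importing at row 2,000,001 (-F is 1-based),
# in character mode (-c), using a trusted connection (-T).
# All object names here are made-up placeholders.
bcp MyDatabase.dbo.MyTable in bigfile.dat -F 2000001 -c -S myserver -T
```

If a load started this way runs well past the point where the full load previously died, that points at a row-count or resource issue rather than bad data near the 2.5 million mark.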
I hope that this helps.
Matthew Bando
BandoM@.CSCTechnologies dot com
>--Original Message--
>Hi,
>I am loading a table with 7 million records using bcp
in. However, what I noticed is that the bcp in process
terminates when it reaches around 2.5 million rows.
>I don't feel there should be any such limitation in
bcp; however, I would like to know if anyone has come
across this problem and how they worked around it.
>Thanks.
>.
>I feel this should be the cause ... as the data is huge, I will extract it
once again with some special row delimiter and retry.
Will update once done.
Thank you all for your inputs.
"Matthew Bando" wrote:
> Is it possible that there is something in your data file
> that is causing bcp to think that it has reached an EOF
> marker?
> Something else you can try is the -F option, which
> specifies the first row to start importing from. Try
> setting it to 1 million, or even 2 million, and see
> what happens.
> This should help rule out a problem in the data file
> at the point where the load seems to stop.
> I hope that this helps.
> Matthew Bando
> BandoM@.CSCTechnologies dot com
>
> Hi,
> I am loading a table with 7 million records using bcp
> in. However, what I noticed is that the bcp in process
> terminates when it reaches around 2.5 million rows.
> I don't feel there should be any such limitation in
> bcp; however, I would like to know if anyone has come
> across this problem and how they worked around it.
> Thanks.
>