Re: 1000 records FlatFile takes over 5 minutes before being publis



Hi Chuck,

Thanks for the response.

I need to debatch 1 file of 1000 records into 1000 separate records.
This all works fine, it is just taking too much time to complete.
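For context, the split itself is logically trivial: the flat file disassembler takes one envelope in and emits one message per record. A minimal sketch outside BizTalk (plain Python, with a hypothetical one-record-per-line format) shows the input/output shape, which is why the 5+ minute publish time is so surprising:

```python
# Hypothetical comma-delimited, one-record-per-line flat file.
# This mimics what the flat file disassembler does logically:
# one envelope message in, one message per record out.

def debatch(flat_file_text):
    """Split a newline-delimited flat file into individual record messages."""
    return [line for line in flat_file_text.splitlines() if line.strip()]

# 1000 records of ~800 bytes each is roughly the 800K file size in question.
envelope = "\n".join(f"{i},customer{i},100.00" for i in range(1000))
records = debatch(envelope)
print(len(records))  # 1000 individual records
```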

It seems that the disassembling is being done in chunks. I even played with
the Large Message Fragment Size, but that had no influence - I am using the
default value of 100K. I am still seeing the spool size increase step by
step, and the delay between two increases is long and seems to occur at a
regular interval.
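One arithmetic observation, for what it's worth: with the default 100K fragment size, an 800K message would be written in about 8 fragments, which is on the same order as the 7-8 spool steps of ~145 records each. Since changing the fragment size had no visible effect, this may be a red herring, but the numbers are easy to check:

```python
import math

# Figures from the posts above; the fragment-count link is an assumption.
file_size_kb = 800        # ~800K for the 1000-record file
fragment_size_kb = 100    # default Large Message Fragment Size (100K)
records = 1000
burst_size = 145          # records appearing in the spool per step (observed)

fragments = math.ceil(file_size_kb / fragment_size_kb)
bursts = math.ceil(records / burst_size)

print(fragments)  # 8
print(bursts)     # 7
```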

On the SQL side I see Lock timeouts/sec > 3000 around the same time the
Spool Size increases. Could that be related?

Could you explain what the Messaging Agent is doing during disassembly?
It looks strange that parsing an 800K file (1000 records) takes so long...

This is crucial for us, since we want to test files with up to 300,000
records, so I need to know what is slowing things down.

We use 1 BizTalk application server with 1 remote SQL server when doing our
tests.
All SQL jobs are enabled and running successfully.
CPU and Memory consumption on both machines are low...
The file receive location is hosted in a separate host.

It is the first time that I am experiencing this kind of behavior :)

I could send you the performance chart of the application and sql server...


"Chuck" wrote:

On Feb 6, 3:11 pm, Koen <K...@xxxxxxxxxxxxxxxxxxxxxxxxx> wrote:
Hi,

I am investigating a FlatFile reception that takes too long on a server.
A file of 1000 records takes between 5 and 10 minutes before it is published
to the message box.

I monitored all relevant counters, and the strange thing I noticed is that
the records are published to the spool in bursts of 145. The graph looks
like a staircase, and it takes 7-8 steps before the spool size reaches
1000.

Could anybody explain why this happens, and what could cause such a long
delay between two bursts of 145 records?

The host has no throttling. Even with rate throttling inactive I see the
same behavior.
There must be something that is slowing things down.
Even when I look at the SQL counters on the remote SQL server, nothing looks
abnormal.

On other machines it performs better.

Thanks for any advice.

Is this a single flat file containing 1000 records, or 1000 individual
flat files, each with one record? Possible answers will vary based on
this.

Chuck



