
03330 - Initial load performance.

This documentation is copyright by SAP AG.


Joan: I can tell you about an experience we had that might match what you're seeing:

- We started with an empty file and began processing it. The first time it was
opened, the SQL optimizer looked at it and decided that the quickest way to
process records in an empty table is a table scan--it's empty anyway, so why
look up records by key?

- After running for a while, the table had something like 100,000 records in
it, and we had really bad performance. We traced it, and the EXPLAIN output
showed that it should have been using the primary key. We tracked down the
SQLPKG that was actually in use and found that it was still doing a table scan.
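On OS/400 you can check what a stored SQL package actually contains with the
PRTSQLINF command, which prints the statements and saved access plans. A
minimal sketch (MYLIB/MYPGM is a placeholder for the program whose package you
are tracing):

```cl
/* Print the SQL statements and access plans stored in a package.     */
/* The spooled output shows, per statement, whether the optimizer     */
/* chose a table scan (arrival sequence) or keyed (index) access.     */
PRTSQLINF OBJ(MYLIB/MYPGM) OBJTYPE(*SQLPKG)
```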

Volker has talked about this in the past with a single job: once the file is
opened, the job continues to use the existing SQLPKG, so if you can run part of
the job, then close the file and reopen it, the optimizer should re-evaluate
and realize that keyed access is now more efficient.

In our case, several weeks had gone by and the SQLPKG had not been re-evaluated,
even though the system was cycled each weekend. We never figured out why it
wasn't reoptimized; we just "shotgunned" the problem by deleting the SQLPKGs
involved. Performance was then what it should be--around 10 ms per access (if I
remember right, it ended up being about 30 times faster than before).

If your table starts out empty each time, you may need to load part of the
records with a single job, then stop and restart it. If the table started out
empty at one time but now always has records in it, you may just have a "bad"
(really, "out-of-date") SQLPKG that needs to be deleted.
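The deletion itself is a one-liner in OS/400 CL; the package is rebuilt with a
fresh access plan the next time the program runs its SQL. A hedged sketch
(library and package names below are placeholders; the real names depend on
your system):

```cl
/* List the SQL packages in a library to find the suspect one.   */
WRKOBJ OBJ(MYLIB/*ALL) OBJTYPE(*SQLPKG)

/* Delete the stale package; it will be recreated, and the access */
/* plan re-optimized, on the next open.                           */
DLTSQLPKG SQLPKG(MYLIB/MYPKG)
```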

Hope it helps!

Jim Doll, Perrigo

>>> joan.altadillZr... 10/04 4:40 AM >>>
Hi All,

This is a performance question: I have a customer that runs SAP
R/3 4.6B (last patch level) on an iSeries model 270 (2 CPUs, 4 GB of main
storage, ...). I don't know the exact model, but it's rated by IBM at
approximately 2000 CPW. The customer is doing the initial data load from his
legacy system, and he says that performance is not right.

For example, he needs to load about 35,000 items into the
material master, and he does it with the standard batch input tools. We've
measured an average speed of 8 materials per minute, which implies 3 days to
load all the materials. The functional consultants on this project talk about
rates of 30 materials per minute (comparing with other implementations on
other hardware, of course). I don't have the functional experience to validate
this, but I think these intensive batch jobs are very hardware dependent; of
course, the same load on a "state of the art" AS/400 would perform differently.

From a technical view, the system is working properly: very
good paging ratios (at both the SAP and OS/400 levels) and, of course, a high
CPU load (90% and up) with 3 batch jobs running simultaneously (we've tested
the load with just one batch job and the load rates are practically the same).

Could someone confirm my assessment? Any ideas for increasing
performance?

Best regards,
Joan Altadill




