The next step in SAP BW, after the Initial (INIT) and Full Logistics (LIS) data extraction (with an InfoPackage) from SAP R/3 into the Persistent Staging Area (PSA), is to load a very large DataSource (2LIS_11_VAKON) from the PSA into a DataStore Object (DSO). This DataSource is one of the biggest we have on our SAP BW 7 system.

Seconds after we start the extraction from PSA to DSO using a Data Transfer Process (DTP), it suddenly throws an ABAP dump on the first package.

This is the log from the ST22 ABAP Dump Analysis:

Problem Analysis:

So, after some digging through search engines and SAP Notes, the problem in this ABAP dump turns out to be a lack of storage, specifically a lack of PSAPTEMP storage. PSAPTEMP is the Oracle temporary tablespace used by SAP BW, comparable to the tempdb database (where temporary tables live) on Microsoft SQL Server.
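To see how much room is actually left in PSAPTEMP, you can check the Oracle view DBA_TEMP_FREE_SPACE (DB02 / DBA Cockpit shows the same figures). Below is a minimal sketch using python-oracledb; the user, password, and DSN are placeholders for your own system.

# Minimal sketch: check PSAPTEMP usage with python-oracledb.
# Connection details below are placeholders, not real values.
import oracledb

conn = oracledb.connect(user="system", password="***", dsn="bwhost:1521/BWP")
with conn.cursor() as cur:
    cur.execute("""
        SELECT tablespace_name,
               ROUND(tablespace_size / 1024 / 1024) AS total_mb,
               ROUND(free_space      / 1024 / 1024) AS free_mb
          FROM dba_temp_free_space
         WHERE tablespace_name = 'PSAPTEMP'
    """)
    for name, total_mb, free_mb in cur:
        print(f"{name}: {free_mb} MB free of {total_mb} MB")
conn.close()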

You can use ST14 to perform a Business Warehouse application analysis and find the largest tables, so you can determine a suitable (minimum) size for PSAPTEMP. Even though the exact size needed might be hard to pin down, you can at least establish a lower bound for the storage.
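If you prefer to look at the database directly, in addition to ST14, a rough sketch like the one below lists the biggest tables from DBA_SEGMENTS. Again, python-oracledb and the connection details are assumptions; the top entries (for example the PSA table behind 2LIS_11_VAKON) give you a feel for the sort volume PSAPTEMP has to absorb.

# Sketch: list the ten largest tables straight from the database
# (assumes python-oracledb; connection details are placeholders).
import oracledb

conn = oracledb.connect(user="system", password="***", dsn="bwhost:1521/BWP")
with conn.cursor() as cur:
    cur.execute("""
        SELECT * FROM (
            SELECT owner, segment_name, ROUND(bytes / 1024 / 1024) AS size_mb
              FROM dba_segments
             WHERE segment_type = 'TABLE'
             ORDER BY bytes DESC
        ) WHERE ROWNUM <= 10
    """)
    for owner, table_name, size_mb in cur:
        print(f"{owner}.{table_name}: {size_mb} MB")
conn.close()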

DTP - Semantic Groups

The process that takes this significant amount of storage is caused by the ‘Semantic Groups’ technique. The ‘Semantic Groups’ feature first gathers the records based on the semantic key (group key) and sorts them across all data packages, so that records sharing the same key end up in the same package.
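To make the idea concrete, here is a simplified illustration in Python (not SAP's actual implementation) of what grouping by a semantic key across packages implies: every incoming package is collected and re-sorted by the key before the packages are rebuilt, and it is this global sort that lands in PSAPTEMP on the database side. The field names and package data are made up for the example.

from collections import defaultdict

def regroup_by_semantic_key(packages, key_fields, package_size):
    # Collect every record from every package and bucket it by semantic key.
    buckets = defaultdict(list)
    for package in packages:
        for record in package:
            key = tuple(record[f] for f in key_fields)
            buckets[key].append(record)

    # Rebuild packages, never splitting a key group across two packages.
    new_packages, current = [], []
    for key in sorted(buckets):
        group = buckets[key]
        if current and len(current) + len(group) > package_size:
            new_packages.append(current)
            current = []
        current.extend(group)
    if current:
        new_packages.append(current)
    return new_packages

# Hypothetical sales-order condition records keyed on document/item.
packages = [
    [{"VBELN": "0001", "POSNR": "10", "VALUE": 100.0},
     {"VBELN": "0002", "POSNR": "10", "VALUE": 250.0}],
    [{"VBELN": "0001", "POSNR": "10", "VALUE": -20.0}],
]
print(regroup_by_semantic_key(packages, ["VBELN", "POSNR"], package_size=2))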

Options for a solution or workaround:

1. Extend PSAPTEMP (see the sketch after this list).
2. Turn off the ‘Semantic Groups’ feature in the DTP. This also does the job, without extending PSAPTEMP.
The consequence of turning off ‘Semantic Groups’ is that you might end up fixing a lot of invalid records individually, even though those records share the same (primary) key or group key.
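For option 1, extending PSAPTEMP usually means adding (or resizing) a tempfile. The statement below is only a sketch; the file path and sizes are placeholders, and on a real SAP-on-Oracle system you would typically do this with BR*Tools (brspace), the DBA Cockpit, or together with your DBA.

# Sketch of option 1: add another tempfile to PSAPTEMP.
# Path, sizes, and connection details are placeholders.
import oracledb

conn = oracledb.connect(user="system", password="***", dsn="bwhost:1521/BWP")
with conn.cursor() as cur:
    cur.execute("""
        ALTER TABLESPACE PSAPTEMP
          ADD TEMPFILE '/oracle/BWP/sapdata1/temp_2/temp.data2'
          SIZE 10G AUTOEXTEND ON NEXT 1G MAXSIZE 30G
    """)
conn.close()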
