Channel: SAP Business Warehouse

SAP CI/DB Split - Issues and Analysis


 

In a typical SAP production landscape, the CI (Central Instance), including all application servers, and the database run on a single host. With the SAP HANA database now on the market, SAP customers are splitting CI and DB: the CI is moved to a new (Linux) host while the DB stays on the old host. Later they can migrate the database from IBM DB2 to HANA.

 

 

During this split the OS of the CI may also have to change, e.g. from AIX to Linux: the CI is moved to Linux while DB2 is retained on AIX. The CI has to be moved to Linux because SAP HANA is not supported on AIX, so moving the CI to Linux is the first step towards installing the SAP HANA database later. After the DB is migrated to HANA, CI and DB can be placed together again on a new Linux host; both CI and DB then run on Linux.

 

Generally, after a CI/DB split in an SAP landscape we may face many issues regarding communication, online/offline backups, performance of the BW server, and so on.

 

Some of the common issues are listed below, along with how the BW, Basis and DB teams work together on them. After a CI/DB split we may face performance issues on BW; sometimes it takes a long time just to open a transaction code.

 

For example, in RSA1 go to Modeling and click on InfoProviders: the system hangs. This happens in most of the work areas in SAP BW.


 

The first step is to inform SAP Basis and the DB team so that they can check from their end. Both teams have to work together, as CI and DB are now on different hosts.

 

Based on the Basis analysis and advice, the DB team can run "Update Statistics" on the BW database, i.e. REORG and RUNSTATS. The runtime depends on the number of tables present in the database and how much data they carry.

 

REORG ensures that DB2 indexes become aware of all new data, no longer include deleted data, and that empty page space created by the deletion of data and indexes is collapsed.

 

RUNSTATS gathers updated statistics on the volume and distribution of data within tables and indexes. This information is stored in the system catalog tables and is used by the optimizer when querying the data.

 

As a result, all tables are reorganized and their statistics refreshed.

 

The DB team can also add significant memory to the DB2 buffer pools, as much as possible. The BW team can then validate whether the BW server performance degradation persists. If performance is still slow, they can run an SAP trace to see where the time is being spent.

 

SAP Basis can restart the SAP system.

 

Stopping the SAP system: log on at OS level as a user with SAP administrator authorization (<sid>adm).

 

Enter the command: stopsap [DB|R3|ALL].

 

DB stops the database system: stopsap DB

 

R3 stops the instances & associated processes of the SAP System: stopsap R3

 

ALL stops both the database system & the SAP System: stopsap ALL

 

Starting the SAP system: log on at OS level as a user with SAP administrator authorization (<sid>adm).

 

Enter the command: startsap [DB|R3|ALL].

 

The following applies to this command:

 

DB starts the database system:     startsap DB

 

R3 starts the instances & associated processes of the SAP System:     startsap R3

 

ALL starts both the database system & the SAP System:    startsap ALL

 

The DB team can adjust the settings so that more memory is given to both SAP and DB2.

 

If performance is still poor, Basis can analyze the issue further and clear the user buffer from their end. To reset the user buffer area, follow these steps:

 

TCODE –> SU56

 

Authorization Values –> Reset User Buffer


 

Now the BW team can validate the performance again.

 

In some cases the sequential read time for transaction codes differs between users: some user IDs open transaction RSA1 quickly, while for other users it hangs for a long time.

 

We tried accessing RSA1 from other users (DDIC/USER1) and it opens quickly, whereas it does not for USER2. The response time logs of both users can be compared.

 

On further analysis, the issue turned out to happen for all user IDs: the session hangs while accessing table RSDVCHA.


 

 

Tracing the execution of transaction RSA1 shows it hanging on multiple tables.

 

 

We can also go for a quick restart of the application and the server. If Basis is unable to stop the database from the BW side, the DB team can stop it from their end once the SAP application is down. Once the DB is stopped, we can restart both the DB instance and the CI instance.

 

The application has been stopped on the CI instance.

In some scenarios no communication happens between CI and DB at all. We can check R3trans from the DB server to see whether processes are hanging.


R3trans Return Codes:

 

R3trans sets a return code that shows whether or not the transport has succeeded. You can view details on transports in the log file. These are the return codes:

0: No errors or problems have occurred.

4: Warnings have occurred but they can be ignored.

8: Transport could not be finished completely. Problems occurred with certain objects.

12: Fatal errors have occurred, such as errors while reading or writing a file or unexpected errors within the database interface, in particular database problems.

16: Situations have occurred that should not have.

 

If this happens, the DB team can dig into the DB; it occurs mainly due to multiple locks in the database, which should be cleared from the DB end. They also have to check the DB2 CLI driver used for communication; if possible, they can reinstall the DB2 CLI driver to verify that it works correctly.

 

Accessing the DB2 database using DB2 CLI: DB2 CLI uses a standard set of functions to execute SQL statements and related services at run time. The DB2 Call Level Interface is IBM's callable SQL interface to the DB2 family of database servers. It uses function calls to pass dynamic SQL statements as function arguments and requires neither host variables nor a precompiler.

 

Install the DB2 CLI driver on the database server. This requires approximately 50 MB of free space for each operating system of the application servers.

 

We should also ensure that the major version of the DB2 CLI driver matches the major version of the DB2 software on the database server. The fix pack level of the DB2 CLI driver must be equal to or lower than the fix pack level of the DB2 software on the database server.

 

After reinstalling, check whether the SAP system and the SAP programs can find the DB2 CLI driver and use it.

 

In any directory, execute the following command:

 

R3trans -x

 

This call should not return any errors and should report the following:

"R3trans finished (0000)."

 

BWadm 10> R3trans -d

This is R3trans version 6.24 (release 741 - 12.05.14 - 20:14:05).

Unicode enabled version

R3trans finished (0000).

 

This confirms that communication between CI and DB is working.
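Alongside R3trans, the connection can also be verified from inside the application server with a few lines of ABAP over the standard ADBC classes. This is only a minimal sketch (SVERS is just a convenient one-row system table to query against):

REPORT zdb_ping.

DATA: lo_stmt TYPE REF TO cl_sql_statement,
      lo_res  TYPE REF TO cl_sql_result_set,
      lo_exc  TYPE REF TO cx_sql_exception,
      lr_cnt  TYPE REF TO data,
      lv_cnt  TYPE i,
      lv_msg  TYPE string.

TRY.
    " Statement on the default database connection (the remote DB host after the split)
    CREATE OBJECT lo_stmt.
    lo_res = lo_stmt->execute_query( 'SELECT COUNT(*) FROM SVERS' ).

    GET REFERENCE OF lv_cnt INTO lr_cnt.
    lo_res->set_param( lr_cnt ).   " bind the first result column
    lo_res->next( ).               " fetch the single row
    lo_res->close( ).
    WRITE: / 'Database reachable from CI, row count:', lv_cnt.
  CATCH cx_sql_exception INTO lo_exc.
    lv_msg = lo_exc->get_text( ).
    WRITE: / 'Database error:', lv_msg.
ENDTRY.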

 

In another scenario, R3trans responds fine and the application also connects fine, but performance is again slow.

 

This may be due to online or offline backups. Tivoli Storage Manager (TSM) provides powerful, centralized backup, archive and storage management; it also plays a role in the backup and recovery of DB2 databases in the event of hardware failure. The DBAs and the storage team can work together to get backup issues solved. Sometimes a backup job hangs on a TSM resource (a storage device) for long periods of time and creates numerous locks that impact SAP jobs, which ends up hanging BW transaction codes. In this case we have to cancel the backup jobs and check the BW performance again.

 

Daily online backups are taken on the DB; during an application upgrade and before a CI/DB split an offline backup should be taken. After the CI/DB split we should also monitor the backup jobs to see whether they create any locks.

----------------------------------------------------------------------------------------------------------



Usage of Interrupts in Batch Automation


What is an interrupt?

 

An interrupt is a process type used in process chains to trigger the chain only after completion of specific steps, either in BW or in any other source system. Interrupts are very helpful for automating the batch process in BW when you don't have a third-party scheduler.

 

Business scenario

 

I am considering a classic example where SAP BW extracts data from SAP R/3 (ECC), and the batch process chains have to wait until the data pre-calculations and dependent business processes are complete in the source system.

 

 


 

 

How do we achieve this?

 

Step 1: The Interrupt process type can be found under General Services.

Step 2: Create a new process chain and include the Interrupt process right after the Start process.

Different options are available for scheduling the interrupt:

 

  • Immediate
  • Date/Time
  • After job
  • After event
  • At operation mode
  • Factory calendar day

 

Step 4: Create an event in BW. Transaction SM64 lets you create an event in SAP BW and in ECC.

 

Step 5: In my scenario I am using an event-based trigger, so that when the ECC job completes, the event is triggered, which in turn triggers the BW process chain.

 

So I will make use of this event in the process chain and in the ECC program.

Step 6: To trigger an event in BW from ECC you will need a program that raises the event (we have a custom program available, so I am making use of the same program here; a minimal sketch is shown below).

 

Tcode SE38 > event raise program > maintain variant Z_TEST, which was created earlier.
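Since the custom program itself is not shown here, below is a minimal sketch of what such an event-raise program can look like. It uses the standard function module BP_EVENT_RAISE; the RFC destination name is an illustrative assumption, and the exception list should be verified in SE37:

REPORT zraise_bw_event.

" Event ID created in SM64 in the target (BW) system
PARAMETERS: p_event TYPE btceventid DEFAULT 'Z_TEST',
            " RFC destination pointing to the BW system (illustrative name)
            p_dest  TYPE rfcdest    DEFAULT 'BWCLNT100'.

CALL FUNCTION 'BP_EVENT_RAISE' DESTINATION p_dest
  EXPORTING
    eventid                = p_event
  EXCEPTIONS
    bad_eventid            = 1
    eventid_does_not_exist = 2
    eventid_missing        = 3
    raise_failed           = 4
    communication_failure  = 5
    system_failure         = 6
    OTHERS                 = 7.

IF sy-subrc <> 0.
  WRITE: / 'Event raise failed, sy-subrc =', sy-subrc.
ELSE.
  WRITE: / 'Event', p_event, 'raised via destination', p_dest.
ENDIF.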

 


 

Step 7: Create a job in ECC using transaction SM36, include the program that needs to be scheduled followed by the event, and schedule this job as per the business needs.

 

Define the background Job


Assume Z1 is the ECC program on which the BW process chain depends, so include Z1.

 

Step 8: If your ECC job is scheduled daily at 8 PM, then the BW process chain should also be scheduled at the same time. Once the ECC job completes, it triggers the event, which in turn triggers the BW process chain.

List of Items for Testing - BW


Hi All,

 

Many of you might already have a list of items that need to be tested, specific to your landscape, when doing an upgrade, but I thought of posting this blog with a complete list of all BW test items from my project experience.

 

You can make use of this list for projects like BW upgrades, service pack upgrades, source system upgrades, HANA migration and BEx upgrades.

 

The test items below are divided into six test components: Workbench, Others, Query Designer, BEx Analyzer, BEx Workbook and Web Templates.

 

Component - Testing Item

Workbench Tcode - RSA1, RSPC, RSPC1, RSPCM, RSRT, RSANWB, RSCRM_BAPI, RSA2, RSA3, RSA7, RSA6, SM50, SM51, SM66, SE38, SM49, SE37, ST12, SE09, SE03, SM12, RSA1OLD.

Workbench Authorizations & Roles

Workbench Master Data Load - Attribute (Source System)

Workbench Master Data Load - Attribute (Flat File)

Workbench Master Data Load - Text (Source System)

Workbench Master Data Load - Text (Flat File)

Workbench Maintain Master Data

Workbench Maintain Master Data Text

Workbench Hierarchy Data Load - Source System

Workbench Hierarchy Data Load - Flat File

Workbench Transaction Data Load - Source System

Workbench Transaction Data Load - Flat File

Workbench Query Execution in RSRT

Workbench APD Execution - All types (Flat File, Info Cube, Multi Provider, DSO..)

Workbench Open Hub - All Types

Workbench Program Execution - Standard & Custom

Workbench Data Reconciliation

Workbench Batch Run - All variants

Workbench check/create indexes on cube

Workbench check/refresh cube statistics

Workbench Compress Cube

Workbench Delete PSA in process chain

Workbench Maintain cube/Contents to examine data in cube

Workbench Create Info Package

Workbench Create DTP

Workbench Create DSO

Workbench Create Cube

Workbench Create Multiprovider

Workbench Create Transformation

Workbench Create Data Source

Workbench Create Web Template

Workbench Create Query

Workbench Create Process Chain

Workbench Create cube as copy

Workbench Create InfoObject as Reference

Workbench Capture objects in Transports

Workbench Add characteristic to Cube

Workbench Add characteristic to ODS

Workbench Change Identification in M/P

Workbench Data modelling - Multiprovider integrity

Workbench Add characteristic to M/Provider

Workbench Change existing InfoObject

Workbench Changing Transfer/Update Rules

Workbench Change existing InfoSet

Workbench Create Aggregate (Not Applicable in case of HANA Migration)

Workbench Modify Aggregate (Not Applicable in case of HANA Migration)

Workbench BIA Index (Not Applicable in case of HANA Migration)

Workbench DSO Activation

Workbench MDX Statements

Workbench Interrupts Execution

Workbench Create a Job

Workbench Schedule a job/Release a Job

Workbench Change the Job Status manually

Others  Performance of the MDX Execution pings/runtimes

Others  Interface with PI/XI/DB Connect/UD Connect/Flat File/Informatica/EDW/GDW/SSIS for data transfer

Query Designer Execute Query in Query Designer

Query Designer Variable Entry Screen

Query Designer F4 for help Window

Query Designer Variants in Variable Entry

Query Designer Report Output

Query Designer Context Menu

Query Designer Export to Excel

Query Designer Download to PDF

Query Designer Filter options

Query Designer Conditions/Exception

Query Designer Go To Option

Query Designer Drill Down/Across Option

Query Designer Create Bookmark

BEx Analyzer Execute Query in Analyzer

BEx Analyzer Variable Entry Screen

BEx Analyzer F4 for help Window

BEx Analyzer Variants in Variable Entry

BEx Analyzer Report Output

BEx Analyzer Context Menu

BEx Analyzer Export to Excel

BEx Analyzer Download to PDF

BEx Analyzer Filter options

BEx Analyzer Conditions/Exception

BEx Analyzer Go To Option

BEx Analyzer Drill Down/Across Option

BEx Workbook Execute Query in Analyzer

BEx Workbook Variable Entry Screen

BEx Workbook F4 for help Window

BEx Workbook Variants in Variable Entry

BEx Workbook Report Output

BEx Workbook Context Menu

BEx Workbook Export to Excel

BEx Workbook Download to PDF

BEx Workbook Filter options

BEx Workbook Conditions/Exception

BEx Workbook Go To Option

BEx Workbook Drill Down/Across Option

BEx Workbook Macro Add-on

Web Templates Execute Query in Query Designer

Web Templates Variable Entry Screen

Web Templates F4 for help Window

Web Templates Variants in Variable Entry

Web Templates Report Output

Web Templates Context Menu

Web Templates Export to Excel

Web Templates Download to PDF

Web Templates Filter options

Web Templates Conditions/Exception

Web Templates Go To Option

Web Templates Drill Down/Across Option

Web Templates Create Bookmark

 

Thanks

Abhishek Shanbhogue

Error when activating an SPO that was created with a template


This blog has been translated with Google Translate. The original blog can be found here: ekessler.de

 

 

When a semantically partitioned object (SPO) is created based on a template object, an error can occur during activation. Figure 1.1 shows the construction of a semantically partitioned DSO; a DSO is used as the template.

 

Figure 1.1: Construction of an SPO with a template

 

When activating the SPO, the error "Not possible to create external SAP HANA view for object <SPO>00 / Message no. RS2HANA_VIEW001" occurs.

 

The error message indicates that the system attempted to generate an external SAP HANA view for an SPO (or a HybridProvider).

 

Figure 1.2: Error during activation

 

The cause of the error is, in this case, the template DSO: in the template DSO the indicator External SAP HANA View for Reporting is set, see Figure 1.3.

 

Figure 1.3: Indicator External SAP HANA View for Reporting in the template object

 

When the SPO is created, first the reference structure for the SPO is generated. The reference structure of the SPO is then used as a template for each partition of the SPO.

 

When the reference structure of the SPO, object <SPO>00, is built, the metadata of the original object is copied. The metadata of the template object also includes the indicator External SAP HANA View for Reporting.

 

Currently, external SAP HANA views are not supported for semantically partitioned objects (or HybridProviders). Therefore, the maintenance option for the indicator External SAP HANA View for Reporting is not available for a semantically partitioned object (see Figure 1.4).

 

Figure 1.4: Properties of the reference structure of the SPO

 

We find the indicator HANAMODELFL (External SAP HANA view for BW object) in the table RSDODSO (directory of all DataStores), see Figure 1.5.

 

Figure 1.5: Indicator External SAP HANA view for BW object

 

To activate the SPO, the indicator HANAMODELFL must be removed for the reference structure of the SPO. The reference structure follows the naming convention <SPO name>00. Alternatively, you can search for the reference structure by the name of the SPO in the table RSDODSO (for cubes, see table RSDCUBE). Figure 1.6 shows both variants.

 

Figure 1.6: Searching for the reference structure of the SPO in RSDODSO

 

Remove the indicator HANAMODELFL (SAP HANA view), see Figure 1.7, and save.

 

Figure 1.7: Removing the indicator HANAMODELFL (SAP HANA view)

 

Subsequently, the SPO can be activated without error.

Near Line Storage ( NLS ) Archival with SAP BW - Part 1


Part 1 of this blog series covers some important aspects of NLS archival technology with SAP BW.

 

Some important points to keep in mind before starting a data archival project:

 

1) Choosing an NLS vendor: there are many SAP-certified NLS vendors, and a vendor should be chosen based on the requirements and the product offered.

 

2) Correct sizing of the NLS box is important so that current and future data archival needs can be met.

 

3) Data to be archived: archived data is always online for reporting while using NLS, but the reporting speed is not comparable to reports run on BWA or HANA appliances. Only data which is rarely accessed should be put on NLS.

 

The next step is to determine which InfoCubes, and what data volume, should be archived. Based on BW statistics this decision can be made along with input from the business users. InfoCubes which are big in size and infrequently used are the best candidates for data archival.

 

Below is a quick chart of the logic that can be used to determine how a particular InfoCube should be archived.

 

 

(Chart: archival strategy decision logic)

 

NLS archival is done using time slices, so it is important that the InfoCubes have time characteristics which can be used for data archival.

 

InfoCubes which are refreshed on a daily basis (full load) need not be archived. If the SAP ECC or other source system is also undergoing archival, then only the live data remaining after archival in the source system will get loaded as part of the daily full loads to such InfoCubes, and data which has been archived will not be available in the SAP BW system for reporting.

 

InfoCubes which are delta-enabled, and whose delta loads don't bring data for time slices which have already been archived, are the best candidates for NLS archival. The reason is that if you archive data from a cube for a particular time slice and the daily delta then brings in data for that same time slice, the data loads will fail.


If your daily delta loads to InfoCubes bring in data for time slices which have already been archived, it is better to create copy cubes, move the data for those particular time slices to the copy cubes, and archive the copy cubes instead of the original cubes. Post validation, the data can be deleted from the original InfoCubes. The existing MultiProviders and reports will have to be enhanced to include the new copy InfoCubes.

 

Every business requirement and landscape is different, and there are multiple ways data archival can be done using NLS; the above is one approach that can be used.


The second part of this series will cover some more aspects of NLS archival technology.

 

Cheers !

Near Line Storage ( NLS ) Archival with SAP BW - Part 2


Part 2 of this blog series covers some more important aspects of NLS technology with SAP BW.

 

1) A prerequisite of NLS data archival is to compress the InfoCubes that you want to archive, up to the latest data load request, before archival. In many landscapes this is normally done as part of maintenance activities; if not, it might be a time-consuming process to implement on all cubes and might impact project timelines.

 

2) A question often asked about NLS is whether archived data can be restored back into the live SAP BW system. The answer is yes, this is possible, but it might lead to inconsistencies in the system and will defeat the purpose of archival. The data reload option should be used only if there is a critical business need.

 

3) From a BW security perspective there are some prerequisites as well.

For BW developers to perform archival-related activities under the developer role, the authorization object S_ARCHIVE needs to be assigned, with activity code, area and object set to '*'.

 

For end users: to read archived data from NLS, users should have the authorization object S_RS_ADMWB with activity codes 03 and 16 assigned, restricted on object RSADMWBOBJ.


These prerequisites might be specific to your NLS vendor as well, and you may need to provide additional authorizations if mentioned in the documentation provided by the vendor.


4) Another important step in NLS archival is the creation of virtual InfoCubes. To read data from NLS you need to create new virtual cubes which actually connect to the NLS system. If a query needs to read data from NLS, these virtual cubes do the trick. You then need to modify the existing MultiProviders and queries to include these virtual cubes so that the data archived on NLS is available for reporting.


I will cover some more aspects in the next part of this series.


Cheers !

APD Query Tip


While working on one of our APD requirements, we came across a strange issue.

 

As per the requirement, we wanted to add a few new key figures to the APD query and hide a few old ones.

 

We went ahead, added the new key figures and hid the other ones. We did the mapping and activated the APD.

 

When we started testing, we realized the data was not coming in the correct order. It was strange and we couldn't figure out the reason.

 

We used trial and error and finally found the solution.

 

We moved all the hidden key figures to the bottom of the column box and the APD output was corrected. It was a totally new and unexpected discovery which we were delighted to find, hence I am sharing it here.

 

Conclusion: whenever you have hidden columns in an APD query, make sure you move them to the bottom of the columns to get correct data.

How to reorganize the process chain log table


Questions about fast-growing tables in SAP systems are asked very often in the SCN forums. Such tables consume disk space and, even more importantly, processing large volumes of data in these tables slows down the system. This is also true in the area of BW systems. Therefore it is common for SAP Basis people to do regular housekeeping on these tables. There are SAP Notes available which deal with those tables and advise how to reorganize them. A perfect example of such a Note is 706478 - Preventing Basis tables from increasing considerably. Many tables are discussed in the Note for the different areas, including BW.


One of them is the table related to the process chain log: RSPCLOGCHAIN.


The table holds the logs of process chains. In large BW systems running many chains on a daily basis, the table can increase its volume very easily. The regular way to get rid of process chain run logs is to do it from transactions like RSPC or RSPC1; in the log view there is a Delete function available in the menu.


To use this functionality in an automated way, the ABAP report RSPC_LOG_DELETE needs to be utilized. You can set up a job running this report on a regular basis for particular chains, with a selection on logs by date/time or log ID.

I found this report quite useful in BW housekeeping tasks.
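To run it on a regular basis, the report can be wrapped in a background job. A saved variant keeps us from having to know the report's selection fields by name; the variant and job names below are illustrative:

REPORT zschedule_pc_log_cleanup.

DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_PC_LOG_CLEANUP',
      lv_jobcount TYPE tbtcjob-jobcount.

CALL FUNCTION 'JOB_OPEN'
  EXPORTING
    jobname  = lv_jobname
  IMPORTING
    jobcount = lv_jobcount.

" Run RSPC_LOG_DELETE with a saved variant holding the chain/date selection
SUBMIT rspc_log_delete USING SELECTION-SET 'ZWEEKLY'
       VIA JOB lv_jobname NUMBER lv_jobcount AND RETURN.

CALL FUNCTION 'JOB_CLOSE'
  EXPORTING
    jobcount  = lv_jobcount
    jobname   = lv_jobname
    strtimmed = 'X'.   " start immediately; use sdlstrtdt/sdlstrttm to schedule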



PS: This blog is cross-published on my personal blog site.


Runtime Analysis for DTPs/Transformations

$
0
0

Ever wondered where your DTP/transformation spends its time?

 

This is how you can find out:

 

First you have to create a small program like the following:

 

REPORT ztest.

DATA: r_dtp       TYPE REF TO cl_rsbk_dtp,
      l_r_request TYPE REF TO cl_rsbk_request.

" Load the DTP: place the technical name of your DTP here
CALL METHOD cl_rsbk_dtp=>factory
  EXPORTING
    i_dtp   = 'DTP_4YIEJ....'
  RECEIVING
    r_r_dtp = r_dtp.

l_r_request = r_dtp->create_request( ).

" We have to run it in sync mode (not in batch) so the trace catches the load
l_r_request->set_ctype( rsbc_c_ctype-sync ).
l_r_request->doit( ).

 

Now run this program through transaction SE30 or SAT.


After the execution of the DTP you will receive the result. In our case it was a SELECT on the PSP element; we created an index and the DTP needed only 2 minutes instead of 30.

 


Of course you can also simply call transaction RSA1 through SAT (using the processing mode "serially in the dialog process (for debugging)"). But doing it that way, you have to filter out the overhead created by RSA1 in the performance log, and the data will not be written to the target (it is only simulated), so you might sometimes miss a bottleneck in your DTP/transformation.

 

Thanks for reading...

"Schnelleinstieg in SAP Business Warehouse" available (german only)!


Hello everyone,

 

A few days ago my first printed book was published by Espresso Tutorials, and now it's also available as an eBook:

 


This book is written for beginners in SAP Business Warehouse 7.3. It starts with a short introduction to Business Intelligence and data warehouses in general. Then it gives a short overview of Bill Inmon's CIF as the basis for SAP's Business Warehouse, and explains LSA and LSA++ in a short section.

 

The main part is a real-world example: loading an IMS Health sample file into BW to answer a business question.

As it's written using SAP Business Warehouse 7.3, I explain how you can use dataflow diagrams to build a very basic data flow to load the sample data.

Then it leads step by step, with lots of screenshots, through the data modelling. You begin by creating your first characteristics and key figures. The next step is to create a DataStore Object, an InfoCube and a MultiProvider in a simplified multi-layer architecture. The next chapter is about ETL: the term itself is briefly explained, then I show how to create transformations and DTPs. Finally I show how to put all the DTPs together in process chains for master data and transactional data.

The last chapter is about the Business Explorer. I explain how you can easily create a BEx query on top of your MultiProvider. With a few screenshots I show how you can slice and dice through your data. Exceptions and conditions are explained briefly.

 

The biggest benefit is the help on avoiding the most common pitfalls in data modelling. It also gives some help in case of data load errors. Two other goodies are ABAP programs:

1.) how to convert the key figure model into an account model

2.) how to easily create multiple key figures using BAPIs

 

The only disadvantage is that the book is available in German only. An English version is not planned at the moment.

 

Have fun reading it!

 

Cheers,

Jürgen

Tips on how to find the correct component when creating an Incident to SAP


Hello friends,

 

Today I would like to talk about a common mistake that can cause major delays in incident processing times: incorrect component assignment, and how to identify the correct component for your SAP incident.

 

I often notice that new customer incidents are created under BW-BCT*, as BW-BCT has over 100 subcomponents, listing almost all applications and generic sub-areas of SAP products. This happens because the description of the subcomponents is not accurate in some cases and doesn't specify the area.

 

Please keep in mind that the components under BW-BCT are reserved for issues related to the installation and operation of the SAP application extractors used to load data into SAP BW. Choosing the correct component is an important step during incident creation because it ensures the fastest processing of your request.

 

For example, under the path BW -> BW-BCT there are over 100 subcomponents with simple descriptions like "Customer Relationship Management" or "Documentation". In reality those components are related to the data extraction logic of the ECC applications that deal with Customer Relationship Management and Documentation: BW-BCT-CRM and BW-BCT-DOC, respectively.

 

Another reason this happens is that the customer selects BW as the application for the incident, which automatically expands the BW area for component selection, misleading the customer into picking the desired component based on its description; that component will most likely be in the BW-BCT list, as it has almost all SAP application extraction logic under it.

 

How to find the correct component for my Incident?

In order for SAP to assign an engineer to work on a customer incident, the incident must be assigned to an active component that is monitored by SAP engineers.

Here are a few tips to identify which component is the right one for your incident:


Note search

Perform a note search using the SAP SMP (Service Marketplace) search tools, SAP xSearch or Note search.
Use keywords related to your issue, like the transaction code used to reproduce the issue, a table name, and the name of the product. Let's try an example in xSearch: my BPC user has been locked due to incorrect login attempts and I'm trying to find a solution for it.

 


Notice I'm searching for the text "user locked SU01"; the most suited areas would be BC, GRC and SV.
Now let's narrow the search down:

By adding 'BPC' to the search text, the results narrow from over 280 notes to only 5, which are for EPM-BCT-NW.

 

Knowledge Base Article (KBA) or Note

Customers often find a note related to the incident being faced, or even create new incidents based on notes already provided by SAP.
As a general approach, the component the note or KBA was created for will be the most suitable component for the new incident.

 


 

You can find the component of a KBA or note under the Header Data section, close to the bottom of the page.

 

Short Dump

Usually when a customer faces a short dump, you can see an application component assigned to the dump in the header.

 

This is usually the correct component for the incident, but not in all cases.
In order to identify the correct component, you should analyze the dump text description and check the programs and the code section where it stopped.


Check the function module/report or Transaction code component

This is usually the best method to identify the component responsible for working on the issue, as it shows the exact component responsible for that part of the code.
There are several ways of doing this; I'm going to explain the one I believe most people will have authorization for and that is easy to do:
1. Open the transaction where you are facing the issue and navigate to one step before reproducing it.
2. Go to the menu System -> Status.
3. Double-click on the Program (Screen) value. The code will open.
4. Go to the menu Goto -> Attributes. A small popup will open.

5. Double-click on the Package value. A new screen will open.
6. You will see the component in the Application Component field.

 


Checking some of the steps mentioned above should help you identify the correct component. However, there isn't a single formula for all issues; each issue has to be carefully interpreted to find the appropriate component.
Sometimes even we at SAP have a difficult time identifying which component is correct for an issue. That is why it is imperative that customers provide a clear description of the issue, with a concrete example and a description of the steps in the Reproduction Steps section.

 

 

Do you have another tip on how to identify the correct component? If so, please let me know in the comment section.

Delete Change Log With Custom ABAP Program



Problem Statement/Business Scenario


As a best practice, SAP suggests deleting the change log data in DSOs. But what if there is business logic in a transformation/update rule which consumes data from the change log? There isn't any standard process by which selective change log requests can be deleted, unless you delete them manually.
Consider a scenario where a staging DSO feeds data to 3 different cubes; two of them are delta-enabled and the third receives a full request every day with snapshot data. If the change log data for the full requests isn't deleted, it will grow exponentially in no time.

In this data model the staging DSO feeds data to multiple cubes, among them a snapshot cube loaded with full loads, and there is no reason to retain those change log requests. Even in the worst-case scenario of reloading the data from the staging DSO to the cube, the data in the change log would not be required.


 

 


Purpose
To reduce the change log there are two ways of doing it: either delete the requests manually, or use an ABAP program to delete the change log with selections (full requests). SAP provides the option to delete change log data via a process chain, but you cannot delete the requests selectively when the data is loaded to multiple data targets. To avoid all manual intervention, a custom ABAP report can be created to automate the process; it can be used as a weekly/monthly/yearly housekeeping activity.

ABAP Report Design
An ABAP program can be created to look up the requests in table RSSELDONE, which holds the request ID, load date, InfoPackage ID, system ID and update mode. Based on the InfoPackage ID, load date and update mode, all full requests can be identified and deleted from the change log table. The change log table name can be found via DSO > Manage > Contents > Change Log.
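A minimal sketch of such a report is shown below. The change log table name is an illustrative placeholder, and the RSSELDONE field names and the update-mode value 'F' for full requests are assumptions to be verified in SE11; change log tables carry a REQUEST column like any PSA table:

REPORT zdel_changelog_full_req.

" Generated change log table of the staging DSO (illustrative name)
PARAMETERS: p_chlog TYPE tabname DEFAULT '/BIC/B0000123000'.

DATA: lt_rnr   TYPE TABLE OF rsseldone-rnr,
      lv_rnr   TYPE rsseldone-rnr,
      lv_lines TYPE i.

" Select the full requests; add the load-date restriction (e.g. older
" than 30 days) on the date field of RSSELDONE per your retention rule
SELECT rnr FROM rsseldone INTO TABLE lt_rnr
  WHERE updmode = 'F'.

LOOP AT lt_rnr INTO lv_rnr.
  DELETE FROM (p_chlog) WHERE request = lv_rnr.
ENDLOOP.

COMMIT WORK.
lv_lines = lines( lt_rnr ).
WRITE: / lv_lines, 'full requests processed for', p_chlog.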

 


Example: if we have to retain full requests for 30 days only, the program should delete from the change log all data belonging to requests older than 30 days.


The requests themselves can be inspected in table RSSELDONE, which stores all the requests.

 

Thanks

Abhishek Shanbhogue

BI in organizations


Today, companies have experienced rapid growth in data flow, stored mainly in corporate information systems, and the need to obtain information from that data source for making strategic decisions is becoming more and more evident.

 

 

Information is the main source of knowledge. Business growth is defined by the way processes are carried out, and companies are systems composed of different modules: finance, logistics, commercial and production.

 

 

Each of the modules or departments of a company needs real-time information to make strategic decisions based on the data held in its corporate systems.

 

 

Business intelligence is not just an IT process; it is a fundamental component of the sustainable growth of a company, one that requires complete knowledge of the business processes.

SAP BW Upgrade: Pre and Post upgrade activities with answer to why?



Author(s): Saorabh Trilokinath Shivhare, Deepti Shetty.

Target readers: BW Consultant, BW Engineer, Basis Consultant.

 

Purpose of the document:

The scope of this document is to cover the pre- and post-BW-upgrade activities: what to do, how to do it, and why. Wherever an SAP Note is applicable it is mentioned, along with any other references. All points are based on our understanding and experience.


Introduction:

What is a BW upgrade?

Upgrading the BW server from the existing version to a higher version; in this scenario, from 7.0 to 7.4.


Why do we upgrade?

Most of the advanced features are available only in the higher version, e.g. semantically partitioned objects, or collecting the statistics of BEx queries in database tables.


Why should one read this document?

There is already plenty of material available on BW upgrade checklists that explains how to do each activity. This document also explains the reason behind each activity: along with the points below, read the last field of each item for the reason why.

Note:

1. Some steps may differ in the pre- and post-upgrade activities depending on the version being upgraded, the particular features to test, and the environment.

2. These steps assume a BW upgrade from 7.x to 7.4.

 

Pre upgrade activities:

Each item below lists: what to do, the SAP Note (where applicable), how to do it, and why we do it.

1. Execute program RSUPGRCHECK to check DDIC (Data Dictionary) consistency.
How: If it shows any inconsistency in an InfoObject, DSO, InfoCube or transfer rule, activate the inconsistent objects. Otherwise the upgrade may fail due to non-activated DDIC tables.
Why: DDIC objects are Data Dictionary objects: tables, views, data types, type groups, domains and search helps. The ABAP Dictionary describes and manages all data definitions used in the system. This report checks whether the DDIC tables needed for the BW metadata objects actively exist, and points out the incorrect objects in its log.

2. Get a list of all inactive objects.
How: Execute SE11 and select entries with OBJVERS = 'A' and OBJSTAT = 'INA'. The table name for each type of BW metadata object is listed in the appendix.
Why: The status of the metadata objects should be the same pre and post upgrade.

3. Clean up.
How: There is no single transaction code or program, as this is done in several separate activities: PSA cleanup, deleting log files, deleting objects such as InfoCubes and DSOs that are no longer required, and deleting aggregates that are empty.
Why: Housekeeping activity.

4. Stop the RDA (Real-Time Data Acquisition) daemon.
How: Use transaction RSRDA.
Why: To be done when daemon services are used in BW.

5. Remove temporary BI tables and check for invalid temporary tables and objects.
How: a) Check for invalid temporary tables in SE14 using the menu path Extras -> Invalid Temp. Tables. b) Execute program SAP_DROP_TMPTABLES in SE38. The following objects can be deleted:
• Temporary tables (type 01/0P)
• Temporary views (type 03/07)
• Temporary hierarchy tables (type 02/08)
• Temporary SID tables (06)
• Generated temporary reports
Why: Part of the housekeeping activity. These are DB objects such as tables, views, triggers and so on, with the /BI0/0 name prefix.

6. Check/repair the status of InfoObjects.
How:
1. Log on to the SAP system.
2. Call transaction RSD1.
3. Choose Extras -> Repair InfoObjects (F8).
4. Choose Execute Repair.
5. Choose Expert Mode -> Select Objects.
6. On the following screen, in addition to the default checkbox selection, activate the following checkboxes: Check Generated Objects, Activate Inconsistent InfoObjects, Deletion of DDIC/DB Objects, and Display Log.
7. Execute the program.
Why: This activity is performed so that there are no erroneous objects. We perform consistency checks on the data and metadata stored in a BW system; this tests the foreign key relationships between the tables of the extended star schema (ref. SAP Help).

7. Clean/delete the messages for error logs.
How: Run the reports RSB_ANALYZE_ERRORLOG and RSBM_ERRORLOG_DELETE.
Why: Housekeeping activity.

8. Check master data consistency.
How: Run report RSDMD_CHECKPRG_ALL.
Why: It reports inconsistencies/missing records in the master data tables, i.e. the P/X tables, and in the dimension tables of the cubes where the master data objects are used.

9. Clean up background jobs. (SAP Note 784969)
How: Use program RSBTCDEL2.
Why: Background jobs are scheduled for various activities; these can be V3 jobs and any other jobs.

10. Check table SMSCMAID. (SAP Note 1413569)
How: Before you start the Software Update Manager, check whether you use the table SMSCMAID. If so, see SAP Note 1413569 to avoid a termination of the upgrade due to duplicate records in the table.
Why: Table SMSCMAID is used for scheduling. If an index was added to table SMSCMAID, follow this note.

11. Check for missing TLIBG entries by running report RSTLIBG. (SAP Note 783308)
How: Run report RSTLIBG.
Why: This report lists inconsistencies in the existing InfoProviders.

12. Run SAP_INFOCUBE_INDEXES_REPAIR in SE38.
How: In SE38, execute SAP_INFOCUBE_INDEXES_REPAIR.
Why: To repair the indexes of the InfoCubes.

13. Clear all logistics data extractions and clear the delta queues.
How: Check in the source system; repeat for all source systems.
Why: Clear the delta queues and the logistics data extraction before the upgrade, as the underlying tables and fields of the extract structures may change post upgrade.

14. Check inactive transfer rules, InfoCubes, DSOs, aggregates etc. and list them.
How: For each DSO, check whether every request is in green status; before the upgrade, all requests in the DSOs should be green. Also check the status of the aggregates: according to the SAP recommendation they should be activated before the upgrade. Rolling up the aggregates is not necessary; that depends on the business requirements.
Why: Checked again post upgrade for reference only.

15. Repair inconsistent InfoPackages using RSBATCH.
Why: To check whether any InfoPackage in active state is inconsistent; if so, repair it before the upgrade.

16. Take pre-upgrade snapshots of critical BEx queries.
Why: For post-upgrade reference only; take screenshots of both the prompt selection and the output.

17. Convert the data classes of InfoCubes. (SAP Note 46272)
How:
1. Call transaction SE16 and check the table RSDCUBE.
2. Select OBJVERS equal to 'M' and 'A', and check the entries in the fields DIMEDATCLS, CUBEDATCLS, ADIMDATCLS and AGGRDATCLS; all InfoCubes are listed with their assigned data classes.
3. Compare the data classes with the naming conventions for data classes described in SAP Note 46272.
If you find incorrect data classes, correct them as follows:
1. Set up a new data class as described in SAP Note 46272.
2. Execute the RSDG_DATCLS_ASSIGN report. It allows you to transfer a group of InfoCubes from the old data class to the new data class and also assigns the InfoCubes to the right data class.
Why: We need to check whether the cubes are assigned to the correct data classes. Any DDART data classes (customer data classes) that do not follow the naming conventions described in SAP Note 46272 will be lost during the upgrade, after which the tables of the InfoCube cannot be activated; an error message is displayed in the technical settings in that situation.


Appendix:


Tables for step no. 2 (pre upgrade) and step no. 5 (post upgrade):

Table - BW metadata object
RSUPDINFO - Update rule
RSTS - Transfer rule
RSISN - InfoSource
RSDS - DataSource
RSDODSO - DSO
RSDCUBE - InfoCube
RSQISET - InfoSet
RSDIOBJ - InfoObject
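The check from step 2 (pre upgrade) and step 5 (post upgrade) can be scripted across all of these tables at once. A minimal sketch, assuming each listed table carries the OBJVERS and OBJSTAT columns used above:

REPORT zlist_inactive_bw_objects.

DATA: lt_tabs TYPE TABLE OF tabname,
      lv_tab  TYPE tabname,
      lv_cnt  TYPE i.

APPEND 'RSUPDINFO' TO lt_tabs.   " update rules
APPEND 'RSTS'      TO lt_tabs.   " transfer rules
APPEND 'RSISN'     TO lt_tabs.   " InfoSources
APPEND 'RSDS'      TO lt_tabs.   " DataSources
APPEND 'RSDODSO'   TO lt_tabs.   " DSOs
APPEND 'RSDCUBE'   TO lt_tabs.   " InfoCubes
APPEND 'RSQISET'   TO lt_tabs.   " InfoSets
APPEND 'RSDIOBJ'   TO lt_tabs.   " InfoObjects

LOOP AT lt_tabs INTO lv_tab.
  " Count objects that are active in version but inactive in status
  SELECT COUNT(*) FROM (lv_tab) INTO lv_cnt
    WHERE objvers = 'A' AND objstat = 'INA'.
  WRITE: / lv_tab, lv_cnt.
ENDLOOP.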

 

 

Post upgrade activities:

 

Each item below lists: what to do, the SAP Note (where applicable), how to do it, and why we do it.

1. Run the extractors in delta mode to verify any technical errors.
How: Do a delta run for any DataSource. The delta queue should be empty pre upgrade; once this data load starts, monitor this DataSource in the delta queue.
Why: To check that the delta loads happen correctly.

2. Basic application checks.
How: Test the different BW flows: a master data object flow, a process chain, and one transaction data flow from the source system through to the BEx query output.
Why: Test the different BW objects.

3. Sample data loads, by running critical master data and transaction data process chains.
Why: Test the transaction and master data loads.

4. Check the RFCs and source system connections.
How: Execute transaction RSA1 and replicate all the source systems.
Why: Test the RFC connections to all source systems.

5. Check transfer rules, InfoCubes and aggregates, and activate inactive objects.
How: To get the list of inactive objects, execute SE11 with OBJVERS = 'A' and OBJSTAT = 'INA'; the table name for each BW metadata object type is listed in the appendix.
Why: Compare the list of objects with the pre-upgrade list and activate the objects which got deactivated.

6. Activate the transfer structures.
How: Run the program RS_TRANSTRU_ACTIVATE_ALL.
Why: Activates the transfer structures.

7. Reactivate all active update rules.
How: Run the program RSAU_UPDR_REACTIVATE_ALL.
Why: Activates the update rules.

8. Reactivate all 7.x DataSources.
How: Run the program RSDS_DATASOURCE_ACTIVATE_ALL.
Why: Activates the DataSources.

9. Activate the MultiProviders.
How: Execute the program RSDG_MPRO_ACTIVATE.
Why: Activates the MultiProviders.

10. BEx query output validation.
How: Compare the pre-upgrade snapshots with post-upgrade snapshots of the BEx query output.
Why: Compare the BEx query output pre and post upgrade.

11. Correct operator syntax errors in custom programs.
How: Use the function module EXTENDED_PROGRAM_CHECK on the custom programs. See http://scn.sap.com/thread/547645

12. New functionality checks.
How: Create an SPO on a cube/DSO and load data into it; in our case it worked fine. See http://scn.sap.com/docs/DOC-34893
Why: Semantic partitioning is a new feature introduced with 7.4. By means of it, one can semantically partition based on a partition condition; the data gets stored in the data targets depending on the partition condition.

13. TADIR entries for fact views.
How: Execute program SAP_FACTVIEWS_RECREATE in SE38.
Why: TADIR is the table that maintains the directory of repository objects; this includes all dictionary objects, ABAP programs etc. If any fact views were dropped or missed during the upgrade, this program recreates them. It is part of upgrades of 3.x content; since 3.x flows are still used in most landscapes, the missing TADIR entry error may occur.

14. Ensure BI object consistency.
How:
1. Go to transaction RSRV.
2. RSRV is used to perform consistency checks on the data stored in BW.
3. Elementary and combined tests can be performed.
See http://wiki.scn.sap.com/wiki/pages/viewpage.action?pageId=153388081
Why: If some object got missed during the upgrade, this will list it. This needs to be done manually and can be done for any BW metadata object; it is best done for the important cubes, e.g. the cubes related to COPA.

15. Ensure that the aggregates and indexes of the cubes are active.
How: Ensure the aggregates and indexes of an InfoCube are maintained and active, as the case may be; this can be checked for one cube. See http://scn.sap.com/thread/1932810

16. Repair InfoObjects.
How:
1. Log on to the SAP system.
2. Call transaction RSD1.
3. Choose Extras -> Repair InfoObjects (F8).
4. Choose Execute Repair.
5. Choose Expert Mode -> Select Objects.
6. On the following screen, in addition to the default checkbox selection, activate the checkboxes Check Generated Objects and Activate Inconsistent InfoObjects.
7. Execute the program.
Why: This activity is performed so that there are no erroneous objects. We perform consistency checks on the data and metadata stored in a BW system; this tests the foreign key relationships between the tables of the extended star schema (ref. SAP Help).

17. Check the flag Record Statistics while maintaining the OLAP parameters in transaction RSRCACHE.
How: Click on Cache Parameters -> Record Statistics.
Why: This is an architectural change as part of BW 7.4; we need to do it so that we can collect the statistics of the BEx queries in the database tables.

18. Identify and repair inconsistencies in the PSA metadata. (SAP Note 1489064)
How:
1. Execute report RSAR_PSA_NEWDS_MAPPING_CHECK in transaction SE38.
2. Execute with the repair flag unchecked. This provides the list of all obsolete PSAs which are no longer used in a segmented DataSource.
3. Then execute with the repair flag checked. This inactivates all the obsolete PSAs which are no longer used in a segmented DataSource.
See http://scn.sap.com/thread/3214082
Why: Deletion of obsolete PSAs.

19. ABAP program correction for loading hierarchies. (SAP Note 1912874)
How: As part of BW 7.4, while loading a hierarchy using an InfoPackage one may come across the error CALL_FUNCTION_NOT_FOUND; we implemented SAP Note 1912874. The same error can occur while executing a BEx query. See https://scn.sap.com/thread/639887
Why: The originally consistent value returned by the DataSource was changed by a conversion routine that is not consistent with the conversion exit. Solution: check that the correct conversion routine is entered, correct the conversion routine or the data, or activate automatic conversion in the transfer rule.

20. Execute the Code Inspector. (SAP Note 1823174)
How: The program ZSAP_SCI_DELTA is implemented and executed in SE38.
Why: In the BW releases before 7.4, the maximum length of a characteristic value was limited to 60 characters. As of release 7.4 SPS2, up to 250 characters are possible. To achieve this, the domain RSCHAVL was changed from CHAR 60 to SSTRING 1333. As a result, data elements that use the domain RSCHAVL are no longer possible (syntax errors) or they cause runtime errors in customer-specific programs. The SAP Note describes solution options for certain problem classes.

 


Assumptions:
1. These steps assume a BW upgrade from 7.x to 7.4.

Acknowledgements:
The contents are generic and do not mention any business functions or technical BW metadata objects of a client source system.


References:
http://help.sap.com
http://wiki.scn.sap.com
http://scn.sap.com

The Power of BW - My Experience


I would like to share one of my experiences where I felt BW can do wonders and bring value to the customer. The views below are my personal views.

 

During a BW training session for end users, I got a very strange question to answer. In my presentation I had covered all the major reports and all the navigational functionality of the reports. The users were all happy, and for the first-time users it was a blockbuster movie. But this question was different from the normal lot. The question was: "How can I know the reason why my orders have gone down for a particular customer, and how are these reports going to help me increase my efficiency?" (My immediate thought was: do you expect the report to tell you that? You should know it anyway.)

 

It took me some time to absorb the question, and I had only one submission to make: these reports can tell you where you are now and whether you are on track or not, but they cannot answer why you are there.


After a few months I got a requirement for a new report on COPQ (cost of poor quality). The specification was a huge one, covering every aspect of the business where there was a possibility of incurring cost of poor quality. After a lot of deliberation and discussion, a workable specification was finalized with the business. It was impossible to capture all the parameters through the system. As the famous words of Albert Einstein go: "Not everything that can be counted counts, and not everything that counts can be counted."

 

As the specification cut across different modules, it had to be done in BW. This was when I realized that BW analytics can do wonders. I could relate it to the question previously asked of me about how reports can increase efficiency. During the development of this report I learnt that BW can not only provide you with the analysis of what has happened, but can also provide an indication of what has gone wrong. Of course there can be hundreds of reasons for that, but you can still provide this information, using data already available in the system, in a very readable format.

 

Let me explain with an example. If you are a sales manager and you see a downward trend in orders for a particular customer, what would you expect the report to provide? Firstly, perhaps the last 2 years' trend of how the customer has placed orders; then whether we were able to deliver material to the customer on time (on-time delivery analysis); who the sales people responsible for this customer are; how many rejections or returns we have had from that customer; how many times we have cancelled invoices; and whether the receivables are being taken care of - the list goes on. These data are available in the system in one form or another. We only have to pick them up and present them in the right way.

 

Why BW?

 

The main reason BW is needed is that only BW can bring all this information, spread across different modules, together and hold it in one place. Performance and easy navigation are the other reasons. BW can provide top-to-bottom analysis for a KPI, and this can easily be done through several methods, some of which are mentioned below:

 

  1. Web Application Designer - You can select a customer through a filter pane or drop-down box. The analysis can be kept in a different tab for each KPI.
  2. RRI - Report-to-Report Interface is a good option if the user wants to drill down from one report to another.
  3. APD - There can be calculations which you have to do on a monthly basis, or which would be difficult to calculate at run time.
  4. Dashboards - If Business Objects tools are used, the data can be represented more vibrantly.

      

With the right functional specification, modeling and report design, BW can add more value to the business. It can highlight the pain areas using the transaction data and give the right direction to the business.

 

This has been one of my experiences with BW where I realized what a wonderful tool BW is. I would like to know about experiences where you felt the same; please share them in the comments.

 

Best Regards

 

Gajesh


UNCAUGHT_EXCEPTION - PSA PARTITION REACHED MAXIMUM


Issue:

There are instances when batch jobs loading to data targets through the PSA fail or do not complete. On checking the tRFCs, we find tRFCs failing with errors.
On deeper analysis we find short dumps in ST22 for each and every data packet that comes into BW for this particular data load.

 


Looking at the source code extract in the dump, it becomes evident that the data load fails while creating a new partition.

 

On further research it was discovered that the PSA table had already reached the maximum number of partitions: the partition number for this particular DataSource's PSA had already reached 9999.

As per the SAP Note 1816538 - UNCAUGHT_EXCEPTION in PSA_PART_EXIST because of PSA Partitioning, adding partitions beyond 9999 is not permitted.

 

When a data load inserts records into the PSA, it first creates a new partition and then loads the data into that partition.
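To see how close a PSA already is to the limit without browsing SE16, the counter can be read directly. A minimal sketch, assuming the partition counter field in RSTSODS is named PARTNO (verify the field name in SE11 for your release):

REPORT zcheck_psa_partno.

" Technical name of the PSA as stored in RSTSODS
PARAMETERS: p_psa TYPE rstsods-odsname.

DATA: lv_partno TYPE rstsods-partno.

SELECT SINGLE partno FROM rstsods INTO lv_partno
  WHERE odsname = p_psa.

IF sy-subrc = 0.
  WRITE: / p_psa, 'current partition counter:', lv_partno.
  IF lv_partno > 9000.
    WRITE: / 'Warning: approaching the 9999 partition limit.'.
  ENDIF.
ELSE.
  WRITE: / 'No entry found in RSTSODS for', p_psa.
ENDIF.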

 

 

Solution:

 

There might be times when dropping all data from the PSA fixes the issue (i.e. resets the partition number in the table RSTSODS).

However, when dropping the data does not reset the partition number, we need to follow the procedure below:

 

1. Make a note of the partition number for the PSA in the development system.

2. Drop all data from the PSA and replicate the DataSource.

3. Activate the DataSource and capture it in a transport along with the dependent objects (update rules/transformations/transfer rules/communication structure).

4. Check the partition number for the PSA again in the RSTSODS table: it should have changed and been reset to a lower number.

5. Before moving the change to the QA system, make sure to drop all data from the PSA in QA and also make a note of the partition number in QA.

6. After successfully moving the change to QA, check the activation of the dependent objects and then check the partition number in the RSTSODS table: it should have been reset as well.

7. Follow the same procedure before transporting to production: drop the data from the PSA, move the transport, and then check the partition number in the RSTSODS table, which should have been reset.

Using Trello with BI Projects


Since I used Trellofor the first time I was impressed with it and I started to use it even to control my personal activities. For me the resources available and the usability are fascinating, not to mention the constant improvement in the product promoted.

As a BI consultant, I started using it to track my activities in the projects I work on. As each project has its particularities, I always try to adapt or create a specific Board for the project.

Today, with the help of lessons learned from the book "Efficient SAP NetWeaver BW Implementation and Upgrade Guide" (I recommend this book to everyone involved in BW projects), I created a Board following a few things discussed in the book.

I tried to keep the Board as succinct as possible, so I avoided creating a lot of details and obstacles in it, because, as I said, each project has its own specific needs and dynamics.


Link to example Board.

Capturar.JPG

Let's look at a summary of the Board:


  • Reference: Space for all the important and useful reference documentation for the project, such as SAP Notes, the project scope, etc.
  • Project Team: Here the project team is defined according to role and responsibility. It also contains the main contacts for each resource.
  • Analyze: Activities being analyzed by the BW team to survey requirements and create the necessary modeling documentation.
  • Modeling: Activities in the SAP BW development environment, creating objects in the various layers of the LSA.
  • Reporting: Activities in the SAP BO development environment, such as creating connections, Webi reports and dashboards.
  • Testing & Go-live: Activities that are already in the final phase of testing and preparation for going into production.
  • Project Management: Activities related to the management and monitoring of the project.


And you? Do you use, or have you used, Trello? What did you think of the Board? Leave a comment with suggestions and improvements so we can make it better.

Feel free to use it in your projects; it is public!

 

Links:

Trello Tour

Trello Development Board

Trello Resources

Attribute Change Run – Strategies and Parameters


An attribute change run (ACR) is the process of adjusting aggregates whenever there is a change in the master data used in those aggregates. In the BW data-loading process, the attribute change run plays a vital role after any master data attribute or hierarchy load, in order to get correct data in the reports.


Many times during our batch loads, we encounter a situation where the ACR gets stuck for a long time without any progress. As a result, all the other processes that use the same cube (e.g. delete overlapping requests, create/delete indexes) running in other process chains start failing due to the lock created by the ACR on that cube.

 

As a workaround, we need to follow the steps below to correct the failure.

  1. Identify and kill all the jobs and sub-jobs of that attribute change run. This can be done through SM37 or by using the program RSDDS_CHANGERUN_MONITOR in SE38.
  2. Deactivate the aggregates of the InfoCube for which the job was stuck, manually from RSA1.
  3. Repeat the attribute change run, either through the process chain or manually via RSA1 -> Tools -> Apply Hierarchy/Attribute Changes -> Monitor and Start Terminated Change Runs.
  4. Check that the change run job finishes (it should finish soon, as the aggregates are now deactivated).
  5. Repeat the other failed processes.
  6. Activate the deactivated aggregates (only after checking that no other process dependent on that cube is yet to finish, and considering the available time slot, as rebuilding the aggregates can take a lot of time).


But the point here is: why do ACR jobs get stuck for so long, and how can we avoid the failures and workarounds?


To adapt the aggregates to the changes, the change run works based on certain strategies and parameters.


Strategies to adapt Aggregates:

There are 3 different strategies used to adapt aggregates in a change run.

  1. Rebuild the aggregate (Adapt by Reconstruction)
  2. Delta Mode (Adapt by Delta)
  3. Rollup from previously adapted aggregate


Note: InfoCubes having key figures with aggregation MIN/MAX need to adapt their aggregates only by rebuilding them during the change run.

 

BW: Parameter for Aggregates

Parameters for aggregates can be set via the following path:

SPRO >> SAP Reference IMG >> SAP Customizing Implementation Guide >> SAP NetWeaver >> Business Intelligence >> Performance Settings >> Parameters for Aggregates. (Tcode : RSCUSTV8)
IMG1.png

 

The parameters defined for the aggregates determine the adaptation strategy to be used during the change run. Based on the threshold value and the percentage of master data changed, the reconstruction or delta strategy is chosen.

IMG2.png

  • Limit with Delta: Threshold Value (0-99): Delta -> Reconstruct Aggregates

The value defined here determines the aggregate adaptation strategy to be used for the change run. If the percentage change in master data is greater than the threshold value, the Adapt by Reconstruction strategy is used, which rebuilds the aggregates; otherwise the Delta mode is used, where the old records are updated negatively and the new records positively.
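
As a minimal sketch of that decision rule (all variable names and numbers are invented for illustration; they roughly reproduce the 11.98% case shown in the monitor example further below):

REPORT z_acr_strategy_sketch.

" Illustrative only: the decision rule applied by the change run, not
" SAP's internal code. Example numbers give 9067 * 100 / 75658 = 11.98%.

DATA: lv_changed TYPE i VALUE 9067,    " changed master data records
      lv_total   TYPE i VALUE 75658,   " total master data records
      lv_limit   TYPE i VALUE 10,      " 'Limit with Delta' from RSCUSTV8
      lv_pct     TYPE p LENGTH 8 DECIMALS 2.

lv_pct = lv_changed * 100 / lv_total.

IF lv_pct > lv_limit.
  WRITE: / 'Adapt by Reconstruction: rebuild the aggregate.'.
ELSE.
  WRITE: / 'Adapt by Delta: book old records negatively, new ones positively.'.
ENDIF.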

 

  • Block Size

If the E or F table of the source of the aggregate is larger than the BLOCKSIZE parameter in table RSADMINC, the source is not read all at once but is divided into blocks. This prevents an overflow of the temporary tablespace PSAPTEMP. A characteristic whose value range is divided into intervals is used to split the source into blocks: only data from one such interval is read from the source and written to the aggregate at a time.

 

If no value is maintained for the BLOCKSIZE parameter in Customizing, or if the value is 0, the default value of 100,000,000 is used (exception: DB6 = 10,000,000).

 

  • Wait Time for Change Run Lock (in Minutes)

The waiting period (in minutes) specifies how long a process waits when it encounters a lock created by another parallel process, such as loading hierarchies or master data, another change run, or an aggregate rollup.

If the system does not find a relevant lock, the change run waits the length of time specified here without creating its own lock.

 

As an example, the screenshot below from the change run monitor shows the changed and total records for the master data.
IMG3.png

Based on this, the percentage change in master data is calculated: 11.98 and 11.73 percent respectively.

This percentage is compared with the threshold value defined in the "Limit with Delta" parameter for aggregates (10 in this case).

 

As the "Limit with Delta" parameter set here is less than the percent of master data changes, cube X and Y uses Rebuild (Adapt by Reconstruction) strategy for adapting its aggregates.


The standard value of "Limit with Delta" is 20. However, setting this value depends entirely on the volume of changes that occur in the master data. It is recommended to keep the threshold value above the maximum percentage change expected in the master data, as rebuilding the aggregates can take an enormous amount of time, leading to the ACR running into a deadlock or getting stuck.

 

Many of you might already be aware of these concepts, but for those who run into errors and data load delays due to such issues, I hope this helps.

 

Regards,

Nikhil

Process step completion in PC using program


Process chain monitoring

ST13 – Analysis and Service Tools launch pad

Select the tool name BW-TOOL and execute it.

1.png

Select the process chain analysis option.

2.png

Click on execute

Enter the date and time as per the project standards.

3.png

Select the statuses green, yellow, and red, and execute.

4.png

Chain – technical name of the process chain

Log ID – generated automatically by the system when the PC is activated

Sub chains – how many subchains are included

Steps – how many steps were completed

Day – day of the week (Sun/Mon/...)

Date – start date

Time – the time the PC started

Run time – total time taken to complete the PC

End date – end date

End time – the time the PC completed

 

If a step fails

 

If a particular step in a process chain fails, we correct it and repeat the process so that the subsequent steps get executed.

When to use function module:

Even after correcting that particular step in the process chain, if the status does not turn green, then we have to set the status of that particular step to green manually so that the remaining steps in the process chain get executed.

In such cases, we have to use the function module approach.

Right-click on the particular step that failed and go to Display Messages, as below.

5.png

Go to the Chain tab.

6.png

Copy and paste the variant and instance into a notepad.

Go to transaction code SE11 in a separate session.

Enter RSPCPROCESSLOG as the database table and choose Display.

7.png

Go to Contents in the next screen, as below.

8.png

 

Enter the process variant and instance ID as below and execute.

9.png

We will get the full details of the particular step, such as the timestamp, job ID, and status, as below.

10.png
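
For reference, the same lookup can also be done with a small ABAP report instead of SE11/SE16. This is a sketch only: the field names VARIANTE and INSTANCE follow the usual RSPCPROCESSLOG layout, but verify them in SE11 before use.

REPORT z_show_pc_process_log.

" Hypothetical helper: read the log entry of a failed process chain
" step from RSPCPROCESSLOG. Verify the field names in SE11 first.

PARAMETERS: p_var  TYPE rspcprocesslog-variante OBLIGATORY,
            p_inst TYPE rspcprocesslog-instance OBLIGATORY.

DATA ls_log TYPE rspcprocesslog.

SELECT SINGLE * FROM rspcprocesslog
  INTO ls_log
  WHERE variante = p_var
    AND instance = p_inst.

IF sy-subrc = 0.
  WRITE: / 'Log ID:   ', ls_log-log_id,
         / 'Type:     ', ls_log-type,
         / 'State:    ', ls_log-state,
         / 'Job count:', ls_log-job_count.
ELSE.
  WRITE: / 'No entry found for this variant/instance.'.
ENDIF.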

Go to another new session and enter transaction code SE37 to call the function module.

11.png

Enter RSPC_PROCESS_FINISH as the function module and execute it.

We will get the screen below; fill in every detail on that screen, taking the RSPCPROCESSLOG entry as a reference.

12.png

13.png

Set the state to 'G – Successfully completed', so that we manually set the status of that particular step to green, and execute it.
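
The same step can be scripted. The sketch below calls RSPC_PROCESS_FINISH with the commonly used parameters, filled from the matching RSPCPROCESSLOG entry; the exact parameter list can vary by release, so check the function module interface in SE37 first.

REPORT z_finish_pc_step.

" Hedged sketch: mark a stuck process chain step as finished (green).
" Fill the values from the matching RSPCPROCESSLOG entry. The parameter
" names reflect the usual RSPC_PROCESS_FINISH interface; verify in SE37.

PARAMETERS: p_logid TYPE rspcprocesslog-log_id   OBLIGATORY,
            p_type  TYPE rspcprocesslog-type     OBLIGATORY,
            p_var   TYPE rspcprocesslog-variante OBLIGATORY,
            p_inst  TYPE rspcprocesslog-instance OBLIGATORY.

CALL FUNCTION 'RSPC_PROCESS_FINISH'
  EXPORTING
    i_logid    = p_logid
    i_type     = p_type
    i_variant  = p_var
    i_instance = p_inst
    i_state    = 'G'.   " G = successfully completed

WRITE: / 'Step', p_var, 'set to green in chain log', p_logid.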

Thanks,

Phani.

SAP BW on HANA 7.4 SP8 in Berlin


In Berlin I attended two SAP BW on HANA sessions. On Wednesday there was a session on the SAP BW product and roadmap by Lothar Henkes, and on Thursday an end-to-end scenario session, again with Lothar Henkes but mainly Marc Harz. Additionally, I started the openSAP course this week after TechEd && d-code, where I saw a familiar face… Hello Marc!

In this blog I share some of my thoughts on what I saw in Berlin. As I found some of the openSAP material quite important in reiterating what SAP BW is supposed to do, I have included a paragraph on that subject.

 

BW_Overview.jpg

Source: SAP.

 

The image above shows the main new things that have been developed in SAP BW 7.4 up to SP8.

In the first presentation the new developments were grouped into a couple of themes. As I was impressed at the time by the structured manner of the presentation, I will follow that structure and reference the other sessions.

SAP BW

But before we dive into the things I heard in Berlin, I will point to the openSAP course that started just a couple of days ago. As SAP BW is clearly going through some rapid changes, it was good to go back and look at what the goal of the application is. In one of the first slides of week 1 this overview was given:

opensap_overview.jpg
Source: SAP.

SAP BW is an application on top of a database. What it wants to do is help you manage the data warehouse.

As an application, BW basically lays an abstraction layer over the database. In the past, due to all kinds of technical constraints, BW felt more like a technical exercise to get performance, or to get it to work, period.

Now that HANA is doing the heavy lifting, BW seems to be getting its focus back on what it was originally meant to do: create a business layer over your database to make building a data warehouse easier.

 

You can find the course here: SAP Business Warehouse powered by SAP HANA - Marc Hartz and Ulrich Christ

Try it. It is free, and the first week looked very promising.

 

 

Virtual Data Warehouse

The virtual data warehouse is a layer that only works because of SAP HANA: only with SAP HANA do you get the performance needed to use virtual layers. What SAP BW delivers are ways to create virtual objects that leverage this technology. Using it, you can think of creating separate views for different departments without having to copy the data. Additionally, it creates more flexibility, as in the past reloading data was a big part of the time needed to get changes done.

BW_VIRT.jpg
Source: SAP.

In SP8 you have two main objects for your virtual view: the CompositeProvider and the Open ODS view. The latter is meant for virtual access to external sources.

The CompositeProvider looks like the main tool for modelling. It enables you to combine InfoProviders with JOINs (merging) or UNIONs (appending). You can even use other CompositeProviders as a source; note, however, that this is currently UNION only.

Basically this means that, in theory, you can store data only once and build virtual layer upon layer on top of it.

Personally I think that you will keep some kind of staging area around for when you don't know whether the source system is going to retain the data, use transformations to create a persistent single version of the truth (things like cleansing and checking the data), and go with virtual layers from there.

 

Simplification

The picture seems clear enough. From a large number of objects we go back to only a couple:

DS_simplification.jpg

Source: SAP.

 

 

I was really enthusiastic about this, and now, after a few days, I still am. However, I do need to warn you that there is still a lot of complexity hidden within the objects. The Advanced DataStore Object (ADSO), for example, has three checkboxes that can be set independently of each other. These checkboxes determine which of the three tables underneath the application layer will actually be used, which means you have 2^3 = 8 different setups to choose from. In the presentation there was a mention of templates for different situations, which should help. From an architecture point of view, you have to look at the options and determine which should be used in which circumstances.

All in all it looks good. In the end-to-end session, Marc Harz showed us a live demo of the CompositeProvider editor.

screenshot_composite.jpg

Source: SAP.

This looks a lot better than the old editors for MultiProviders. With the ability to use a CompositeProvider as the source for other CompositeProviders, you can create simple building blocks that together build your application.

 

Big Data

For big data management, SAP BW differentiates between three types of data based on the amount of usage: hot, warm and cold. Hot data is in HANA in memory, warm data is in HANA but on disk, and cold data is stored in near-line storage on separate servers.

This should help you achieve more efficient usage, as you only invest in expensive equipment for the hottest data and can keep a more modest budget for the rest.

BW_datamanagement.jpg

Source:SAP.

This image shows an example of how you could manage this. Basically you have different persistent objects that do or do not reside in memory. Based on usage, you move the less-used data to the warm objects. From these objects there is a flow to near-line storage based on age and/or usage.

 

Performance

To keep it short: run on HANA and hold on for dear life ;-)

Basically, SAP BW was a two-tier system that you had to manage carefully to keep performing. A lot of ABAP code was all about collecting a lot of data and changing it on the application layer. As a BW consultant you often used ABAP just to squeeze out a bit more performance; for example, before the improved master data lookup, you actually avoided the standard transformation and used ABAP in the start routine to collect a lot of data into a variable, so that in the transformation you could use an ABAP routine to read that variable.

 

Now with BW on HANA, processing gets pushed down from the application server to HANA. This means that for performance you are best off avoiding your own coding as much as possible: standard transformations can be pushed down to HANA; your own creations less so. For those, the old round trip to the application layer and back still applies.

 

In the presentation, note 2063449 was mentioned; it tells you what has been pushed down and what is still to come. As a rule of thumb, develop as if your code were already pushed down: eventually it will be, and if you did it the right way from the start, you won't have to redo it to get all the performance.

Planning

Here, too, a pushdown to HANA is taking place. PAK should now be feature-complete in comparison to BW-IP. Furthermore, FOX formula handling has been improved, and you can use a CompositeProvider for planning scenarios based on unions.

The fact that you can also enter comments is a very nice feature; Design Studio customers often ask for precisely this.

 

Conclusion

 

SAP BW is reinventing itself and focusing on its core function: offering an application or business layer over your database. HANA is the driving force behind this, providing the heavy lifting needed. In the future, more and more functions will be executed in HANA itself. I am just wondering how they will balance between the customers on HANA and those on other databases.
