ABAP Dataflow & Transports in BODS

This post introduces the SAP BusinessObjects Data Services objects for extracting data from SAP application sources:

• SAP Applications datastores
• ABAP data flows
• Transports

Note:

The procedures in this section require that the software has the ability to connect to an SAP server.
The sample tables provided with Data Services that are used in these procedures do not work with all versions of SAP because the structure of standard SAP tables varies in different versions.

Defining an SAP Applications datastore

1. In the local object library, click the Datastores tab.
2. Right-click inside the blank space and click New.
The "Datastore Editor" dialog box opens.
3. In the Datastore name field, type SAP_DS.

This name identifies the database connection inside the software.
4. In the Datastore type list, click SAP Applications to specify the datastore connection path to the database.
5. In the Application server field, type the name of the remote SAP Applications computer (host) to which Data Services connects.
6. Enter the user name and password for the SAP Applications server.
7. Click OK.
Data Services saves your SAP application database connection to the metadata repository.

To import metadata for individual SAP application source tables

You will now import metadata from SAP application tables into Data Services.
1. In the object library on the Datastores tab, right-click SAP_DS and click Import by Name.
2. In the Type list, click Table.
3. In the Name box, type KNA1.
4. Click OK.
5. Repeat steps 1 through 4 for the other SAP tables you will need:
• MAKT
• MARA
• VBAK
• VBAP
• VBUP
6. In the object library, expand SAP_DS, then expand Tables to display the imported SAP tables.

Repopulating the customer dimension table

You will now repopulate the customer dimension table from SAP. First you will define a data flow that calls an ABAP data flow to extract SAP data and load it into the customer dimension table.

Adding the SAP_CustDim job, work flow, and data flow

To add the job
1. Open the project.
2. Right-click the project name and click New Batch Job.
3. Rename the job JOB_SAP_CustDim.

To add the work flow
1. Open JOB_SAP_CustDim by clicking on its name in the project area.
2. Add a new work flow and name it WF_SAP_CustDim.

To add the data flow
1. Click the WF_SAP_CustDim work flow name to open it.
2. Add a new data flow to the work flow definition and name it DF_SAP_CustDim.
3. Click the name of the data flow to open its definition.

Defining the SAP_CustDim data flow
So far, you have followed similar procedures as in previous sections. Now, however, because you are loading data from an SAP application, the procedure to define the data flow is somewhat different.
The data flow for building the dimension table will consist of the following objects:
• An ABAP data flow to read data from an SAP table and perform any other operations that need to happen in SAP.
• The target table into which the customer dimension data loads.

To name the source and target tables

1. Ensure the DF_SAP_CustDim data flow is open in the workspace.
2. Click the ABAP data flow icon on the tool palette.
3. Click in the workspace to place the ABAP data flow.
A window appears prompting you for the ABAP data flow properties.
4. On the Options tab, complete the following fields:
a. Datastore: Click SAP_DS.
b. Generated ABAP file name: Specify the name of the file that will contain the ABAP code generated by the software. This file will be created in the Generated ABAP directory specified in the SAP_DS datastore.
c. ABAP program name: This is the ABAP program name that will be uploaded into SAP applications. It must begin with the letter Y or Z and not exceed eight characters.
d. Job name: Type SAP_CustDim. This is the job name that will run in SAP applications.

5. Click OK.
A representation of the ABAP data flow appears in the workspace and the project area.
6. Name the data flow DF_ABAP_CustDim.
7. In the object library, click the Datastores tab and for the Target_DS datastore, expand the Tables list.
8. Drag the CUST_DIM table into the workspace, drop it to the right of the ABAP data flow DF_ABAP_CustDim, and click Make Target from the shortcut menu.

Defining the ABAP data flow

Before all of the objects necessary to populate the customer dimension table are present, you must add the remaining components inside the ABAP data flow and connect them.
At this point you must:
• Designate the source table named KNA1
• Define the query that specifies what data to extract
• Define the data transport object into which SAP applications writes the resulting data set
• Define the order of execution

Define the column properties in the query by mapping the KNA1 columns you need from the input schema to the output schema.
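
To make the extraction step concrete, here is a minimal hand-written ABAP sketch of roughly what the generated program does at this point: read the mapped KNA1 columns into an internal table. The program name ZCUSTDIM and the column list (KUNNR, NAME1, ORT01, LAND1) are illustrative assumptions; the actual program is generated by Data Services from the data flow definition and will differ.

* Illustrative sketch only: the real program is generated by Data Services.
* The program name and column list are assumptions for this example.
REPORT zcustdim.

TYPES: BEGIN OF ty_cust,
         kunnr TYPE kna1-kunnr,   " customer number
         name1 TYPE kna1-name1,   " customer name
         ort01 TYPE kna1-ort01,   " city
         land1 TYPE kna1-land1,   " country key
       END OF ty_cust.

DATA lt_cust TYPE STANDARD TABLE OF ty_cust.

START-OF-SELECTION.
  " Read only the columns that are mapped in the query transform.
  SELECT kunnr name1 ort01 land1
    FROM kna1
    INTO TABLE lt_cust.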

To define the details of the data transport

A data transport is a step in an ABAP data flow that defines a target in which to store the data set extracted during the flow. A data transport defines a staging file for data extracted from SAP applications.
1. In the workspace, click the name of the data transport to open the ABAP Data File Option Editor.
2. In the File Name box, type the output file name cust_dim.dat.
This is the name of the file that will store the data set produced by the ABAP data flow. The full path name for this file is determined by concatenating the Data Services path to the shared directory field (on the Properties tab of the datastore definition, specified when you created the SAP_DS datastore) with the file name supplied here.
3. Click the Replace File option.
This truncates the file each time the data flow executes.
4. Click the Back arrow in the icon bar to return to the data flow.
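
Continuing the sketch above, the data transport corresponds to the file-writing part of the generated program: the extracted rows are written to a flat file in the directory configured for the datastore, and Replace File means the file is re-created on each run. The literal path and the comma delimiter below are assumptions for illustration.

* Illustrative sketch only, continuing the KNA1 example above.
* The real path comes from the datastore's working/shared directory
* setting; the literal value and the comma delimiter are assumptions.
DATA: lv_file TYPE string,
      lv_line TYPE string,
      ls_cust TYPE ty_cust.

lv_file = '/usr/sap/trans/cust_dim.dat'.

" FOR OUTPUT re-creates the file, so each run starts with an empty file
" (the behavior selected by the Replace File option).
OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
LOOP AT lt_cust INTO ls_cust.
  CONCATENATE ls_cust-kunnr ls_cust-name1 ls_cust-ort01 ls_cust-land1
    INTO lv_line SEPARATED BY ','.
  TRANSFER lv_line TO lv_file.
ENDLOOP.
CLOSE DATASET lv_file.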

Validate and execute the job. Below are errors that you may encounter in an ABAP data flow:

Error: Cannot open ABAP output file
Probable cause: Lack of permissions for the Job Server service account.
Solution: Open the Services Control Panel, double-click the Data Services service, and select a user account with permissions to the working folder on the SAP server.

Error: Cannot create ABAP output file
Probable cause: The working directory on the SAP server is specified incorrectly.
Solution: In the object library on the Datastores tab, right-click the SAP datastore and click Edit. Review the Working Directory on SAP Server box and edit it if necessary. Verify the path by copying it, pasting it into the Start > Run dialog box, and executing; this should open a window to the working directory on the SAP server.

Below is a sample data flow that joins two SAP tables (MARA and MAKT) to populate material master data in Data Services.

The resulting relationship appears in the From clause text box:
(MARA.MATNR = MAKT.MATNR)

In the Smart Editor, type the following text. Use all uppercase, as shown. This statement filters the material descriptions by language and brings only the English-language material records into the target.

AND (SPRAS = 'E')
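
If it helps to see the same logic outside the Designer, the join on MATNR together with the language filter corresponds roughly to the following Open SQL; the program name and the selected columns are assumptions for illustration.

* Illustrative Open SQL approximation of the MARA/MAKT join with the
* language filter; the program name and selected columns are assumptions.
REPORT zmtrldim.

TYPES: BEGIN OF ty_mtrl,
         matnr TYPE mara-matnr,   " material number
         mtart TYPE mara-mtart,   " material type
         maktx TYPE makt-maktx,   " material description
       END OF ty_mtrl.

DATA lt_mtrl TYPE STANDARD TABLE OF ty_mtrl.

START-OF-SELECTION.
  SELECT mara~matnr mara~mtart makt~maktx
    FROM mara INNER JOIN makt ON mara~matnr = makt~matnr
    INTO TABLE lt_mtrl
    WHERE makt~spras = 'E'.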

Map the source columns to the output columns, rename the target columns, verify data types, and add descriptions.

To define data transport details

1. Click the name of the data transport object to open the ABAP Data File Option Editor.
2. Enter the output file name in the File Name box:
mtrl_dim.dat
This is where the data set produced by the data flow will be stored.
3. Click Replace File.
This truncates the file each time the data flow runs.
4. Click the Back arrow in the icon bar.

Validate the data flow and execute the job.

Defining a transactional data flow

The instructions for building the fact table data flow are similar to those in the previous section's exercise.
The data flow consists of the following objects:
• An ABAP data flow to read data from an SAP application table and perform any other operations that need to happen in the SAP database
• The target table into which the sales fact data loads

To name the source and target tables

1. Click the name of the data flow to open the data flow definition.
2. Click the ABAP data flow icon in the tool palette.
3. Click a location in the workspace to place the data flow.
A dialog box appears prompting you for properties.
4. Complete the following fields:
a. Datastore: Click SAP_DS.

b. Generated ABAP file name: Specify the name of the file that will contain the ABAP code that the software generates. This file will be created in the Generated ABAP directory specified in the SAP_DS datastore (for example SalesFact.aba).
c. ABAP program name: This is the ABAP program name that will be uploaded into SAP. It must begin with the letter Y or Z and not exceed eight characters (for example zSaleFac).
d. Job name: This is the job name that will run in the SAP application. Rename it SAP_SalesFact.
5. Click the General tab and name the data flow DF_SAP_SalesFact.
6. Click OK.

A representation of the ABAP data flow appears in the workspace and project area.

7. In the object library, click the Datastores tab and for the Target_DS datastore, expand the Tables list.
8. Drag the SALES_FACT table into the workspace, drop it to the right of the ABAP data flow DF_SAP_SalesFact, and click Make Target from the shortcut menu.
9. Connect the right side of DF_SAP_SalesFact to the left side of the SALES_FACT table.

Defining the ABAP data flow

Next you will designate two source tables named VBAP and VBAK.
Then you will:
• Define the query that joins the two tables and determines which columns to read data from
• Define a column to reflect a value in another table using a lookup
• Define the data transport
• Define the order of execution

To define query details including the join between source tables

1. Open the query editor.
2. In the FROM tab, in the Join pairs area, select VBAP from the Left drop-down list.
3. Select VBAK from the Right drop-down list.
The relationship between the VBAP and VBAK sales tables is defined by the common column VBELN, which is the sales document number. The Propose Join option specifies that relationship based on the primary keys.
The resulting relationship is the following:

VBAP.VBELN = VBAK.VBELN

4. Click the smart editor icon.
5. In the Smart Editor, type the following text. Use all uppercase, as shown. This statement filters the sales orders by date and brings one year's sales orders into the target.

AND ((AUDAT >= '19970101') AND (AUDAT <= '19971231'))

6. Click OK.
7. Map source columns to target columns by dragging each column from the source schema (Schema In) to the target schema (Schema Out).
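
As with the material master example, the join on VBELN and the one-year AUDAT filter defined in the steps above correspond roughly to the following Open SQL; the program name and the selected columns are assumptions for illustration.

* Illustrative Open SQL approximation of the VBAP/VBAK join restricted
* to one year of sales orders; program name and columns are assumptions.
REPORT zslsfact.

TYPES: BEGIN OF ty_sales,
         vbeln TYPE vbak-vbeln,   " sales document number
         audat TYPE vbak-audat,   " document date
         posnr TYPE vbap-posnr,   " item number
         matnr TYPE vbap-matnr,   " material number
       END OF ty_sales.

DATA lt_sales TYPE STANDARD TABLE OF ty_sales.

START-OF-SELECTION.
  SELECT vbak~vbeln vbak~audat vbap~posnr vbap~matnr
    FROM vbap INNER JOIN vbak ON vbap~vbeln = vbak~vbeln
    INTO TABLE lt_sales
    WHERE vbak~audat >= '19970101'
      AND vbak~audat <= '19971231'.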

The sales fact table contains a column named ord_status that indicates an order's status. The order status value comes from a table that is not one of the input tables. To implement this, add a column to the output table and map its value by using a function call that looks up the order status value.

To add a column that reflects a value in another table using a lookup

1. In the query editor, right-click the target schema query name and click New Output Column.
2. Define the column as follows:
Name: ord_status
Data type: varchar(1)
Description: Order item status
Click OK.
3. The order item status comes from a value in the VBUP table. To indicate the correct row and value to populate the ord_status column, you define a lookup function in the Mapping text box.
Click the ord_status column in the target schema.
4. On the Mapping tab, click Functions.
5. In the Function editor, click Lookup_Function from Function Categories.
6. Click lookup from the Function Name pane.
7. Click Next.
8. Type the values to define the lookup.
The value for the ord_status column comes from the GBSTA column in the VBUP table. The value in this column indicates the status of a specific item in the sales document. Determining the correct
order item status requires both an order number and an item number.

The function editor provides fields for only one dependency. Use the values in the table below to complete the function editor. Later steps explain how to modify the result to include the second (or additional) dependency.
The LOOKUP function is case sensitive; use the exact cases listed in the following table.

Translate table: SAP_DS..VBUP (the table in which to look up values; for non-SAP tables, you would specify a table owner between the datastore and table names)
Result column: GBSTA (the column from VBUP containing the value that you want to list in the target)
Default value: 'none' (the value used if the lookup isn't successful; not used for SAP applications)
Cache spec: 'NO_CACHE' (whether to cache the table)
Compare column: VBELN (the document number in the lookup table)
Expression: VBAK.VBELN (the document number in the input, or source, schema)

9. Click Finish.
The LOOKUP expression displays in the Mapping text box:

lookup(SAP_DS..VBUP, GBSTA, 'none', 'NO_CACHE', VBELN, VBAK.VBELN)

10. Add lookup values to the mapping expression.
The LOOKUP function can process any number of comparison value pairs. To include the dependency on the item number to the lookup expression, add the item number column from the translation table and the item number column from the input (source) schema as follows:

POSNR, VBAP.POSNR

The final lookup function is as follows:

lookup(SAP_DS..VBUP, GBSTA, 'none', 'NO_CACHE', VBELN, VBAK.VBELN, POSNR, VBAP.POSNR)

11. Click the Back arrow in the icon bar.
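
Functionally, the two comparison pairs make the lookup behave like a keyed read on VBUP by document number and item number. The following rough Open SQL illustration is only meant to clarify the semantics; lv_vbeln and lv_posnr stand for the values of the current source row, and the real lookup is evaluated by Data Services from the mapping expression above.

* Illustrative only: what the two-pair lookup amounts to for one row.
* lv_vbeln and lv_posnr stand for the current source row's key values.
DATA: lv_vbeln      TYPE vbak-vbeln,
      lv_posnr      TYPE vbap-posnr,
      lv_ord_status TYPE vbup-gbsta.

SELECT SINGLE gbsta
  FROM vbup
  INTO lv_ord_status
  WHERE vbeln = lv_vbeln
    AND posnr = lv_posnr.
IF sy-subrc <> 0.
  " No matching row: Data Services would apply the lookup's default value.
  CLEAR lv_ord_status.
ENDIF.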

To define data transport details

1. Click the name of the data transport object to open its editor.
2. Enter the output file name in the File name box:

sales_fact.dat

3. Click Replace File.
This truncates the file each time the data flow runs.
4. Click the Back arrow in the icon bar.

Validate and execute the job.

New Terms and Their Meanings

ABAP data flow - A Data Services object that generates ABAP to extract data from SAP application sources.
An ABAP data flow can only be defined in a data flow definition.

Data transport - A Data Services object that stores the output of an ABAP data flow.

 
