RBMVSHOW - Manage Data Transfer by Direct Input


Description

You use this program to manage the transfer of data by direct input (DI). Three applications currently support this method:

  • FI documents
  • material master data
  • substance data for PP-SHE

The DI method works as follows: after performing the necessary checks, the application writes the data from an input file directly to the corresponding SAP tables. This is less circuitous than the batch input method, in which screens are processed in the background, and the DI method is therefore much faster.

To ensure data integrity, each DI program has a restart mechanism. Using a few function modules, it writes synchronization information to table TBIST. In the event of an abend, this ensures that the transfer of data can be restarted at the correct point, preventing, for example, documents from being posted twice or not being posted at all (provided no business errors occur; see below).

You may start the transfer of data by direct input only from RBMVSHOW (transaction BMV0). The transfer is always executed in the background. If you manipulate the data transfer directly using SM36/SM37, data integrity is no longer ensured.

Procedure

For each application, the Applications menu contains an option corresponding to each of the steps described here.

  1. Define a variant:
You must first define a variant with the appropriate parameters for the program reading the data (RFBIBL00 for FI documents, RMDATIND for material master data).
  2. Define the job:
To define the control data for a job, choose Job administration -> Define job. For each client, a job is uniquely identified by its name. You have to specify the report name (for example, RMDATIND) and a variant name. You can also specify a server for the background job. You can overwrite this server later. In addition, you can specify a user name under which the direct input job runs. This user then requires all authorizations necessary for the application.
  3. Start new job:
To start the job you have just defined (F4 on the first input field), choose Job administration -> Start new job. If you previously specified a background server or user name, you can change it here. On the following screens, you specify the print parameters and start time.
  4. Monitor data transfer:
To display the current transfer status, choose "Update" and the respective job (double-click). This enables you to follow what is happening. If the job terminates, for example, due to a database error, it is given the status "Background: job terminated". In this case, you analyze the error (by choosing "Display log") and eliminate the cause. You then select Job administration -> Restart job. The job is then restarted and the transfer of data continued at the point at which it terminated.
  5. Status of a completed transfer: There are two possibilities here:
If the data transfer was 100% successful, it is given status "C" for "Executed completely without (logical) errors".
If, however, LOGICAL errors, that is, business errors, occurred (for example, a required entry was not made), the status is "E" for "Finished though logical errors occurred". In this case, the way you proceed depends on the application:
  • In the case of FI documents, a batch input session has been created for the incorrect data. You can process this session in the foreground.

  • For material master records, an error file has been created that you can display and process using RMDATIND (or by choosing Applications -> Industry matl master -> Display log. errors). After processing this file (or, for example, correcting a setting in Customizing), you can initiate error correction by choosing Error processing -> Correct errors. The data is then read from the processed file and not from the original file. After making a corresponding check, the system attempts to post the data.

  • If you subsequently maintain the data online using the usual maintenance transactions, you can set the transfer to "Completed without errors" by choosing Error processing -> Set to "completed".

  6. Completed data transfers appear with a checkbox in the list and can be deleted. After completing a transfer, you can reuse the job name if, for example, the input file has been replaced and the other parameters remain the same.

The screen displaying the individual jobs is thus divided into two parts: the current (running or terminated) transfers always appear at the top, with the completed jobs below (recognizable by a checkbox that is used only to delete the entry). New jobs created with Job administration -> Define job do not appear in the display; it shows only

  • current jobs (background job running/planned/ready)

  • terminated jobs (background job terminated)

  • completed jobs (background job completed)

As of Release 3.0F, you can use the History pushbutton to obtain a history of the transfers made for a job definition. On the following screen, you can display details or the job log. Jobs started before upgrading to Release 3.0F are displayed incompletely.

Periodic Jobs

If you want to operate a periodic interface by means of direct input, please proceed as follows:

  • In the job definition, mark the job as "Allowed periodically".
  • Schedule the job by choosing Start data transfer by direct input.

If you want to include additional steps in the background job, save the start dates for the job by choosing Start data transfer by direct input. Choose a time in the future as the starting time. After saving your data, call up transaction SM37 and edit the step list for the job. The job name is that from the job definition.

Important

With periodic scheduling, check regularly whether the job has completed correctly. If a job terminates, the successor job will not start. By choosing History, you can check which jobs have been started periodically for a job definition. If a job does not start because its predecessor terminated, this is indicated by "Invalid" in the displayed list.
If you want to restart a terminated job, be sure to check that the input file is still the correct one. With periodic jobs, the input file is generally exchanged periodically. If you choose Restart for a terminated job and the input file has been exchanged in the meantime, data integrity will very likely be compromised.

Expert Menu

You can use the expert menu to manipulate any job in almost any way you like. As the name suggests, you should know precisely what you are doing before you try out one of the menu options since you will otherwise jeopardize data integrity. NEVER manipulate a running transfer since the synchronization table is not locked, posing the risk of a lost update.

You have the following options:

  • cancelling a running background job
  • changing the status of a job (very risky)
  • changing various control data:
  • changing the server on which the job is run when next restarted

  • changing the current record number (very risky)

  • changing the external number (this is merely for information and can therefore be manipulated at will)

  • changing the number of logical errors and the last incorrect number

  • moving jobs from the category "Current" to the category "Completed" and vice versa (possibly very risky)
  • switching on/off the (fairly frequent) confirmation prompts
  • presetting print parameters if you wish to run several jobs with identical print parameters

Authorization Check

Authorization object S_BDC_MONI is checked, amongst others.

Field BDCGROUPID is checked against the job name; field BDCAKTI is checked against the following values:

  • ABTC when starting, restarting, etc. jobs
  • DELE when deleting entries
  • REOG when manipulating jobs (expert menu)
  • LOCK when unlocking a job

If you want to start a job under another user name, you require authorization S_BTCH_NAM. The characteristic value required in field BTCUNAME is the user name under which the job is to run.
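
A minimal sketch of the corresponding checks in ABAP; the variable names and their typing are illustrative, and the value passed in BDCAKTI depends on the operation (see the list above):

DATA: lv_jobname  TYPE btcjob,       " job name from the job definition (illustrative)
      lv_username TYPE syuname.      " user name under which the job is to run (illustrative)

* Monitoring authorization: job name and activity (here ABTC = starting/restarting jobs)
AUTHORITY-CHECK OBJECT 'S_BDC_MONI'
  ID 'BDCGROUPID' FIELD lv_jobname
  ID 'BDCAKTI'    FIELD 'ABTC'.
IF sy-subrc <> 0.
* Not authorized to start or restart this job.
ENDIF.

* Authorization to run the job under another user name
AUTHORITY-CHECK OBJECT 'S_BTCH_NAM'
  ID 'BTCUNAME' FIELD lv_username.
IF sy-subrc <> 0.
* Not authorized to run jobs under this user name.
ENDIF.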

You also require all Basis Component authorizations necessary to run jobs in background mode.

Technical Documentation

If you want to write a data transfer program yourself that uses function modules and that can be restarted after an interruption, be sure to note the following:

  • For the data transfer program to be reliably restarted, it is essential for the COMMIT points to be known precisely.
    The synchronization function module (BI_END_AKT_NUMBER, see below) must be called up directly before each COMMIT. There must be no implicit COMMITs, for example, due to a screen change resulting from a CALL TRANSACTION.
  • You must start your self-written data transfer program only using transaction BMV0 (or by calling up a suitable function module (see below)). If it is started online or via transaction SM37, it must either terminate or at least issue a warning that it cannot be restarted.

You can use program RMDATTST as a reference for programming. It contains all necessary function module calls. The idea is to write synchronization points. The data transfer program itself manages a counter, which is stored as a synchronization point via BI_END_AKT_NUMBER before the Commit.

The synchronization point (that is, the counter status) is used to find the correct place for a restart after a program termination (that is, the place from which the update must resume). Increment the counter at sufficiently fine intervals so that this place can always be identified exactly.
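
The following is a minimal sketch of this pattern, modeled on the description in this section rather than on the actual function module interfaces; apart from ACTION, JOBID and JOBNAME, the parameter names (and helper names such as gt_records and lv_count) are assumptions, and RMDATTST remains the authoritative reference:

DATA: gt_records   TYPE STANDARD TABLE OF string,  " input records (illustrative)
      gs_record    TYPE string,
      lv_jobid(8)  TYPE c,           " 8-digit ID delivered by BI_GET_STARTING_NUMBER
      lv_action(1) TYPE c,           " ' ' first start, 'R' restart, 'E' error correction
      lv_start     TYPE i,           " last synchronization point (counter status)
      lv_count     TYPE i,           " counter managed by the data transfer program
      lv_errors    TYPE i.           " logical errors in the current COMMIT unit

* Determine where to (re)start; STARTING_NUMBER is an assumed parameter name
CALL FUNCTION 'BI_GET_STARTING_NUMBER'
  IMPORTING
    jobid           = lv_jobid
    action          = lv_action
    starting_number = lv_start.

LOOP AT gt_records INTO gs_record.
  ADD 1 TO lv_count.
  IF lv_count <= lv_start.
    CONTINUE.                        " already posted before the termination
  ENDIF.

* ... check and post gs_record here; count logical errors in lv_errors ...

* Write the synchronization point immediately before the COMMIT
  CALL FUNCTION 'BI_END_AKT_NUMBER'
    EXPORTING
      jobname    = lv_jobid          " the JOBID is passed under JOBNAME
      akt_number = lv_count          " assumed name for the current counter status
      log_errors = lv_errors.        " assumed name for the number of logical errors
  COMMIT WORK.
ENDLOOP.

* Set the transfer to "completed" (error totals and further parameters omitted)
CALL FUNCTION 'BI_CLOSE_ENTRY'
  EXPORTING
    jobname = lv_jobid.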

  • Function module BI_GET_STARTING_NUMBER
This must be the first function module you call. It delivers the last counter status written with BI_END_AKT_NUMBER. When you restart (parameter ACTION = 'R'), an update can be resumed precisely for this counter status.
New as of Release 3.0F: The function module no longer has any IMPORT parameters. Under JOBID, you get an 8-digit ID which identifies the job's data. Error data should be stored under this ID. You should also write a message with this ID to the job log.
The parameter ACTION supplies:
' ': start for the first time
'R': restarted
'E': restarted to correct errors

  • Function module BI_END_AKT_NUMBER
This function module is called up after processing a COMMIT unit. It internally notes the current counter status. It must be called up immediately before a COMMIT. The number of logical errors in this step and the JOBID under JOBNAME (delivered by BI_GET_STARTING_NUMBER) are passed to this function module.

  • Function module BI_CLOSE_ENTRY
After all records have been processed, this function module is called up. It sets the processing status to "completed" and notes whether logical errors occurred.
Logical errors are business errors, for example, a required entry was not made, so the data could not be updated. The data records concerned are then logically incorrect. You should include a function that can be used to display these records.
In the case of FI documents (RFBIBL00), a batch input session is, for example, created from the incorrect records. The user can later process it in the foreground using SM35.
By contrast, in the case of material master records (RMDATIND), there is a maintenance tool for the recovery of incorrect records, which are saved by means of EXPORT TO DATABASE. If RMDATIND is then called up again for error recovery, function module BI_GET_STARTING_NUMBER recognizes and reports this. RMDATIND imports the data with IMPORT FROM DATABASE and attempts to correct it (see the sketch below). When the errors have been corrected, either function module BI_ERRORS_SOLVED or BI_ERRORS_CLOSE is called up. It informs the job management of the number of errors still left.
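
A minimal sketch of this export/import technique for incorrect records, stored under the JOBID; the record structure, the cluster table INDX and the area 'DI' are assumptions (RMDATIND uses its own structures):

TYPES: BEGIN OF ty_bad_rec,          " illustrative structure for an incorrect record
         line(255) TYPE c,
       END OF ty_bad_rec.

DATA: gt_bad_records TYPE STANDARD TABLE OF ty_bad_rec,
      ls_indx        TYPE indx,      " administrative fields of the cluster record
      lv_jobid(8)    TYPE c.         " ID delivered by BI_GET_STARTING_NUMBER

* Save the incorrect records under the job ID
EXPORT bad_records = gt_bad_records
  TO DATABASE indx(di) FROM ls_indx ID lv_jobid.

* During error correction (ACTION = 'E'), read them back and attempt to post them again
IMPORT bad_records = gt_bad_records
  FROM DATABASE indx(di) TO ls_indx ID lv_jobid.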

If you want to avoid users having to use transaction BMV0, you can use function module BI_START_JOB to start a data transfer job instead. It checks whether the requirements are met for a job to be started (for example, that a job with the same name is not already running) and fills the control tables accordingly. However, if a job terminates, BMV0 must be used to restart it.
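
If you start jobs programmatically, the call could look like the following sketch; only the existence of BI_START_JOB is described here, so the parameter name JOBNAME and the exception handling are assumptions that must be checked against the actual interface of the function module:

DATA lv_jobname TYPE btcjob.         " name of the job defined in BMV0 (illustrative)

* Start the data transfer job; BI_START_JOB checks the prerequisites and fills the control tables
CALL FUNCTION 'BI_START_JOB'
  EXPORTING
    jobname = lv_jobname             " assumed parameter name
  EXCEPTIONS
    OTHERS  = 1.
IF sy-subrc <> 0.
* Prerequisites not met, for example a job with the same name is already running.
ENDIF.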






This documentation is copyright by SAP AG.
