Quality Assurance and Testing Services

Quality Assurance and Testing Services (QATS) reduces the time and effort in testing so you can get your product to market and exceed customers’ expectations.

ERP Testing

Enterprise Resource Planning Testing | Packaged Application Testing | Business Intelligence Testing.

Agile Software Testing

Agile QA Testing, Agile Test Tools, Agile Testing, Agile Life Cycle, Agile Testing Methodology, Agile Performance Testing.

Mobile Application Testing

Mobile Application Testing, Mobile Testing Tools, Mobile Software Testing, Mobile Cloud Testing.

QATS Strategy Consulting

Testing Strategy | Testing Process Optimization | Testing Automation Consulting | Strategy Consulting


Wednesday, 4 September 2013

Data Staging for SAP Conversion

Purpose

This white paper lists and describes the steps needed to perform data conversion for SAP from the staging point of view.

Goals and Objectives

The goal is to support staging developers during the data conversion process and help them verify the converted data before it goes into SAP.

Pre-requisites for Developers


Developers working on the staging part should have the following prerequisites:
  • Knowledge of Oracle
  • SQL and PL/SQL
  • SQL Loader utility
  • Knowledge of MS Excel
  • Familiarity with TOAD
Overview

The data conversion process involves processing the data received from legacy applications/systems and transforming it into the desired SAP output structure as per the specifications provided.

Functional specification documents provide details about the business rules that are to be applied to the data before it goes into SAP. The staging team applies those business rules by writing stored procedures in Oracle.
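The business rules themselves live in Oracle stored procedures; purely as an illustration, a minimal Python sketch of the idea (the field names and rules below are hypothetical examples, not from the actual conversion) might look like this:

```python
# Illustrative sketch only: in the actual process this logic lives in
# Oracle stored procedures. Field names (MATNR, WERKS) and the rules
# are hypothetical examples.

def apply_business_rules(legacy_record):
    """Transform one legacy record into an SAP-ready structure,
    or raise ValueError so the record can go to the exception report."""
    material = legacy_record.get("MATNR", "").strip()
    if not material:
        raise ValueError("Missing material number")
    return {
        # Example rule: left-pad the material number to a fixed width
        "MATERIAL": material.zfill(18),
        # Example rule: plant codes are upper-cased
        "PLANT": legacy_record.get("WERKS", "").upper(),
    }

record = {"MATNR": "4711", "WERKS": "ca01"}
print(apply_business_rules(record))
```

Records that fail a rule raise an error instead of being silently dropped, which feeds the exception reporting described later in this paper.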

Data conversion process

Steps to be followed
  • Understanding and analyzing the requirements of conversion
  • Development of procedures for conversion
  • Import process
  • Processing data
  • Versioning
  • Exception Reports
Understanding and analyzing the requirements of conversion
  • Understanding the business rules (from the specs) to be applied and analyzing the data received from legacy systems
  • Getting clarifications from the client on any doubts arising from step 1
  • Preparing a mapping document (using MS Excel) listing all the input fields from the data received and the output fields as per the structure required for a particular conversion
  • Ensuring that the mapping is correct by getting it verified by a business user
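A mapping document of this kind is essentially a table; the fragment below (with hypothetical field names, shown in Python rather than Excel) sketches one, along with a quick check that every required SAP field is covered before the mapping goes to a business user for verification:

```python
# Hypothetical fragment of a field-mapping document (normally kept in
# MS Excel): each row maps a legacy input field to the SAP output field
# it feeds, with a short note on the transformation rule.
mapping = [
    {"legacy_field": "CUST_NO",  "sap_field": "KUNNR", "rule": "pad to 10"},
    {"legacy_field": "ORD_DATE", "sap_field": "AUDAT", "rule": "YYYYMMDD"},
    {"legacy_field": "PLANT_CD", "sap_field": "WERKS", "rule": "as-is"},
]

def unmapped_fields(mapping, required_sap_fields):
    """Return the required SAP fields the mapping does not yet cover."""
    covered = {row["sap_field"] for row in mapping}
    return sorted(set(required_sap_fields) - covered)

print(unmapped_fields(mapping, ["KUNNR", "AUDAT", "WERKS", "VKORG"]))
```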
Development of procedures for conversion

Once a mapping document is in place and has been verified by business users, the actual development of code and the data import process are initiated for a particular conversion.

Import process

The SQL Loader utility is used to import data into Oracle tables. Data received from a legacy system can arrive in various forms: for example, tab-delimited, CSV, or even an Excel spreadsheet. It is important to verify that the data received is in the correct format and to ensure that no field or information is missing. The import process may differ depending on the format in which the data has been received.

In general, the following steps are performed for the import process:

If the data received is in tab-delimited format, MS Excel can be used to prepare the file before it is used with the SQL Loader tool. Data can sometimes be imported into Oracle as is, but often the data received is not directly importable into Oracle tables and has to be converted into a format acceptable to Oracle.

Select the Import Data option available under the Data -> Import External Data menu option.

  • Step 1 will open an Open File dialog window
  • Select the file (Excel, tab-delimited, or CSV format) containing the data from the appropriate path and follow the steps of the Import wizard
Note: An important thing to keep in mind before importing the data is to set the format of all cells to text. Otherwise, a large value or one with leading zeros may cause problems: leading zeros get trimmed, and a large value does not display or export properly.
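The effect of a numeric cell format on leading zeros can be demonstrated in a few lines of Python (the values are illustrative):

```python
# Why cell format matters: a numeric import effectively stores the value
# as a number, silently dropping leading zeros, while a text import
# preserves the string as-is. Values here are illustrative.
raw_value = "00042"              # e.g. a customer or material number
as_number = str(int(raw_value))  # what a numeric cell effectively keeps
as_text = raw_value              # what a text-formatted cell keeps
print(as_number)  # 42 - leading zeros lost
print(as_text)    # 00042 - preserved
```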

After the import process completes successfully, the file can be saved as a tab-delimited file through MS Excel. The tab-delimited file can then be used by the SQL Loader utility to load the data into Oracle tables.
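For illustration, a SQL Loader control file for such a tab-delimited load might look like the following sketch; the file, table, and column names are hypothetical:

```
-- Hypothetical SQL*Loader control file for a tab-delimited staging load.
-- The data file, table, and column names are examples only.
LOAD DATA
INFILE 'legacy_orders.txt'
INTO TABLE stg_legacy_orders
FIELDS TERMINATED BY X'09'   -- tab character
TRAILING NULLCOLS
(
  cust_no,
  ord_date,
  plant_cd
)
```

A typical invocation might then look like `sqlldr userid=stage_user/password control=legacy_orders.ctl log=legacy_orders.log`, with the log file reviewed for rejected rows after each load.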

Processing data

Processing the imported data is one of the major steps in the data conversion process. In a real-life scenario, it is quite possible that changes to the procedures created for processing the data will be required during the course of data conversion for a particular module.

Creating separate versions of code

One way to protect and maintain the code written for a particular type of conversion is to write a new stored procedure. That is mostly how changes to our code were handled during the Wave I conversion process. Maintaining separate versions of code like this is tedious, especially without a version management tool integrated with the development environment.

Example:

Here is an example from the data conversion process for the Sales Order module, where there was a requirement to create a separate set of data for all orders belonging to the plant in Canada. In this scenario, we created a separate procedure to process the records for the Canadian plant. During this conversion, a number of changes were made, quite frequently, to each of these two separate procedures, and the nature of the changes differed for each plant. In a real-life scenario, it becomes very difficult to maintain separate versions and keep track of the changes being made on a continuous basis to these different sets of code.

Processed Data

Once the processing is complete, the data has to be delivered to SAP.

Data Version

Versions of the data are maintained as the data goes into different SAP environments. This is required for the following reasons:

  • Delta loads – when a delta load has to be sent for a particular version, it becomes important to know what data has already been sent so that no duplicate records go into SAP
  • Identifying any incorrect data that may have been sent to a particular SAP environment for any reason. This helps in tracking whether the incorrect data was sent by the legacy system or by staging, or whether something went wrong at SAP’s end
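The delta-load check described above can be sketched in a few lines (the key field name is a hypothetical example):

```python
# Sketch of a delta-load check: records whose keys were already sent in
# an earlier version are filtered out so no duplicates reach SAP.
# The key field name (ORDER_NO) is hypothetical.

def delta_records(new_batch, sent_keys, key_field="ORDER_NO"):
    """Return only the records not yet sent to SAP."""
    return [r for r in new_batch if r[key_field] not in sent_keys]

already_sent = {"SO-001", "SO-002"}   # keys delivered in earlier versions
batch = [{"ORDER_NO": "SO-002"}, {"ORDER_NO": "SO-003"}]
print(delta_records(batch, already_sent))
```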
Generating exception record reports

Generating an exception report is one of the most crucial steps in the process of data conversion. An exception report is a log of those records which could not be processed due to the business rules applied as per the functional specs.

An exception report helps the staging team report back to the users/developers of the legacy system so they can identify the problems at their end, resolve them, and re-send the data to the staging team for processing, so that it can then be uploaded into SAP as well. A typical exception report is a collection of the raw data fields along with a reason why those records could not be processed, usually sent in MS Excel format.

Also, a count of the records processed and of the records that arrived as part of the raw data is maintained and communicated to the business users/legacy system team, so they can see how much data was loaded and how much failed to load.
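A minimal sketch of this exception-handling step, with hypothetical field names and a toy validation rule, might look like this:

```python
# Sketch of building an exception report plus processed/failed counts.
# Field names (CUST_NO, QTY) and the validation rule are hypothetical.

def build_exception_report(records, validate):
    """Split records into processed ones and exceptions.
    validate(record) returns None on success or a reason string on failure."""
    processed, exceptions = [], []
    for rec in records:
        reason = validate(rec)
        if reason:
            exceptions.append({**rec, "REASON": reason})  # raw fields + reason
        else:
            processed.append(rec)
    return processed, exceptions

records = [{"CUST_NO": "C1", "QTY": "5"}, {"CUST_NO": "", "QTY": "2"}]
validate = lambda r: "Missing customer number" if not r["CUST_NO"] else None
ok, bad = build_exception_report(records, validate)
print(f"received={len(records)} processed={len(ok)} failed={len(bad)}")
```

In practice the exception rows would be exported to an Excel file for the legacy team; here they are simply kept as dictionaries with a REASON column appended.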


Best Practices in Regression Testing

Practice 1: Regression can be used for all types of releases

Regression testing can be applied when
  • We need to measure the quality of product between test cycles (both planned & need based);
  • We are doing a major release of a product, have executed all test cycles, and are planning a regression test cycle for defect fixes; and
  • We are doing a minor release of a product (support packs, patches, and so on) having only defect fixes, and we can plan for regression test cycles to take care of those defect fixes.
Multiple cycles of regression testing can be planned for every release. This applies when defect fixes arrive in phases, or to take care of defect fixes that do not work with a specific build.

Practice 2: Mapping defect identifiers with test cases improves regression quality

When assigning a fail result to a test case during test execution, it is a good practice to enter the defect identifier(s) from the defect tracking system, so that you will know which test cases are to be executed when a defect fix arrives. Please note that multiple defects can come out of a particular test case, and a particular defect can affect more than one test case.

Even though ideally one would like to have a mapping between test cases and defects, the choice of test cases to execute to take care of the side effects of defect fixes may still remain a largely manual process, as it requires knowledge of the interdependencies among the various defect fixes.

As time passes and with each release of the product, the set of regression test cases to be executed grows. It has been found that some of the defects reported by customers in the past were due to last-minute defect fixes creating side effects. Hence, selecting test cases for regression testing is an art, and not an easy one. To add to this complexity, most people want maximum returns with a minimum investment in regression testing.

Practice 3: Create and execute regression test bed daily
To solve this problem, as and when changes are made to a product, regression test cases are added to or removed from an existing suite of test cases. This suite, called the regression suite or regression test bed, is run whenever a new change is introduced to an application or product. The automated test cases in the regression test bed can be executed along with nightly builds to ensure that product quality is maintained during the product development phases.
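A minimal sketch of maintaining such a regression test bed, with the automated subset selected for the nightly run (test identifiers are hypothetical):

```python
# Sketch of a regression test bed: cases are added or removed as the
# product changes, and the automated subset runs with nightly builds.
# Test identifiers are hypothetical.

class RegressionSuite:
    def __init__(self):
        self.cases = {}  # test id -> {"automated": bool}

    def add(self, test_id, automated=False):
        self.cases[test_id] = {"automated": automated}

    def remove(self, test_id):
        self.cases.pop(test_id, None)

    def nightly_cases(self):
        """Automated cases to execute along with the nightly build."""
        return sorted(t for t, c in self.cases.items() if c["automated"])

suite = RegressionSuite()
suite.add("TC-001", automated=True)
suite.add("TC-002")            # manual-only case, skipped at night
suite.remove("TC-OBSOLETE")    # dropped when the feature changed
print(suite.nightly_cases())
```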

Practice 4: Ask your best test engineer to select the test cases

It was mentioned earlier that knowledge of the defects, the product, their interdependencies, and a well-structured methodology are all very important for selecting test cases. These points stress the need for selecting the right person for the right job. The most experienced or most talented person in the team may do a much better job of selecting the right test cases for regression than someone with less experience. Experience and talent bring knowledge of the fragile areas of the product and of the impact analysis of defects.

Consider two strategies:

Strategy 1: A tiger is put in a cage to prevent harm to humankind.
Strategy 2: Some members of a family lie inside a mosquito net as prevention against mosquitoes.

Strategy 1 has to be adopted for regression. Like the tiger in the cage, all defects in the product have to be identified and fixed. This is what “detecting defects in your product” means.

Strategy 2 signifies “protecting your product from defects”; the strategy followed here is one of prevention.

Practice 5: Detect defects, and protect your product from defects and defect fixes
Another aspect of regression testing is “protecting your product from defect fixes”. As discussed earlier, a defect classified as minor may have a major impact on the product when its fix goes into the code, much as a mosquito, despite its small size, can have a large impact on humans. Hence, it is a good practice to analyze the impact of defect fixes, irrespective of size and criticality, before they are incorporated into the code. Such impact analysis is difficult owing to lack of time and the complex nature of the product. It is therefore a good practice to limit the amount of change in the product close to the release date. This protects the product from defects that may seep in through the defect-fix route, just as mosquitoes can get into a mosquito net through a small hole: if you make a hole for a mosquito to get out of the net, you also open the door for new mosquitoes to come in. Fixing a problem without analyzing the impact can introduce a large number of defects in the product. Hence, it is important to insulate the product from defects as well as from defect fixes.

If defects are detected and the product is protected from defects and defect fixes, then regression testing becomes effective and efficient. Regression testing, in effect, provides the mosquito net.