Monday, October 14, 2013

Changes to Testing Methods for SOA


The adoption of SOA for agility, reducing the lifetime cost of an application, or accelerating time to market for new business features requires a change in your current testing strategy. These changes will reduce the amount of time it takes to test, enabling you to move faster through test cycles, and will require new activities (e.g., automated continuous regression testing). Complete testing of all possible paths through a program, an application, or a system has become impractical if not cost-prohibitive for organizations (and has been for some time). There are simply too many paths through a program to test. Yet defects exist, and organizations must do intelligent testing and full life cycle testing so that defects are identified. For many test teams, deploying business applications into production with defects is a way of life: defects are categorized and workaround techniques are communicated in lieu of fixing them. It does not have to be this way.
SOA provides an opportunity to improve testing when using services. The use of services promotes black-box testing, where functional tests focus on valid inputs and outputs. The service contract defines the valid inputs and expected outputs of a service, which makes it natural to test the service as a black box with tools built for that purpose. Black-box testing eliminates the need to test every possible path through a program, application, or system. In most environments, testing a changed or newly deployed application requires retesting the components of the new system and their connections to downstream systems. This testing cycle is fraught with errors. Often it must be performed in a linear fashion, and calendar time is slowly eaten away, forcing a decision to delay deployment, turn off features, or simply live with defects in a production system (where operator workarounds replace functioning software). SOA fixes this issue because a deployed and working service does not need to be retested or included in future test cycles when the service is reused by a newly deployed application.



The figure shows the difference in testing scope for business applications before and after adopting SOA; the scope of what needs to be tested is larger before SOA. The scope of the test cycle shrinks when services are used and a service life cycle is introduced, where each service goes through a testing cycle in the same manner as an application. Just like applications, services do not have to be retested after they are deployed into production. Services can be certified as satisfying their contract specifications. This certification gives consumers confidence that the service works as designed and results in less overall testing when applications are structured using services.
With SOA adoption, where services are the structuring element of the application, black-box testing becomes the norm. This avoids retesting services already deployed in production when determining whether a new system, application, or service has a bug, because services working in production continue to accept the proper inputs and deliver the correct outputs according to the service contract. SOA testing is facilitated by automated continuous regression testing, which rapidly exercises services as black boxes to ensure the service contracts and services work as planned. Test drivers also prove useful for enabling the consumer to test the provider service in a controlled environment, without always needing the service provider's platform to be available. Service virtualization testing products enable test teams to mimic the functionality of a service based on its contract design before the implementation is developed, which further improves the quality of the service.
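To make the idea concrete, below is a minimal sketch of such an automated black-box regression check in Java. The service URL, the query parameter, and the expected response fragment are hypothetical stand-ins for whatever your service contract actually specifies; the point is that the test touches only the contract, never the service internals.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Minimal black-box regression check: call the service with a valid input
// from the contract and verify the output matches what the contract promises.
// The endpoint URL and the expected payload fragment are hypothetical.
public class QuoteServiceRegressionTest {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // A valid input as defined by the (hypothetical) service contract.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://test-env.example.com/quoteService?symbol=ACME"))
                .GET()
                .build();

        HttpResponse<String> response =
                client.send(request, HttpResponse.BodyHandlers.ofString());

        // Assert only on the externally visible contract: status code and payload.
        if (response.statusCode() != 200 || !response.body().contains("<price>")) {
            throw new AssertionError("Contract violated: " + response.body());
        }
        System.out.println("Service still satisfies its contract.");
    }
}

Because the check is self-contained and knows nothing about the service implementation, it can run unattended after every build, which is exactly what continuous regression testing requires.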


Friday, September 27, 2013

Database Testing using HP-UFT and CA-LISA

CA-iTKO LISA - Database Testing Tool

It is always better to test the data and perform database validation. Defects in stored procedures can corrupt data, truncate it, or dump it into the wrong column of a table. When we try to view that data from the GUI, the test fails because the information is incorrect or sometimes not available at all. Many times the test team has to go down to the database level to perform the validation. This requires the tester to have knowledge and understanding of the database schema and of how to create and execute SQL queries. If you have done database validation, you know what a pain it is to create and maintain those scripts: each time, values must be pulled from screens or XML and passed manually to SQL queries. For a few records this is feasible, but imagine having to validate thousands of records, and that on a regular basis. Recently I had an opportunity to check the features of HP UFT. UFT can be used to perform database validation, but the cost involved is very high. If you already have UFT or LISA, below is a brief look at how each tool approaches this.
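To see why this gets painful, here is a minimal JDBC sketch of the kind of hand-rolled validation script described above. The orders table, its columns, and the connection details are hypothetical, and the database driver .jar must be on the classpath; the pattern is feasible for a couple of records, miserable for thousands.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.util.Map;

// Hand-rolled database validation: expected values captured from screens or
// XML are passed one by one into a parameterized query. Table name, column
// names, and connection details are hypothetical.
public class ManualDbValidation {
    public static void main(String[] args) throws Exception {
        // Expected values captured manually from the GUI or XML.
        Map<String, String> expected = Map.of("1001", "ACTIVE", "1002", "CLOSED");

        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:TESTDB", "tester", "secret");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT status FROM orders WHERE order_id = ?")) {
            for (Map.Entry<String, String> row : expected.entrySet()) {
                ps.setString(1, row.getKey());
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next() || !row.getValue().equals(rs.getString("status"))) {
                        System.out.println("Mismatch for order " + row.getKey());
                    }
                }
            }
        }
    }
}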
HP-UFT:
The prerequisite for UFT is to install the database client. For example, if you are using Oracle as your database, you must first install the Oracle Client on the system where UFT is installed. This is required to connect to the database and later execute SQL queries.
Step 1. Start UFT and create a sample test case. On the canvas, the following steps are required to connect to the database:
Step 2. Database -> Open Connection
Step 3. Database -> Select Data
Step 4. Database -> Execute Command
These steps are required every time. For validation there are no built-in assertions that can be utilized; you have to rely on checkpoints and parameterize the expected values.
Step 5. To pull records from the database, follow steps 1 and 2, then use the Write to File step to dump the records to a file. All the records are written wrapped in tags, e.g. <ROW><IDENTIFY_DATA>... To avoid this you have to do some scripting. Performing validation on the values from this file is again difficult, because the file is not well-formed XML and the data is not arranged properly.
Problems:
It is difficult to add database validation.
Mapping of external data is point-to-point and takes effort to update or enhance when required.
When data is pulled, it arrives in a malformed XML format that is difficult to validate or reuse in other test cases.
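One possible workaround for that malformed dump, sketched below under the assumption that the file contains a bare sequence of <ROW> fragments: wrap the whole file in a single root element so it becomes well-formed XML that a standard parser can read. The file name is hypothetical.

import java.io.StringReader;
import java.nio.file.Files;
import java.nio.file.Paths;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

// Wrap the bare <ROW>... fragments from the UFT dump in one root element,
// then parse the result as ordinary XML. The input file name is hypothetical.
public class FixUftDbDump {
    public static void main(String[] args) throws Exception {
        String fragments = new String(Files.readAllBytes(Paths.get("uft_db_dump.txt")));
        String wellFormed = "<ROWS>" + fragments + "</ROWS>";

        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new InputSource(new StringReader(wellFormed)));

        System.out.println("Rows recovered: "
                + doc.getElementsByTagName("ROW").getLength());
    }
}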
CA-LISA
The prerequisite for LISA is to add the database driver .jar file to the LISA lib/hot deploy folder.
Step 1. In LISA, add a JDBC step.
        a. Select/add the JDBC driver from the dropdown. This list is available because you added the database's .jar file.
        b. Enter the connection string in the text box.
        c. Enter the user name and password.
        d. Enter the SQL query you want to execute.
        e. You are now ready to add filters and assertions on the result set.
To pull records from the database, simply add the 'Write Properties to File' filter to the same step, and all the values will be available in the specified file.
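For readers who want to see the moving parts behind that single LISA step, here is a rough plain-JDBC sketch of what it accomplishes: connect via the driver, connection string, and credentials; run the query; assert on the result set; and write the captured values out as properties, mirroring the 'Write Properties to File' filter. All connection details, the query, and the output path are assumptions.

import java.io.FileOutputStream;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Properties;

// Plain-JDBC approximation of the LISA JDBC step: connect, query, assert,
// and persist the result set as properties for reuse by other tests.
// Connection details, query, and output path are hypothetical.
public class LisaStyleJdbcStep {
    public static void main(String[] args) throws Exception {
        Properties results = new Properties();

        try (Connection conn = DriverManager.getConnection(
                "jdbc:oracle:thin:@dbhost:1521:TESTDB", "tester", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT order_id, status FROM orders")) {
            int row = 0;
            while (rs.next()) {
                // A simple assertion on the result set.
                if (rs.getString("status") == null) {
                    throw new AssertionError("NULL status in row " + row);
                }
                results.setProperty("order." + row + ".id", rs.getString("order_id"));
                results.setProperty("order." + row + ".status", rs.getString("status"));
                row++;
            }
        }

        // Persist captured values so other test cases can reuse them as properties.
        try (FileOutputStream out = new FileOutputStream("db_results.properties")) {
            results.store(out, "Captured result set");
        }
    }
}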
If you compare the two tools' approaches, you will notice that in LISA you can achieve the same result in just one step.
Problem: The only problem observed is getting the correct .jar file for database connectivity.
Overall, of the two tools discussed above, LISA makes database validation far easier. The captured result set can also be used as test data for other test cases, and because the LISA framework is built on properties, it is very easy to use the captured values through properties across the entire LISA project.

You are welcome to provide your thoughts on the above.