
Content Management and Work Flow Automation Web Software Testing

Client:
The client develops and markets solutions for traditionally chaotic processes used to transfer e-business and eCommerce-related information and applications to multiple audiences via the Internet (a large, geographically distributed user base). The client's key product features include workflow, content contribution and management, security, versioning and editions, personalization, profile-based viewing, eCommerce, integration, and dynamic delivery. These features help meet the ongoing challenge of efficiently gathering, managing, and delivering site information destined for business-to-business, Intranet/Extranet, and eCommerce use.

Team:
A six-member test team (technical leader, designers, testers) located at the lab in New Delhi, India, plus one Project Manager located at the Silicon Valley, CA lab, with support from the Senior Management Team in Raleigh, NC.

Software Technologies and Testing Tools:
J2EE environment (JSP, Servlets, JDBC, JavaBeans, etc.), XML, Oracle8i, multiple Application and Web Servers. Internationalization is part of the project as well.

Test & Bug Tracking Tools: Segue SilkTest, Segue SilkPerformer, Mercury LoadRunner, Rational ClearDDTS, Microsoft Visual SourceSafe, test automation using scripting, etc.

Duration:
Q3-2000 to Q4-2001

A Brief Description:
Starting in Fall 2000, we have been testing multiple releases of a complex, multi-tier, Java-centric client-server application. The product is an XML-enabled content management and workflow solution created to support the ongoing Internet, Intranet, and Extranet requirements of an enterprise. Our client has been a leader in content management and workflow automation in the eCommerce industry for more than a decade.

Using SEI guidelines, we have managed and tested multiple releases so far. We have applied our experience in system development and testing to provide a continuous stream of test case designs, scripts, and test reports to support the timely launch of products.

Specifically, we have designed test cases, written protocols and tested the software from the following perspectives:

1)      Top-level Usage Scenarios Testing:
This involves designing test cases and testing the software from the perspective of business use cases. The system provides different levels of permissions and activities for different users, such as administrators, project managers, web content developers, and technical writers. This approach tests the software against the daily tasks each type of user performs after logging in to the system.
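The role-based checks described above can be sketched as a small parameterized test. The role names, actions, and permission model below are illustrative assumptions for this sketch, not the client's actual access-control design:

```java
import java.util.EnumSet;
import java.util.Set;

public class RolePermissionTest {
    // Hypothetical actions a logged-in user might perform.
    public enum Action { PUBLISH, EDIT_CONTENT, MANAGE_USERS, VIEW }

    // Stub permission model, for illustration only; a real test would
    // query the system under test after logging in as each role.
    public static Set<Action> permissionsFor(String role) {
        switch (role) {
            case "administrator":    return EnumSet.allOf(Action.class);
            case "projectManager":   return EnumSet.of(Action.PUBLISH, Action.EDIT_CONTENT, Action.VIEW);
            case "contentDeveloper": return EnumSet.of(Action.EDIT_CONTENT, Action.VIEW);
            default:                 return EnumSet.of(Action.VIEW);
        }
    }

    public static void main(String[] args) {
        // Each scenario asserts that a role can (or cannot) perform a daily task.
        if (!permissionsFor("administrator").contains(Action.MANAGE_USERS)
                || permissionsFor("contentDeveloper").contains(Action.MANAGE_USERS)
                || !permissionsFor("projectManager").contains(Action.PUBLISH)) {
            throw new AssertionError("role-permission scenario failed");
        }
        System.out.println("All role-permission scenarios passed");
    }
}
```

Enumerating roles and allowed actions this way makes it cheap to cover every role/action combination in a single pass.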

2)      In-depth Feature/Functionality Testing:
This involves developing test cases and testing the detailed functionality of each module of the software. More than 10,000 test cases were generated and executed for each release.

3)      Compatibility Testing:
This involves testing the installation and operation of the software on multiple Application and Web Servers, Operating Systems, and Browsers (and combinations thereof).

4)      Performance Testing:
This measures the number of transactions and the number of users the system can handle simultaneously. It includes stress testing the software as well as load testing the hardware.

This involves simulating multiple users, with a range of connection speeds, hitting multiple pages of the web site simultaneously. Doing so helps identify scalability problems and performance bottlenecks, and aids in capacity planning.
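The concurrent-user simulation described above was driven by SilkPerformer and LoadRunner against live servers; as a rough sketch of the idea, a thread pool can stand in for the virtual users, with a stubbed request in place of a real HTTP page hit (the user and request counts are arbitrary for illustration):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class LoadSimulation {
    private static final AtomicInteger completed = new AtomicInteger();

    // Stub for one page request; a real load script would issue an HTTP GET
    // and record the response time.
    private static void simulatedRequest() throws InterruptedException {
        Thread.sleep(10); // stand-in for network and server latency
        completed.incrementAndGet();
    }

    // Launch `users` concurrent workers, each issuing `requestsPerUser` hits,
    // and return the total number of completed requests.
    public static int run(int users, int requestsPerUser) {
        completed.set(0);
        ExecutorService pool = Executors.newFixedThreadPool(users);
        for (int u = 0; u < users; u++) {
            pool.submit(() -> {
                for (int r = 0; r < requestsPerUser; r++) {
                    try {
                        simulatedRequest();
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }
                }
            });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(30, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return completed.get();
    }

    public static void main(String[] args) {
        int done = run(20, 5); // 20 simulated users, 5 page hits each
        System.out.println("Completed requests: " + done);
    }
}
```

Scaling the pool size up while watching throughput and latency is what exposes the point at which the server stops keeping up.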

5)      Domain/ Range Testing:
This involves testing the lower and upper limits of parameters in various features and functionality. For example, since the software performs content management, it is critical to ensure that the system can handle attachments (images, video, audio, HTML documents, etc.) up to a specified size.

6)      Regression testing:
Working under strict configuration management rules, we performed regression testing to verify the correction of anomalies and to uncover any new issues that bug fixes may have introduced in successive builds.

In addition, each build submitted to us underwent smoke testing, which often uncovered major glitches before the build was released to the rest of the testing team.
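A smoke suite like the one above runs a short, ordered list of critical checks and stops at the first failure, so a broken build is rejected before anyone else spends time on it. The check names below are illustrative stand-ins for whatever the real suite exercised:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.BooleanSupplier;

public class SmokeSuite {
    // Run checks in insertion order; return "PASSED", or fail fast with
    // the name of the first check that broke.
    public static String run(Map<String, BooleanSupplier> checks) {
        for (Map.Entry<String, BooleanSupplier> e : checks.entrySet()) {
            if (!e.getValue().getAsBoolean()) {
                return "FAILED at: " + e.getKey(); // build rejected here
            }
        }
        return "PASSED";
    }

    public static void main(String[] args) {
        // Stubbed results; real checks would hit the deployed build.
        Map<String, BooleanSupplier> checks = new LinkedHashMap<>();
        checks.put("server starts", () -> true);
        checks.put("login works", () -> true);
        checks.put("content check-in works", () -> true);
        System.out.println("Smoke: " + run(checks));
    }
}
```

Keeping the suite small and ordered (most fundamental checks first) is what lets it give a fast accept/reject verdict on every incoming build.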

