Collegiate Sports Paging System Test Plan

Version 1.0

Introduction
| Document (including version and date) | Created or Available | Received or Reviewed | Author or Resource | Notes |
|---|---|---|---|---|
| Vision Document | Yes | Yes | Context Integration | |
| Supplemental Specification | Yes | Yes | Context Integration | |
| Use Case Reports | Yes | Yes | Context Integration | |
| Project Plan | Yes | Yes | Context Integration | |
| Design Specifications | No | No | | |
| Prototype | Yes | Yes | Context Integration | |
| Project / Business Risk Assessment | Yes | Yes | Context Integration | |
The listing below identifies those items (use cases, functional requirements, non-functional requirements) that have been identified as targets for testing. This list represents what will be tested.
Verify that subscriber information can be entered and retrieved.
Verify that content and categories can be inserted and displayed.
Verify that advertiser profiles and account information can be entered and displayed.
Verify that subscriber-specific usage information is tracked.
Verify that subscribers see the information for which they have requested paging.
Verify that pages go to subscribers when content arrives.
Verify that automatic content insertion works.
Verify that editor approval causes non-automatic content to be inserted.
Verify that subscribers who have lapsed subscriptions do not receive pages.
Verify that content marked as archived is not re-displayed to subscribers.
Verify that obsolete content is deleted.
Verify that advertiser reports are accurate.
Verify that advertiser reports can be received in Microsoft® Word®, Microsoft® Excel®, or HTML.
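For illustration only (the plan does not prescribe automation), the first functional item above, subscriber entry and retrieval, could be automated along these lines. The `SubscriberStore` class and its methods are hypothetical stand-ins for the real data layer:

```python
# Hypothetical in-memory stand-in for the real subscriber data layer.
class SubscriberStore:
    def __init__(self):
        self._subscribers = {}

    def add(self, subscriber_id, info):
        self._subscribers[subscriber_id] = dict(info)

    def get(self, subscriber_id):
        return self._subscribers.get(subscriber_id)

store = SubscriberStore()
entered = {"name": "A. Fan", "team": "State U", "pager": "555-0100"}
store.add("sub-001", entered)
retrieved = store.get("sub-001")
assert retrieved == entered  # entered data comes back unchanged
print("subscriber entry/retrieval verified")
```

A real test would exercise the production data-access layer rather than an in-memory dictionary, but the entry/retrieval round-trip assertion stays the same.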
None.
Navigate through all use cases, verifying that each UI panel can be easily understood.
Verify all online Help functions.
Verify that all screens conform to the WebNewsOnLine standards.
Verify response time of interface to Pager Gateway system.
Verify response time of interface from existing WebNewsOnLine web server.
Verify response time when connected using 56Kbps modem.
Verify response time when connected locally (on the same LAN).
Verify system response with 200 concurrent subscribers.
Verify system response with 500 concurrent subscribers.
Verify system response with 1,000 concurrent subscribers.
Verify system response with 5,000 concurrent subscribers.
Verify system response with 10,000 concurrent subscribers.
Verify system response with 50,000 concurrent subscribers.
Verify system response with 100,000 concurrent subscribers.
Verify system response with 200,000 concurrent subscribers.
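The concurrency ramp above could be driven by a harness like the following sketch: fire a batch of requests at each concurrency level and report average and worst-case latency. `simulated_request` is a placeholder for a real transaction against the system under test, and the levels shown are scaled far down from the plan's 200-to-200,000 range:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def simulated_request(_):
    # Stand-in for one subscriber transaction; replace with a real HTTP call.
    start = time.perf_counter()
    time.sleep(0.001)
    return time.perf_counter() - start

def measure(concurrency, requests_per_user=5):
    # Fire concurrency * requests_per_user requests through a thread pool
    # and report (average, worst-case) latency in seconds.
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(simulated_request,
                                  range(concurrency * requests_per_user)))
    return sum(latencies) / len(latencies), max(latencies)

results = {}
for level in (10, 50, 100):  # scaled-down ramp; the plan calls for up to 200,000
    avg, worst = measure(level)
    results[level] = (avg, worst)
    print(f"{level} users: avg {avg * 1000:.1f} ms, worst {worst * 1000:.1f} ms")
```

At the plan's higher levels, a dedicated load-generation tool with distributed virtual users would replace the single-machine thread pool.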
None.
Verify pages sent out within 5 minutes when single content element arrives.
Verify pages sent out within 5 minutes when content arrives every 20 seconds.
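The 5-minute dispatch requirement above can be checked by timestamping content arrival and page dispatch and asserting on the difference. The `PagingQueue` class here is a hypothetical stand-in for the content-to-pager pipeline, which dispatches immediately:

```python
import time

PAGE_DEADLINE_SECONDS = 300  # "within 5 minutes" from the requirement above

class PagingQueue:
    """Hypothetical stand-in for the content-to-pager dispatch pipeline."""
    def __init__(self):
        self.sent = []

    def submit(self, content, arrived_at):
        # The real system would enqueue and dispatch; here dispatch is immediate.
        self.sent.append((content, arrived_at, time.monotonic()))

queue = PagingQueue()
arrival = time.monotonic()
queue.submit("score update: State U 21, Tech 14", arrival)

content, arrived, dispatched = queue.sent[0]
latency = dispatched - arrived
assert latency <= PAGE_DEADLINE_SECONDS  # page went out within the 5-minute window
print(f"page dispatched {latency:.3f}s after content arrival")
```

For the every-20-seconds variant, the same harness would submit content on a timer and assert the deadline for each dispatched page.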
Verify that non-subscribers cannot access subscriber-only information.
Verify that non-editors cannot approve content.
Verify that advertisers see only their own advertising content.
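The three access-control items above amount to a role-to-permission matrix that can be asserted directly. The role and permission names below are illustrative, not taken from the specification:

```python
# Hypothetical role-to-permission map derived from the three items above;
# role and permission names are illustrative, not from the specification.
PERMISSIONS = {
    "subscriber": {"view_subscriber_content"},
    "editor": {"view_subscriber_content", "approve_content"},
    "advertiser": {"view_own_ads"},
    "visitor": set(),
}

def can(role, action):
    return action in PERMISSIONS.get(role, set())

assert not can("visitor", "view_subscriber_content")  # non-subscribers blocked
assert not can("subscriber", "approve_content")       # non-editors cannot approve
assert not can("advertiser", "view_subscriber_content")
assert can("editor", "approve_content")
print("access-control checks passed")
```

In the real tests each assertion would be replaced by an actual request made under the given account type, checking for an access-denied response.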
None.
Verify operation using Netscape V4.x browser.
Verify operation using Microsoft® Internet Explorer® V5.x.
None.
| Test Objective: | Ensure database access methods and processes function properly and without data corruption. |
|---|---|
| Technique: | |
| Completion Criteria: | All database access methods and processes function as designed and without any data corruption. |
| Special Considerations: | |
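A minimal sketch of the database-integrity objective above is a write-then-read round-trip with an exact-match assertion. This example runs against an in-memory SQLite database for portability; the production tests would target the real DBMS, so the table and column names here are illustrative:

```python
import sqlite3

# Round-trip check against an in-memory SQLite database; the production
# test targets the real DBMS, so table and column names are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE subscriber (id TEXT PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO subscriber VALUES (?, ?)", ("sub-001", "A. Fan"))
conn.commit()

row = conn.execute(
    "SELECT name FROM subscriber WHERE id = ?", ("sub-001",)
).fetchone()
assert row == ("A. Fan",)  # data retrieved exactly as written: no corruption
print("database round-trip OK")
```

The same pattern, insert or update known values, then read them back and compare field by field, applies to every access method the application exposes.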
| Test Objective: | Ensure proper target-of-test functionality, including navigation, data entry, processing, and retrieval. |
|---|---|
| Technique: | Execute each use case, use case flow, or function, using valid and invalid data, to verify the following: |
| Completion Criteria: | |
| Special Considerations: | None. |
| Test Objective: | Verify the following: |
|---|---|
| Technique: | Create or modify tests for each window to verify proper navigation and object states for each application window and its objects. |
| Completion Criteria: | Each window is successfully verified to remain consistent with the benchmark version or within acceptable standards. |
| Special Considerations: | Not all properties of custom and third-party objects can be accessed. |
| Test Objective: | Verify performance behavior for designated transactions or business functions under the following conditions: |
|---|---|
| Technique: | Use the test procedures developed for Function or Business Cycle Testing. Modify data files (to increase the number of transactions) or the scripts (to increase the number of iterations of each transaction). Scripts should be run on one machine (best case, to benchmark a single user and single transaction) and then repeated with multiple clients (virtual or actual; see Special Considerations below). |
| Completion Criteria: | Single transaction or single user: successful completion of the test scripts without any failures and within the expected or required time allocation (per transaction). Multiple transactions or multiple users: successful completion of the test scripts without any failures and within an acceptable time allocation. |
| Special Considerations: | Comprehensive performance testing includes having a "background" workload on the server; several methods can be used to produce one. Performance testing should be performed on a dedicated machine or at a dedicated time, which permits full control and accurate measurement. The databases used for performance testing should be either actual size or scaled equally. |
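The per-transaction completion criterion above, success within an expected time allocation, can be captured by a small timing wrapper. The 0.5-second limit and the sample transaction below are assumptions for illustration; the real allocation comes from the requirements:

```python
import time

EXPECTED_SECONDS = 0.5  # assumed allocation; take the real value from the requirements

def timed(transaction):
    # Wrap a transaction and report its result and wall-clock duration.
    start = time.perf_counter()
    result = transaction()
    return result, time.perf_counter() - start

def sample_transaction():
    time.sleep(0.01)  # stand-in for a real business function
    return "ok"

result, elapsed = timed(sample_transaction)
assert result == "ok"
assert elapsed <= EXPECTED_SECONDS  # within the expected allocation
print(f"transaction completed in {elapsed:.3f}s (limit {EXPECTED_SECONDS}s)")
```

Wrapping every scripted transaction this way yields the per-transaction timings the completion criteria call for, with and without a background workload.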
| Test Objective: | Verify performance behavior for designated transactions or business cases under varying workload conditions. |
|---|---|
| Technique: | Use the tests developed for Function or Business Cycle Testing. Modify data files (to increase the number of transactions) or the tests (to increase the number of times each transaction occurs). |
| Completion Criteria: | Multiple transactions or multiple users: successful completion of the tests without any failures and within an acceptable time allocation. |
| Special Considerations: | Load testing should be performed on a dedicated machine or at a dedicated time, which permits full control and accurate measurement. The databases used for load testing should be either actual size or scaled equally. |
| Test Objective: | Verify that the target-of-test functions successfully under the following high-volume scenarios: |
|---|---|
| Technique: | Use the tests developed for Performance Profiling or Load Testing. Multiple clients should be used, either running the same tests or complementary tests, to produce the worst-case transaction volume or mix (see stress testing above) for an extended period. Maximum database size is created (actual, scaled, or filled with representative data), and multiple clients are used to run queries and report transactions simultaneously for extended periods. |
| Completion Criteria: | All planned tests have been executed, and specified system limits are reached or exceeded without the software failing. |
| Special Considerations: | What period of time would be considered an acceptable time for high-volume conditions (as noted above)? |
| Test Objective: | Application-level security: verify that an actor can access only those functions and data for which their user type has permissions. System-level security: verify that only those actors with access to the system and applications are permitted to access them. |
|---|---|
| Technique: | Application-level: identify and list each actor type and the functions or data for which each type has permissions. Create tests for each actor type and verify each permission by creating transactions specific to each actor. Modify the user type and re-run the tests for the same users, verifying in each case that the additional functions and data are correctly available or denied. System-level access: see Special Considerations below. |
| Completion Criteria: | For each known actor type, the appropriate functions and data are available, and all transactions function as expected and as they did in prior function tests. |
| Special Considerations: | Access to the system must be reviewed or discussed with the appropriate network or systems administrator. This testing may not be required, as it may be a function of network or systems administration. |
| Test Objective: | Verify that the target-of-test functions properly on the required hardware and software configurations. |
|---|---|
| Technique: | Use the Function Test scripts. Open and close various non-target-of-test software, such as Microsoft Excel® and Word®, either as part of the test or prior to its start. Execute selected transactions to simulate actors interacting with both the target-of-test and the non-target-of-test software. Repeat the above process, minimizing the available conventional memory on the client. |
| Completion Criteria: | For each combination of target-of-test and non-target-of-test software, all transactions are successfully completed without failure. |
| Special Considerations: | What non-target-of-test software is needed, available, and accessible on the desktop? Which applications are typically used? What data are those applications handling (for example, a large spreadsheet open in Excel or a 100-page document in Word)? The entire system (netware, network servers, databases, and so forth) should also be documented as part of this test. |
The following tools will be employed for this project:
| Tool | Version |
|---|---|
| Defect Tracking | Project HomePage |
| Project Management | Microsoft® Project® |
This section presents the recommended resources for the Collegiate Sports Paging System test effort, their main responsibilities, and their knowledge or skill set.
This table shows the staffing assumptions for the project.
| Human Resources | | |
|---|---|---|
| Worker | Minimum Resources Recommended | Specific Responsibilities and Comments |
| Test Manager, Test Project Manager | 1 (Collegiate Sports Paging System project manager) | Provides management oversight. |
| Test Designer | 1 | Identifies, prioritizes, and implements test cases. |
| Tester | 4 (provided by WebNewsOnLine) | Executes the tests. |
| Test System Administrator | 1 | Ensures the test environment and assets are managed and maintained. |
| Database Administration / Database Manager | 1 (provided by WebNewsOnLine) | Ensures the test data (database) environment and assets are managed and maintained. |
| Designer | 2 | Identifies and defines the operations, attributes, and associations of the test classes. |
| Implementer | 4 | Implements and unit tests the test classes and test packages. |
The following table sets forth the system resources for the testing project.
The specific elements of the test system are not fully known at this time. It is recommended that the system simulate the production environment, scaling down access volumes and database sizes if and where appropriate.
| System Resources | |
|---|---|
| Resource | Name and Type |
| Database Server | |
| Network/Subnet | TBD |
| Server Name | TBD |
| Database Name | TBD |
| Client Test PCs | |
| Include special configuration requirements | TBD |
| Test Repository | |
| Network/Subnet | TBD |
| Server Name | TBD |
| Test Development PCs | TBD |
| Milestone Task | Effort | Start Date | End Date |
|---|---|---|---|
| Plan Test | | | |
| Design Test | | | |
| Implement Test | | | |
| Execute Test | | | |
| Evaluate Test | | | |
For each test executed, a test result form will be created. This will include the name or ID of the test, the use case or supplemental specification to which the test relates, the date of the test, the ID of the tester, required pre-test conditions, and results of the test.
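A record mirroring the fields listed above could be structured as follows; the field and type names are assumptions for illustration, since the plan specifies the form's content but not its representation:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative test-result record mirroring the fields listed above;
# field names and values are assumptions, not from the plan.
@dataclass
class TestResult:
    test_id: str
    related_spec: str   # use case or supplemental specification
    run_date: date
    tester_id: str
    preconditions: str
    outcome: str        # e.g. "pass" or "fail"

record = TestResult(
    test_id="TC-017",
    related_spec="UC-03 Receive Page",
    run_date=date(2003, 5, 1),
    tester_id="tester-2",
    preconditions="subscriber sub-001 has an active subscription",
    outcome="pass",
)
print(record.test_id, record.outcome)
```

Keeping the form fields explicit like this makes it straightforward to export results to the Word documents the plan calls for.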
Microsoft Word will be used to record and report test results.
Defects will be recorded on the Project HomePage via the Web.
The following table lists the test related tasks.
| Plan Test |
| Identify Requirements for Test |
| Assess Risk |
| Develop Test Strategy |
| Identify Test Resources |
| Create Schedule |
| Generate Test Plan |
| Design Test |
| Workload Analysis |
| Identify and Describe Test Cases |
| Identify and Structure Test Procedures |
| Review and Assess Test Coverage |
| Implement Test |
| Record or Program Test Scripts |
| Identify Test-Specific Functionality in the Design and Implementation Model |
| Establish External Data Sets |
| Execute Test |
| Execute Test Procedures |
| Evaluate Execution of Test |
| Recover from Halted Test |
| Verify the Results |
| Investigate Unexpected Results |
| Log Defects |
| Evaluate Test |
| Evaluate Test-Case Coverage |
| Evaluate Code Coverage |
| Analyze Defects |
| Determine if Test Completion Criteria and Success Criteria Have Been Achieved |
Copyright 1987 - 2003 Rational Software Corporation