Boston University Test Plan Example 1

See also IEEE 829 Template for a different format

Note: anything that would need more input from various resources is marked "tbd".

Overview

Introduction

The purpose of this test plan is to give all stakeholders a shared understanding of the requirements and of the various processes used to deliver the application described here: http://www.bu.edu/webcentral/about/exercise.pdf

Project Goals

The goal of this project is to create a seamless, well-designed process, accessible to the broadest possible base of users, that enables submission of website launch and relaunch information via a web-based form.

Although out of scope for this project, a related end goal is that the user-submitted data will be used by a separate application. That application will let support staff respond to the individual requests, and it will also allow them to enter information about the website owners who submit the requests (as distinct from the requests themselves).

User Scenarios

What real-world user activities are you going to try to mimic?

  • User has created or updated a website and would like to submit information about themselves and the website in order to gain (visibility? inclusion in directory? tbd)

How will you attempt to mimic these key scenarios?

  • Simulated user environment testing
  • User acceptance/usability testing with focus group

Are there special niche markets that the application is aimed at (intentionally or unintentionally) where mimicking real user scenarios is critical?

  • tbd

End-User Profiles

What classes of users (e.g. secretaries, artists, writers, animators, construction workers, airline pilots, shoemakers, etc.) are expected to use this application?

  • Website publishers
  • tbd

Features

The application features a friendly user interface that allows the user to submit the following information:

  1. Name
  2. E-mail address
  3. Phone number
  4. URL of site
  5. Notes
  6. Launch or Relaunch?
    1. If Launch, Description of Site
    2. If Relaunch, Date of Original Launch
    3. If Relaunch, Description of Changes

A subsequent "Thank You" confirmation page is generated upon form submission.
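
Since later test cases must exercise both conditional branches, it helps to make the field set explicit in test data. Below is a minimal sketch in Python; every field name is an assumption for illustration, not necessarily what submitsite.html actually uses.

  # Hypothetical model of the submission form, covering both conditional
  # branches. All field names are illustrative assumptions.
  def build_submission(kind="Launch"):
      data = {
          "name": "Jane Tester",                 # 1. Name (required)
          "email": "jane.tester@example.com",    # 2. E-mail address (required)
          "phone": "617-555-0100",               # 3. Phone number (required)
          "url": "http://www.example.com/",      # 4. URL of site (required)
          "notes": "Optional free-text notes.",  # 5. Notes (optional)
          "launch_or_relaunch": kind,            # 6. Launch or Relaunch
      }
      if kind == "Launch":
          data["site_description"] = "A new departmental site."
      else:  # Relaunch
          data["original_launch_date"] = "2008-01-15"
          data["change_description"] = "Redesigned navigation and templates."
      return data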

Feature History

The original goals of this application are to:

  1. Collect the requested data in a valid format
  2. Use the collected data in a subsequently developed, separate application that will let support staff not only respond to the individual requests, but also enter information about the website owners who submit the requests (as distinct from the requests themselves).

References

Stakeholders

Role                        Name   Email   IM    Phone   Skype
Project Manager             tbd    tbd     tbd   tbd     tbd
Developer                   tbd    tbd     tbd   tbd     tbd
Interdepartmental Liaison   tbd    tbd     tbd   tbd     tbd
Quality Assurance           tbd    tbd     tbd   tbd     tbd

Human Resource Requirements

How many people will it require to implement these plans?

  • tbd (estimate 1-2 people, depending on need and the extent to which other departments are included)

Are there currently enough people on staff right now? tbd

What effect will hiring more or fewer people have (slip in schedule, quality, or something else)? tbd

External Dependencies

Are there any groups or projects external to the team that are dependent on this application?

  • Support staff
  • tbd

Are there any external dependencies that our application relies on? If so, what testing/development problems and issues does this create? How are deliverables from external groups going to be tested and confirmed with our own application? Who are the individuals serving as primary contact and liaison in this relationship? tbd

Communication Channels

List any email aliases, bulletin boards, newsgroups, or other communication procedures and methodologies, and what each is used for. tbd

Decision Making Procedures

Who reviews decision points for the following sorts of things:

  1. Feature sign off
  2. Requirements change sign off
  3. Development design sign off
  4. Bug Triage
  5. Test Plan sign off

What is the process for the various decisions? E.g. which documents get updated and who gets notified? All tbd.

Regular Meetings

For each meeting: who attends, when and where it takes place, the agenda, goals, and expected output. tbd

Contingencies

Is there anything that may require test plans to change, and how will we react to those changes?

  1. The separate application referred to in the Project Goals section (out of scope for this document and project), which hopes to use the collected data, may require or desire more data from the user, causing design changes.
    1. These will be handled on an as-needed basis, with communication and verification of any changes, including any accompanying schedule changes, with all project stakeholders.

Specification/Requirements Review Procedures

Early in the development process, the specification will be reviewed by the development team and any questions or problems raised with the appropriate stakeholders. tbd

Test Strategy

Primary Testing Concerns

The primary testing focus will be on database and data entry functionality and user acceptance.

Primary Testing Focus

UI testing; database verification testing across the presentation, application, and data layers; and user acceptance testing.

Scope of Test Coverage

Because the application is not complex, full test coverage is expected with respect to #Functional Testing and #User Acceptance Testing. #Non-functional Testing coverage is tbd, dependent on time and resource constraints. Unit testing is tbd and based on existing methods.

Testing Schedule

Activity Type                 Responsible Party   Status        Start        End
Initial Requirements Review   Dave Forgianni      Done          2009-03-12   2009-03-12
Create Test Plan              Dave Forgianni      Done          2009-03-17   2009-03-19
Create Test Cases             tbd                 Not started   n/a          n/a

Include a pointer to any more detailed feature and team schedules here.

Key Issues

What are the top problems/issues that are recurring or remain open in this test plan? What problems remain unresolved? tbd

  • While building the app, the conditional JavaScript proved to be a problem area
  • User acceptance is a key issue
  • All the tbd's related to scheduling, etc.

Acceptance Criteria

How is "Good Enough To Ship" defined for the project? For the feature? What are the necessary performance, stability and bug find/fix rates to determine that the product is ready to ship? tbd

Test Approach

Test Tools

Here is a list of test tools that may be used, or need to be written, and for what purpose.

  1. Wiki for collaborative documentation evolution
  2. Spreadsheet application for test case authoring (the usual practice)
  3. Test Case Repository (via wiki)
  4. Firebug/IE Dev toolbar to help identify and communicate browser issues
  5. Open source Selenium IDE used to automate testing
  6. Any performance data collection tools (db, http)
  7. Open source Load testing tool tbd

Test Methods

Smoke Testing (acceptance test, build verification, etc.)

Smoke tests, run by ... and stored in the Test Case Repository, will be used to determine whether a build is good enough to be submitted to testing.

An alternative or complement to this activity is to employ a continuous integration development and migration environment.
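
For illustration, a smoke check of this kind can be scripted. The sketch below assumes Python with the requests library and a local test deployment; the URL is an assumption.

  # Minimal smoke test sketch: confirm the form page is served.
  import requests

  BASE = "http://localhost"  # assumed test server

  def smoke():
      r = requests.get(BASE + "/submitsite.html")
      assert r.status_code == 200, "form page did not load"
      assert "<form" in r.text.lower(), "no form found on the page"
      print("smoke test passed")

  if __name__ == "__main__":
      smoke()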

Manual Testing

What sorts of items will be tested manually rather than via automation?

  • Database accuracy

Why is manual testing being chosen over automation?

  • Dependent on available tool and/or resource availability for any custom test automation development

Where are the manual tests defined and located?

Automated Testing

What degree of automation will be used?

  • UI testing can be automated; the DB back-end checking may need to be manual
  • Performance testing may be semi-automated with regard to the collection of response-time data
  • Load testing can be automated

What platform/tools will be used to write the automated tests?

  • Selenium IDE

What will the automation focus on?

  • Simulated user form submission
  • Multiple user, network and db load simulation
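
As a sketch of the simulated form submission, here is what a scripted counterpart to a Selenium IDE recording could look like, using Selenium's Python WebDriver bindings. All locators are assumptions; real tests would use the names/ids defined in submitsite.html.

  # Automated form-submission sketch (Selenium WebDriver, Python).
  from selenium import webdriver
  from selenium.webdriver.common.by import By

  driver = webdriver.Firefox()
  try:
      driver.get("http://localhost/submitsite.html")  # assumed test URL
      driver.find_element(By.NAME, "name").send_keys("Jane Tester")
      driver.find_element(By.NAME, "email").send_keys("jane@example.com")
      driver.find_element(By.NAME, "phone").send_keys("617-555-0100")
      driver.find_element(By.NAME, "url").send_keys("http://www.example.com/")
      driver.find_element(By.NAME, "submit").click()
      # The spec says a "Thank You" page follows a successful submission.
      assert "thank you" in driver.page_source.lower()
  finally:
      driver.quit()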

Where are the automated tools, suites and sources checked in?

Regression Testing

What is your general regression strategy?

  • If a specific test (not an exploratory test) is done once, it can be documented, hopefully automated, and repeated to ensure fixed bugs don't get reopened as the code evolves

Are you going to automate?

  • Where possible/tbd

Where are the regressions stored?

How often will they be re-run?

  • tbd

Test Cases

Test cases will be used to document and re-use specific paths and parameters and stored in a Test Case Repository.

Bug Reporting

What tool(s) will be used to report bugs?

  • Bugzilla/current tools

Where are the bug reports located?

  • link to tool w/ saved query

Are there any special pieces of information regarding categorization of bugs that should be reported here (areas, keywords, etc.)?

  • tbd

Bug Triage

What is your strategy for bug triage?

  • Bug triage meetings as needed

Functional Testing

Specification/Requirements Validation

How much will QA be involved in the early stage requirements/spec review?

  • As early as is deemed fit by the project stakeholders

Design Validation

Will testing review design? Is design an issue on this release? How much concern does testing have regarding design?

  • tbd

Data Validation

What types of data require validation?

  1. Name
    1. Required field
    2. Textbox input
    3. UI Validation: max length = ?; other?
    4. Database field name: DB constraints:
  2. E-mail address
    1. Required field
    2. Textbox input
    3. UI Validation: valid email address format; max length = ?; other?
    4. Database field name: DB constraints:
  3. Phone number
    1. Required field
    2. Textbox input(s?)
    3. UI Validation: max length = ?; format?
    4. Database field name: DB constraints:
  4. URL of site
    1. Required field
    2. Textbox input
    3. UI Validation: url format?; max length = ?; other?
    4. Database field name: DB constraints:
  5. Notes
    1. Not a required field
    2. Textarea input
    3. UI Validation: max length = ?;
    4. Database field name: DB constraints:
  6. Launch or Relaunch
    1. Required field
    2. Droplist/Radio button selection
    3. Validation: none
    4. Database field name: DB constraints:
      1. If Launch, Description of Site
        1. Conditionally Required field
        2. Textarea input
        3. Validation: maxlength?
        4. Database field name: DB constraints:
      2. If Relaunch, Date of original launch
        1. Conditionally Required field
        2. Textarea input
        3. Validation: maxlength?
        4. Database field name: DB constraints:
      3. If Relaunch, Description of Changes
        1. Conditionally Not required field
        2. Textarea input
        3. Validation: maxlength?
        4. Database field name: DB constraints:
  7. Submit button
    1. Submits form data and produces
      1. Thank you page (spec tbd)
  8. Reset/Clear Form button
    1. Clears all fields (including conditional)

All data listed needs to be verified in the DB (see available queries/test cases); one scripted approach is sketched below.
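
A sketch of such a verification in Python with mysql-connector-python follows; the database, table, and column names are assumptions, and the authoritative schema is in CreateDbAndTables.sql.

  # DB verification sketch: confirm a submitted record landed intact.
  import mysql.connector  # assumed driver choice

  conn = mysql.connector.connect(
      host="localhost", user="testuser", password="secret",
      database="submitsite")  # assumed credentials and db name
  cur = conn.cursor()
  cur.execute(
      "SELECT name, email, phone, url FROM submissions WHERE email = %s",
      ("jane@example.com",))  # assumed table/column names
  row = cur.fetchone()
  assert row is not None, "submission not found in the DB"
  assert row[0] == "Jane Tester", "name column does not match input"
  conn.close()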

Error Testing

How does the program handle error conditions?

  • tbd (can add to section above?)

What testing methodology will be used to evoke and determine proper behavior for error conditions? What feedback mechanism is being given to the user, and is it sufficient? What criteria will be used to define sufficient error recovery?

  • tbd based on any existing organizational standards
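
Once those standards are settled, negative-path tests can be scripted. The sketch below assumes the client-side validator (gen_validatorv31.js) rejects an empty required field with a JavaScript alert; that behavior is an assumption to confirm, not a documented spec.

  # Negative-path sketch: an empty required field should be rejected.
  from selenium import webdriver
  from selenium.webdriver.common.by import By
  from selenium.common.exceptions import NoAlertPresentException

  driver = webdriver.Firefox()
  try:
      driver.get("http://localhost/submitsite.html")  # assumed test URL
      # Deliberately leave the required "name" field empty.
      driver.find_element(By.NAME, "email").send_keys("jane@example.com")
      driver.find_element(By.NAME, "submit").click()
      try:
          alert = driver.switch_to.alert  # assumed alert-based validation
          print("validator message:", alert.text)
          alert.accept()
      except NoAlertPresentException:
          raise AssertionError("empty required field was not rejected")
  finally:
      driver.quit()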

API Testing

What level of API testing will be performed?

  • If any database connectivity is desired by separate applications, that testing is currently out of scope for this project

What is the justification for taking this approach (only if none is being taken)?

  • Scope

Content Testing

Is your application content based? What is the nature of the content?

  • No, but there may be help files associated with the application that will require review.

What strategies will be employed to address content related issues?

  • Quality Assurance will be responsible for reviewing all help files, accompanying announcement or ongoing emails/publications/communications that are used.

Low-Resource Testing

What resources does your application use?

  • Browser based, end-user resource requirements tbd

Which are used most, and are most likely to cause problems? What tools/methods will be used in testing to cover low resource issues? tbd

Setup Testing

How is your application affected by end-user setup? What are the necessary requirements for a successful end-user setup?

  • tbd/depends on methods used and end-user accessibility req's

What is the testing approach that will be employed to confirm valid setup of the feature?

  • Target user market research

Modes and Runtime Options

What are the different run time modes the program can be in? Are there views that can be turned off and on? Controls that toggle visibility states? Are there options a user can set which will affect the run of the program?

  • When the user makes a selection for feature 6 (Launch or Relaunch), input fields are dynamically displayed or hidden
  • This feature will need focused testing (see the sketch below)
  • This feature will require compatibility testing
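
A focused check of that conditional display, sketched with Selenium's Python bindings; the locators, and the assumption that a droplist (rather than radio buttons) drives the toggle, are illustrative only.

  # Conditional-display sketch: choosing "Relaunch" should reveal the
  # original-launch-date field. Locators are assumptions.
  from selenium import webdriver
  from selenium.webdriver.common.by import By
  from selenium.webdriver.support.ui import Select

  driver = webdriver.Firefox()
  try:
      driver.get("http://localhost/submitsite.html")  # assumed test URL
      Select(driver.find_element(By.NAME, "launch_or_relaunch")) \
          .select_by_visible_text("Relaunch")
      date_field = driver.find_element(By.NAME, "original_launch_date")
      assert date_field.is_displayed(), "conditional field was not shown"
  finally:
      driver.quit()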

Interoperability

How will this product interact with other products?

  • Out of current scope

What level of knowledge does it need to have about other programs -- "good neighbor", program cognizant, program interaction, fundamental system changes?

  • n/a

What methods will be used to verify these capabilities?

  • n/a

Integration Testing

Integration with external application(s) is currently out of scope

Compatibility: Clients

This section needs further review or removal.

Is your feature a server based component that interacts with clients?

  • Yes

Is there a standard protocol that many clients are expected to use?

  • HTTP

How will you approach testing client compatibility?

  • tbd based on req's

Is your server suited to handle ill-behaved clients?

  • tbd

Are there subtleties in the interpretation of standard protocols that might cause incompatibilities?

  • tbd (e.g. javascript and browser compatibility)

Are there non-standard, but widely practiced use of your protocols that might cause incompatibilities?

  • tbd
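
Although the approach is tbd, one likely pattern is to run the same checks across several browsers. A minimal Python/Selenium sketch; the browser list is an assumption pending requirements.

  # Cross-browser sketch: run one page-load check per browser.
  from selenium import webdriver

  def check_page(driver):
      driver.get("http://localhost/submitsite.html")  # assumed test URL
      assert "submit" in driver.page_source.lower()

  for make_driver in (webdriver.Firefox, webdriver.Chrome):  # assumed set
      driver = make_driver()
      try:
          check_page(driver)
          print(make_driver.__name__, "ok")
      finally:
          driver.quit()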

Compatibility: Servers

This section needs further review or removal.

Is this application a client based component that interacts with servers?

  • yes

Is there a standard protocol supported by many servers that your client speaks? How many different servers will your client program need to support? How will you approach testing server compatibility? Is your client suited to handle ill-behaved or non-standard servers? Are there subtleties in the interpretation of standard protocols that might cause incompatibilities? Are there non-standard, but widely practiced use of protocols that might cause incompatibilities?

  • tbd

Environment/System - General

Are there issues regarding the environment, system, or platform that should get special attention in the test plan?

  • conditional form display and manipulation
  • browser compatibility

What are the run time modes and options in the environment that may cause differences in the feature?

  • conditional form display and manipulation

Are there platform or system specific compliance issues that must be maintained?

  • javascript/browser compat tbd based on req's

Configuration

Are there configuration issues regarding hardware and software in the environment that may get special attention in the test plan?

  • n/a or tbd

User Upgrades/Updates

How will we handle users having previous/future versions of any required software?

  • Any Flash/browser version issues apply here; tbd

Non-functional Testing

Performance & Capacity Testing

How fast and how much can the feature do? Does it do enough fast enough? What testing methodology will be used to determine this information?

  • req's tbd; recording of performance stats using available open source tools
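
As an example of the kind of recording intended, the Python sketch below times repeated requests to the form page and reports simple statistics; the target URL and sample size are assumptions, not an agreed methodology.

  # Response-time collection sketch.
  import statistics
  import time
  import requests

  URL = "http://localhost/submitsite.html"  # assumed test target
  samples = []
  for _ in range(50):  # assumed sample size
      start = time.perf_counter()
      requests.get(URL)
      samples.append(time.perf_counter() - start)

  print("mean %.3fs  median %.3fs  max %.3fs" % (
      statistics.mean(samples), statistics.median(samples), max(samples)))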

What criteria will be used to indicate acceptable performance? If this modifies an existing product, what are the current metrics? What are the expected major bottlenecks and performance problem areas on this feature?

  • tbd based on technologies chosen

Scalability

Is the ability to scale and expand this feature a major requirement?

  • tbd

What parts of the feature are most likely to have scalability problems?

  • server load
  • db size and load

What approach will testing use to define the scalability issues in the feature?

  • tbd based on technology choices

Stress Testing

How does the feature do when pushed beyond its performance and capacity limits? How is its recovery? What is its breakpoint? What is the user experience when this occurs? What is the expected behavior when the client reaches stress levels? What testing methodology will be used to determine this information? What area is expected to have the most stress related problems?

  • tbd

Volume Testing

Volume testing differs from performance and stress testing insofar as it focuses on doing volumes of work in realistic environments, durations, and configurations. Run the software as the expected user will: with certain other components running, for so many hours, with data sets of a certain size, or with a certain expected number of repetitions.

  • tbd

Robustness

How stable is the code base? Does it break easily? Are there memory leaks? Are there portions of code prone to crash, save failure, or data corruption? How good is the program's recovery when these problems occur? How is the user affected when the program behaves incorrectly? What is the testing approach to find these problem areas? What is the overall robustness goal and criteria?

  • tbd

Boundaries and Limits

Are there particular boundaries and limits inherent in the feature or area that deserve special mention here? What is the testing methodology to discover problems handling these boundaries and limits?

  • tbd

Special Code Profiling and Other Metrics

How much focus will be placed on code coverage? What tools and methods will be used to measure the degree to which testing coverage is sufficiently addressing all of the code?

  • tbd

User Acceptance Testing

Accessibility

Is the feature designed in compliance with accessibility guidelines? Could a user with special accessibility requirements still be able to utilize this feature? What is the criteria for acceptance on accessibility issues on this feature? What is the testing approach to discover problems and issues? Are there particular parts of the feature that are more problematic than others?

  • tbd

International Issues

Confirm localized functionality, that strings are localized and that code pages are mapped properly. Assure program works properly on localized builds, and that international settings in the program and environment do not break functionality. How is localization and internationalization being done on this project? List those parts of the feature that are most likely to be affected by localization. State methodology used to verify International sufficiency and localization.

  • tbd

Usability

What are the major usability issues on the feature? What is testing's approach to discover more problems? What sorts of usability tests and studies have been performed, or will be performed? What is the usability goal and criteria for this feature?

  • unknown/tbd

User Interface

List the items in the feature that explicitly require a user interface... Is the user interface designed such that a user will be able to use the feature satisfactorily?

  • tbd

Which part of the user interface is most likely to have bugs?

  • conditional dynamic form display

How will the interface testing be approached?

  • automated, manual

Beta Testing

What is the beta schedule? What is the distribution scale of the beta? What are the entry criteria for beta? How does testing plan to utilize the beta for feedback on this feature? What problems do you anticipate discovering in the beta? Who is coordinating the beta, and how?

  • tbd

Test Environment

What are the requirements for the test environment setup?

  • tbd based on req's

Operating Systems

Networks

  • TCP/IP

Hardware

Identify the various hardware platforms and configurations.

  1. User Machines
    1. tbd
  2. Server Machines
    1. list tbd

Software

Identify software included with the product or likely to be used in conjunction with this product. Software categories would include memory managers, extenders, some TSRs, related tools or products, or similar category products.

  1. User Machines
    1. tbd
  2. Server Machines
    1. list tbd

Operational Issues

Operational Problem Escalation/Alert Methods

Is the program being monitored/maintained by an operational staff? Are there special problem escalation, or operational procedures for dealing with the problems after release?

  • tbd

Product Support

What aspects of this feature have been a problem for support in the past? How are those problems being addressed? What aspects of this feature will likely cause future support problems? How are those problems being resolved? What testing methodology is being used to prevent future support problems? How is information being sent to support regarding problems and functionality of the feature?

  • tbd

Files, Modules and Setup

Include in this section any files, modules and code that must be distributed on the machines, and where they would be located. Also include setup procedures, de-installation procedures, special database and utility setups, and any other relevant data.

Files List

  1. ReadMe.txt
  2. CreateDbAndTables.sql
  3. insert.php
  4. gen_validatorv31.js
  5. submitsite.html
  6. TestNotes.txt

Download SampleApplication.zip

Setup Procedures

Prerequisites

  1. Web server installed
  2. PHP installed
  3. MySQL installed

Setup

  1. Open the file CreateDbAndTables.sql and execute the commands in the specified order
  2. Edit the MySQL connection information in the file "insert.php" on line 2 to be appropriate and save
  3. Copy these files into a web directory:
    1. gen_validatorv31.js
    2. submitsite.html
    3. insert.php
  4. Open http://localhost/submitsite.html in a web browser

De-installation Procedures

  • tbd

Database Setup and Procedures

  • see above

Network Domain/Topologies Configuration Procedures

  • tbd

Code Migration

Is there data, script, code or other artifacts from previous versions that will need to be migrated to a new version? Testing should create an example of installation with an old version, and migrate that example to the new version, moving all data and scripts into the new format.

List here all data files, formats, or code that would be affected by migration, the solution for migration, and how testing will approach each.

Code from Development to Testing Procedures

Define the process for handing off the code between Development and Testing.

Code from Testing to Release Procedures

Define the process for releasing code from Staging to Production.

Monitoring

Does the service have adequate monitoring messages to indicate status, performance, or error conditions? When something goes wrong, are messages sufficient for operational staff to know what to do to restore proper functionality? Are there "heartbeat" counters that indicate whether or not the program or service is working? Attempt to mimic the scenario of an operational staff trying to keep a service up and running.

  • tbd
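
Pending real monitoring requirements, a heartbeat poll along these lines could serve as a stopgap; the URL, interval, and alert channel (stdout here) are all assumptions.

  # Heartbeat sketch: poll the form page and report status each minute.
  import time
  import requests

  URL = "http://localhost/submitsite.html"  # assumed production URL

  while True:
      try:
          r = requests.get(URL, timeout=10)
          status = "UP" if r.status_code == 200 else "DEGRADED (%d)" % r.status_code
      except requests.RequestException as exc:
          status = "DOWN (%s)" % exc
      print(time.strftime("%Y-%m-%d %H:%M:%S"), status)
      time.sleep(60)  # assumed polling interval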

Performance Monitoring Counters Setup And Configurations

  1. User analytics tracking code/cookies? tbd
  2. DB/Application server performance monitoring? tbd

Disaster Planning

Backup

Identify all files representing data and machine state, and indicate how those will be backed up. If it is imperative that service remain running, determine whether or not it is possible to backup the data and still keep services or code running.

  1. Will there be regular database backups performed?
  2. Where will the backups be stored, who has access to them, and under what conditions is a restoration applicable?
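
If regular dumps are adopted, a minimal nightly-backup sketch follows; the database name, credentials, and retention policy are tbd and assumed here.

  # Backup sketch: dump the application database with mysqldump.
  import subprocess
  import time

  outfile = "submitsite-%s.sql" % time.strftime("%Y%m%d")
  with open(outfile, "w") as f:
      subprocess.run(
          ["mysqldump", "-u", "testuser", "-psecret", "submitsite"],
          stdout=f, check=True)  # assumed credentials and db name
  print("backup written to", outfile)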

Recovery

If the program goes down, or must be shut down, are there steps and procedures that will restore program state and get the program or service operational again? Are there holes in this process that may leave a service or state deficient? Are there holes that could cause loss of data? Mimic as many loss-of-service states as are likely to happen, and go through the process of successfully restoring service.

  1. See #Backup: where backups are stored, who has access to them, and under what conditions a restoration is applicable

Archiving

Archival is different from backup. Backup is when data is saved in order to restore service or program state. Archive is when data is saved for retrieval later. Most archival and backup systems piggy-back on each other's processes.

Is archival of data going to be considered a crucial operational issue on your feature? If so, is it possible to archive the data without taking the service down? Is the data, once archived, readily accessible?

Notes