ISO/IEC/IEEE 29119-3:2013
$81.25
Software and systems engineering - Software testing - Part 3: Test documentation
Published By | Publication Date | Number of Pages |
---|---|---|
IEEE | 2013 | 138 |
New IEEE Standard – Active. The purpose of the ISO/IEC/IEEE 29119 series of software testing standards is to define an internationally agreed set of standards for software testing that can be used by any organization when performing any form of software testing. ISO/IEC/IEEE 29119-3 includes templates and examples of test documentation. The templates are arranged within clauses reflecting the overall test process description structure in ISO/IEC/IEEE 29119-2, i.e. by the test process in which they are produced. Annex A contains outlines of the contents of each document. Annex B contains mappings to ISO/IEC/IEEE 29119-2. Annex C contains an overview of the examples. Annexes D to S contain examples of the application of the templates. Annex T provides mappings to existing standards. The Bibliography for this part of ISO/IEC/IEEE 29119 is at the end of the document. ISO/IEC/IEEE 29119-3 supports dynamic testing, functional and non-functional testing, manual and automated testing, and scripted and unscripted testing. The documentation templates defined in ISO/IEC/IEEE 29119-3 can be used in conjunction with any software development lifecycle model.
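As the catalog below shows, each template enumerates a fixed set of information items; for instance, the Test Case Specification in clause 7.3 records a unique identifier, objective, priority, traceability, preconditions, inputs, expected results, and the actual results and test result. As a purely illustrative sketch of how a team might capture those fields in machine-readable form, the following Python dataclass mirrors that outline; the class and field names are assumptions for illustration and are not defined by the standard.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative sketch only: the fields mirror the Test Case Specification
# outline in clause 7.3 of ISO/IEC/IEEE 29119-3; the class itself is a
# hypothetical convenience, not part of the standard.
@dataclass
class TestCase:
    unique_identifier: str                                      # 7.3.5.2 Unique identifier
    objective: str                                              # 7.3.5.3 Objective
    priority: int                                               # 7.3.5.4 Priority
    traceability: List[str] = field(default_factory=list)       # 7.3.5.5 Traceability (e.g. requirement IDs)
    preconditions: List[str] = field(default_factory=list)      # 7.3.5.6 Preconditions
    inputs: List[str] = field(default_factory=list)             # 7.3.5.7 Inputs
    expected_results: List[str] = field(default_factory=list)   # 7.3.5.8 Expected results
    actual_results: Optional[str] = None                        # 7.3.5.9 Actual results ...
    test_result: Optional[str] = None                           # ... and test result (e.g. "pass"/"fail")

# Example usage with hypothetical values:
tc = TestCase(
    unique_identifier="TC-001",
    objective="Verify login with valid credentials",
    priority=1,
    traceability=["REQ-42"],
    preconditions=["User account exists"],
    inputs=["username=alice", "password=secret"],
    expected_results=["User is redirected to the dashboard"],
)
print(tc.unique_identifier, tc.objective)
```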
PDF Catalog
PDF Pages | PDF Title |
---|---|
1 | ISO/IEC/IEEE 29119-3 Front Cover |
3 | CONTENTS |
7 | Foreword |
8 | Introduction |
9 | 1 Scope |
11 | 2 Conformance 2.1 Intended usage 2.2 Types of conformance 2.2.1 Full Conformance 2.2.2 Tailored Conformance |
12 | 3 Normative references 4 Terms and definitions |
17 | 5 Organizational Test Process Documentation 5.1 Overview 5.2 Test Policy 5.2.1 Overview 5.2.2 Document specific information 5.2.2.1 Overview 5.2.2.2 Unique identification of document 5.2.2.3 Issuing organization 5.2.2.4 Approval authority 5.2.2.5 Change history |
18 | 5.2.3 Introduction 5.2.3.1 Scope 5.2.3.2 References 5.2.3.3 Glossary 5.2.4 Test policy statements 5.2.4.1 Objectives of testing 5.2.4.2 Test process 5.2.4.3 Test organization structure 5.2.4.4 Tester training 5.2.4.5 Tester ethics |
19 | 5.2.4.6 Standards 5.2.4.7 Other relevant policies 5.2.4.8 Measuring the value of testing 5.2.4.9 Test asset archiving and reuse 5.2.4.10 Test process improvement 5.3 Organizational Test Strategy 5.3.1 Overview |
20 | 5.3.2 Document specific information 5.3.2.1 Overview 5.3.2.2 Unique identification of document 5.3.2.3 Issuing organization 5.3.2.4 Approval authority |
21 | 5.3.2.5 Change history 5.3.3 Introduction 5.3.3.1 Scope 5.3.3.2 References 5.3.3.3 Glossary 5.3.4 Project-wide organizational test strategy statements 5.3.4.1 Generic risk management 5.3.4.2 Test selection and prioritization 5.3.4.3 Test documentation and reporting |
22 | 5.3.4.4 Test automation and tools 5.3.4.5 Configuration management of test work products 5.3.4.6 Incident management 5.3.4.7 Test sub-processes 5.3.5 Test sub-process-specific organizational test strategy statements 5.3.5.1 Entry and exit criteria 5.3.5.2 Test completion criteria 5.3.5.3 Test documentation and reporting 5.3.5.4 Degree of independence |
23 | 5.3.5.5 Test design techniques 5.3.5.6 Test environment 5.3.5.7 Metrics to be collected 5.3.5.8 Retesting and regression testing 6 Test Management Processes Documentation 6.1 Overview 6.2 Test Plan 6.2.1 Overview |
24 | 6.2.2 Document specific information 6.2.2.1 Overview 6.2.2.2 Unique identification of document 6.2.2.3 Issuing organization 6.2.2.4 Approval authority 6.2.2.5 Change history 6.2.3 Introduction 6.2.3.1 Scope 6.2.3.2 References |
25 | 6.2.3.3 Glossary 6.2.4 Context of the testing 6.2.4.1 Project(s) / test sub-process(es) 6.2.4.2 Test item(s) 6.2.4.3 Test scope 6.2.4.4 Assumptions and constraints 6.2.4.5 Stakeholders 6.2.5 Testing communication |
26 | 6.2.6 Risk register 6.2.6.1 Product risks 6.2.6.2 Project risks 6.2.7 Test strategy 6.2.7.1 Test sub-processes 6.2.7.2 Test deliverables |
27 | 6.2.7.3 Test design techniques 6.2.7.4 Test completion criteria 6.2.7.5 Metrics to be collected 6.2.7.6 Test data requirements 6.2.7.7 Test environment requirements 6.2.7.8 Retesting and regression testing 6.2.7.9 Suspension and resumption criteria |
28 | 6.2.7.10 Deviations from the Organizational Test Strategy 6.2.8 Testing activities and estimates 6.2.9 Staffing 6.2.9.1 Roles, activities, and responsibilities 6.2.9.2 Hiring needs 6.2.9.3 Training needs 6.2.10 Schedule |
29 | 6.3 Test Status Report 6.3.1 Overview 6.3.2 Document specific information 6.3.2.1 Overview 6.3.2.2 Unique identification of document 6.3.2.3 Issuing organization 6.3.2.4 Approval authority 6.3.2.5 Change history 6.3.3 Introduction |
30 | 6.3.3.1 Scope 6.3.3.2 References 6.3.3.3 Glossary 6.3.4 Test status 6.3.4.1 Reporting period 6.3.4.2 Progress against Test Plan 6.3.4.3 Factors blocking progress 6.3.4.4 Test measures 6.3.4.5 New and changed risks 6.3.4.6 Planned testing |
31 | 6.4 Test Completion Report 6.4.1 Overview 6.4.2 Document specific information 6.4.2.1 Overview 6.4.2.2 Unique identification of document 6.4.2.3 Issuing organization 6.4.2.4 Approval authority 6.4.2.5 Change history 6.4.3 Introduction 6.4.3.1 Scope |
32 | 6.4.3.2 References 6.4.3.3 Glossary 6.4.4 Testing performed 6.4.4.1 Summary of testing performed 6.4.4.2 Deviations from planned testing 6.4.4.3 Test completion evaluation 6.4.4.4 Factors that blocked progress 6.4.4.5 Test measures 6.4.4.6 Residual risks 6.4.4.7 Test deliverables |
33 | 6.4.4.8 Reusable test assets 6.4.4.9 Lessons learned 7 Dynamic Test Processes Documentation 7.1 Overview |
34 | 7.2 Test Design Specification 7.2.1 Overview 7.2.2 Document specific information 7.2.2.1 Overview 7.2.2.2 Unique identification of document 7.2.2.3 Issuing organization 7.2.2.4 Approval authority 7.2.2.5 Change history 7.2.3 Introduction 7.2.3.1 Scope |
35 | 7.2.3.2 References 7.2.3.3 Notation convention(s) 7.2.3.4 Glossary 7.2.4 Feature sets 7.2.4.1 Overview 7.2.4.2 Unique identifier 7.2.4.3 Objective 7.2.4.4 Priority 7.2.4.5 Specific strategy |
36 | 7.2.4.6 Traceability 7.2.5 Test conditions 7.2.5.1 Overview 7.2.5.2 Unique identifier 7.2.5.3 Description 7.2.5.4 Priority 7.2.5.5 Traceability |
37 | 7.3 Test Case Specification 7.3.1 Overview 7.3.2 Document specific information 7.3.2.1 Overview 7.3.2.2 Unique identification of document 7.3.2.3 Issuing organization 7.3.2.4 Approval authority 7.3.2.5 Change history 7.3.3 Introduction 7.3.3.1 Scope |
38 | 7.3.3.2 References 7.3.3.3 Notation convention(s) 7.3.3.4 Glossary 7.3.4 Test coverage items 7.3.4.1 Overview 7.3.4.2 Unique identifier 7.3.4.3 Description |
39 | 7.3.4.4 Priority 7.3.4.5 Traceability 7.3.5 Test cases 7.3.5.1 Overview 7.3.5.2 Unique identifier 7.3.5.3 Objective 7.3.5.4 Priority 7.3.5.5 Traceability 7.3.5.6 Preconditions |
40 | 7.3.5.7 Inputs 7.3.5.8 Expected results 7.3.5.9 Actual results and test result 7.4 Test Procedure Specification 7.4.1 Overview |
41 | 7.4.2 Document specific information 7.4.2.1 Overview 7.4.2.2 Unique identification of document 7.4.2.3 Issuing organization 7.4.2.4 Approval authority 7.4.2.5 Change history 7.4.3 Introduction 7.4.3.1 Scope 7.4.3.2 References 7.4.3.3 Notation convention(s) |
42 | 7.4.3.4 Glossary 7.4.4 Test sets 7.4.4.1 Overview 7.4.4.2 Unique identifier 7.4.4.3 Objective 7.4.4.4 Priority 7.4.4.5 Contents (Traceability) 7.4.5 Test procedures 7.4.5.1 Overview |
43 | 7.4.5.2 Unique identifier 7.4.5.3 Objective 7.4.5.4 Priority 7.4.5.5 Start up 7.4.5.6 Test cases to be executed (Traceability) 7.4.5.7 Relationship to other procedures 7.4.5.8 Stop and wrap up 7.5 Test Data Requirements 7.5.1 Overview |
44 | 7.5.2 Document specific information 7.5.2.1 Overview 7.5.2.2 Unique identification of document 7.5.2.3 Issuing organization 7.5.2.4 Approval authority 7.5.2.5 Change history 7.5.3 Introduction 7.5.3.1 Scope 7.5.3.2 References |
45 | 7.5.3.3 Glossary 7.5.4 Detailed test data requirements 7.5.4.1 Overview 7.5.4.2 Unique identifier 7.5.4.3 Description 7.5.4.4 Responsibility 7.5.4.5 Period needed 7.5.4.6 Resetting needs 7.5.4.7 Archiving or disposal |
46 | 7.6 Test Environment Requirements 7.6.1 Overview 7.6.2 Document specific information 7.6.2.1 Overview 7.6.2.2 Unique identification of document 7.6.2.3 Issuing organization 7.6.2.4 Approval authority 7.6.2.5 Change history 7.6.3 Introduction |
47 | 7.6.3.1 Scope 7.6.3.2 References 7.6.3.3 Glossary 7.6.4 Detailed test environment requirements 7.6.4.1 Overview 7.6.4.2 Unique identifier |
48 | 7.6.4.3 Description 7.6.4.4 Responsibility 7.6.4.5 Period needed 7.7 Test Data Readiness Report 7.7.1 Overview 7.7.2 Document specific information 7.7.2.1 Overview 7.7.2.2 Unique identification of document 7.7.2.3 Issuing organization 7.7.2.4 Approval authority 7.7.2.5 Change history |
49 | 7.7.3 Introduction 7.7.3.1 Scope 7.7.3.2 References 7.7.3.3 Glossary 7.7.4 Test data status 7.7.4.1 Overview 7.7.4.2 Unique identifier 7.7.4.3 Description of status 7.8 Test Environment Readiness Report 7.8.1 Overview |
50 | 7.8.2 Document specific information 7.8.2.1 Overview 7.8.2.2 Unique identification of document 7.8.2.3 Issuing organization 7.8.2.4 Approval authority 7.8.2.5 Change history 7.8.3 Introduction 7.8.3.1 Scope 7.8.3.2 References 7.8.3.3 Glossary |
51 | 7.8.4 Test environment readiness 7.8.4.1 Overview 7.8.4.2 Unique identifier 7.8.4.3 Description of status 7.9 Actual Results 7.10 Test Result |
52 | 7.11 Test Execution Log 7.11.1 Overview 7.11.2 Document specific information 7.11.2.1 Overview 7.11.2.2 Unique identification of document 7.11.2.3 Issuing organization 7.11.2.4 Approval authority 7.11.2.5 Change history 7.11.3 Introduction 7.11.3.1 Scope |
53 | 7.11.3.2 References 7.11.3.3 Glossary 7.11.4 Events 7.11.4.1 Overview 7.11.4.2 Unique identifier 7.11.4.3 Time 7.11.4.4 Description 7.11.4.5 Impact 7.12 Test Incident Reporting 7.12.1 Overview |
54 | 7.12.2 Incident Report 7.12.3 Document specific information 7.12.3.1 Overview 7.12.3.2 Unique identification of document 7.12.3.3 Issuing organization 7.12.3.4 Approval authority 7.12.3.5 Change history 7.12.4 Introduction |
55 | 7.12.4.1 Scope 7.12.4.2 References 7.12.4.3 Glossary 7.12.5 Incident details 7.12.5.1 Timing information 7.12.5.2 Originator 7.12.5.3 Context 7.12.5.4 Description of the incident |
56 | 7.12.5.5 Originator’s assessment of severity 7.12.5.6 Originator’s assessment of priority 7.12.5.7 Risk 7.12.5.8 Status of the incident |
57 | Annex A (informative) Overview and Outlines of Documents |
66 | Annex B (informative) ISO/IEC/IEEE 29119-2 Normative Requirements Mapped to ISO/IEC/IEEE 29119-3 Information Items |
71 | Annex C (informative) Overview of Examples |
73 | Annex D (informative) Test Policy |
75 | Annex E (informative) Organizational Test Strategy |
80 | Annex F (informative) Test Plan |
93 | Annex G (informative) Test Status Report |
96 | Annex H (informative) Test Completion Report |
99 | Annex I (informative) Test Design Specification |
107 | Annex J (informative) Test Case Specification |
112 | Annex K (informative) Test Procedure Specification |
115 | Annex L (informative) Test Data Requirements |
117 | Annex M (informative) Test Environment Requirements |
119 | Annex N (informative) Test Data Readiness Report |
120 | Annex O (informative) Test Environment Readiness Report |
121 | Annex P (informative) Actual Results |
123 | Annex Q (informative) Test Result |
125 | Annex R (informative) Test Execution Log |
126 | Annex S (informative) Incident Report |
128 | Annex T (informative) Mappings to Existing Standards |
135 | Bibliography |
137 | IEEE Notice to Users; IEEE Participants |