ISTQB CTFL (v4) Chapter 4

(1) Black-Box Test Techniques

Equivalence Partitioning (EP)

  • Divides data into partitions including:
  • Inputs, outputs, configuration items
  • Internal values, time-related values, interface parameters
  • Partition characteristics:
  • Continuous/discrete, ordered/unordered, finite/infinite
  • Non-overlapping and non-empty sets
  • Valid partition (valid values) vs invalid partition (invalid values)
  • Coverage formula:
    Coverage = (no. of partitions exercised by at least one test case / total no. of identified partitions) × 100%
  • 100% coverage requires exercising all partitions, including invalid partitions (see the sketch below)
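
As a minimal sketch (not from the syllabus), suppose an age field accepts integers from 0 to 120. The hypothetical `is_valid_age()` below has one valid and two invalid partitions; one representative value per partition is enough for 100% EP coverage:

```python
import pytest

def is_valid_age(age: int) -> bool:
    """Assumed system under test: accepts ages from 0 to 120 inclusive."""
    return 0 <= age <= 120

# One representative value per partition gives 100% EP coverage (3 / 3 partitions).
@pytest.mark.parametrize("age, expected", [
    (-5, False),   # invalid partition: age < 0
    (30, True),    # valid partition: 0 <= age <= 120
    (150, False),  # invalid partition: age > 120
])
def test_equivalence_partitions(age, expected):
    assert is_valid_age(age) == expected
```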

Boundary Value Analysis (BVA)

  • Focuses on boundary values of ordered partitions
  • Boundary values: Minimum and maximum values of a partition
  • Approaches:
  • 2-value BVA: Boundary value + closest neighbor in adjacent partition
  • 3-value BVA: Boundary value + both neighbors
  • Coverage formula:
    Coverage = (no. of boundary values exercised by at least one test case / total no. of identified boundary values) × 100%
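
Continuing the assumed 0–120 age range from the EP sketch, the boundaries of the valid partition are 0 and 120. A small sketch of the two approaches:

```python
# Hypothetical example: valid age range 0-120.
valid_min, valid_max = 0, 120

# 2-value BVA: each boundary plus its closest neighbour in the adjacent partition.
two_value_bva = {valid_min - 1, valid_min, valid_max, valid_max + 1}   # {-1, 0, 120, 121}

# 3-value BVA: each boundary plus both of its neighbours.
three_value_bva = {valid_min - 1, valid_min, valid_min + 1,
                   valid_max - 1, valid_max, valid_max + 1}            # {-1, 0, 1, 119, 120, 121}

if __name__ == "__main__":
    print(sorted(two_value_bva))
    print(sorted(three_value_bva))
```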

Decision Table Testing

  • Records complex logic/business rules
  • Defines conditions and resulting actions
  • Each column = decision rule
  • Coverage formula:
    Coverage = (no. of decision rules exercised by at least one test case / total no. of decision rules) × 100%
  • 100% coverage requires testing every column (decision rule); see the sketch below
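
A minimal sketch with a hypothetical `discount()` business rule: two binary conditions give four decision rules (columns), and exercising all four yields 100% decision table coverage:

```python
import pytest

def discount(is_member: bool, total: float) -> float:
    """Assumed business rule: members get 10%, plus 5% extra on orders of 100 or more."""
    rate = 0.0
    if is_member:
        rate += 0.10
        if total >= 100:
            rate += 0.05
    return rate

# Full decision table: each tuple is one column (decision rule).
#   is_member | order >= 100 | expected discount
DECISION_RULES = [
    (True,  True,  0.15),   # R1
    (True,  False, 0.10),   # R2
    (False, True,  0.00),   # R3
    (False, False, 0.00),   # R4
]

@pytest.mark.parametrize("is_member, big_order, expected", DECISION_RULES)
def test_all_decision_rules(is_member, big_order, expected):
    total = 150 if big_order else 50
    assert discount(is_member, total) == pytest.approx(expected)
```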

State Transition Testing

  • Models system behavior through states and transitions
  • Transition syntax: event [guard condition] / action
  • Coverage types:
  1. All states coverage:
     Coverage = (no. of visited states / total no. of states) × 100%
  2. Valid transitions coverage (0-switch coverage):
     Coverage = (no. of exercised valid transitions / total no. of valid transitions) × 100%
  3. All transitions coverage (covers both valid/invalid transitions)
  • Achieving 100% all transitions coverage guarantees both 100% all states coverage and 100% valid transitions coverage (see the sketch below)
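
A minimal sketch, assuming a hypothetical document workflow with states draft, submitted and approved: the parametrized test exercises every valid transition (0-switch coverage), and the last test adds one invalid transition on the way toward all transitions coverage:

```python
import pytest

# Keys are (state, event) pairs; values are the resulting states.
VALID_TRANSITIONS = {
    ("draft", "submit"): "submitted",
    ("submitted", "withdraw"): "draft",
    ("submitted", "approve"): "approved",
}

class Document:
    def __init__(self):
        self.state = "draft"

    def handle(self, event: str) -> None:
        key = (self.state, event)
        if key not in VALID_TRANSITIONS:
            # Invalid transition: the event is not allowed in the current state.
            raise ValueError(f"{event!r} not allowed in state {self.state!r}")
        self.state = VALID_TRANSITIONS[key]

# Valid transitions coverage (0-switch): every valid transition exercised once.
@pytest.mark.parametrize(
    "start, event, expected",
    [(s, e, nxt) for (s, e), nxt in VALID_TRANSITIONS.items()],
)
def test_each_valid_transition(start, event, expected):
    doc = Document()
    doc.state = start            # place the document in the required start state
    doc.handle(event)
    assert doc.state == expected

# One invalid transition, as required when aiming for all transitions coverage.
def test_invalid_transition_is_rejected():
    doc = Document()             # state is "draft"
    with pytest.raises(ValueError):
        doc.handle("approve")    # "approve" is not valid from "draft"
```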

(2) White-Box Test Techniques

Statement Testing

  • Aims to exercise statements until acceptable coverage
  • Coverage formula:
    Coverage = (no. of statements exercised by the test cases / total no. of executable statements) × 100%
  • 100% coverage means every executable statement is exercised at least once
  • Does not guarantee the decision logic is fully tested (see the sketch below)
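
A minimal sketch with a hypothetical `apply_discount()` function: a single test case exercises all four executable statements, i.e. 100% statement coverage, even though the False outcome of the decision is never taken:

```python
def apply_discount(price: float, is_member: bool) -> float:
    """Hypothetical function used only to illustrate the coverage measures."""
    total = price              # statement 1
    if is_member:              # statement 2 (the decision)
        total = price - 10.0   # statement 3
    return total               # statement 4

# A single test with is_member=True exercises all four statements:
# statement coverage = 4 / 4 × 100% = 100%.
def test_member_gets_discount():
    assert apply_discount(100.0, True) == 90.0
```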

Branch Testing

  • Aims to exercise branches until acceptable coverage
  • Coverage formula:
    Coverage = (no. of branches exercised by the test cases / total no. of branches) × 100%
  • 100% coverage means every branch (conditional and unconditional) is exercised at least once
  • 100% branch coverage guarantees 100% statement coverage (but not vice versa); see the sketch below
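
Continuing the `apply_discount()` sketch above, and counting only the two outcomes of its `if` decision: the single statement-coverage test reaches just 50% branch coverage, and one more test closes the gap:

```python
# test_member_gets_discount() already yields 100% statement coverage, but only
# the True branch of the decision is exercised: branch coverage = 1 / 2 × 100% = 50%.
def test_non_member_pays_full_price():
    # Exercises the False branch; the two tests together give 100% branch
    # coverage, which in turn guarantees 100% statement coverage.
    assert apply_discount(100.0, False) == 100.0
```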

Value of White-box Testing

  • Applicable in static testing (e.g., dry runs of code that is not yet executable)
  • Measures actual code coverage (unlike black-box alone)

Collaboration-based Test Approaches

Collaborative User Story Writing

  • 3 C's:
  • Card: Medium describing user story
  • Conversation: Explanation of software usage
  • Confirmation: Acceptance criteria
  • User story format:
    "As a [role], I want [goal], so that [business value]"
  • INVEST criteria:
  • Independent, Negotiable, Valuable
  • Estimable, Small, Testable

Acceptance Criteria

  • Conditions for user story acceptance
  • Purposes:
  • Define story scope
  • Reach stakeholder consensus
  • Describe positive/negative scenarios
  • Basis for acceptance testing
  • Enable planning/estimation
  • Formats:
  • Scenario-oriented (Given/When/Then)
  • Rule-oriented (bullet points, input-output tables)
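
As a sketch (hypothetical "reset password" story, not from the syllabus), the same acceptance criterion expressed in the scenario-oriented Given/When/Then format as an automated test, with a rule-oriented version in the trailing comment:

```python
def reset_password(account: dict, new_password: str) -> dict:
    """Assumed behaviour of the system under test."""
    return {**account, "password": new_password, "must_change": False}

def test_registered_user_can_reset_password():
    # Given a registered user with a temporary password
    account = {"email": "user@example.com", "password": "temp-123", "must_change": True}
    # When the user resets the password
    account = reset_password(account, "new-secret")
    # Then the new password is stored and no further change is required
    assert account["password"] == "new-secret"
    assert account["must_change"] is False

# Rule-oriented form of the same criterion (bullet points):
#   - the new password replaces the old one
#   - the "must change password" flag is cleared
```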

Acceptance Test-driven Development (ATDD)

  • Test-first approach
  • Test cases are created before implementation by team members with different perspectives (e.g., customers, developers, testers)
  • Execution: Manual or automated
  • Process:
  1. Write positive test cases first (the expected behavior)
  2. Follow with negative testing
  3. Finish by covering non-functional quality characteristics as well (e.g., performance, usability); see the sketch below
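
A minimal ATDD sketch, assuming a hypothetical `withdraw()` function: the acceptance tests are written first from the agreed criteria, positive cases before negative ones, and the implementation is then written to make them pass:

```python
import pytest

def withdraw(balance: float, amount: float) -> float:
    """Implemented only after the acceptance tests below were agreed and written."""
    if amount <= 0 or amount > balance:
        raise ValueError("invalid withdrawal amount")
    return balance - amount

# Step 1: positive test case first (the expected behaviour).
def test_withdrawal_reduces_balance():
    assert withdraw(100.0, 40.0) == 60.0

# Step 2: negative testing afterwards (amounts that must be rejected).
@pytest.mark.parametrize("amount", [0.0, -10.0, 200.0])
def test_invalid_amounts_are_rejected(amount):
    with pytest.raises(ValueError):
        withdraw(100.0, amount)
```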

(3) Experience-based Test Techniques

Error Guessing

  • Anticipates errors based on tester knowledge:
  • Application history
  • Developer error patterns
  • Failures in similar applications

Fault Attacks

  • Methodical implementation of error guessing
  • Create/acquire lists of possible errors/defects
  • Design tests targeting specific defects
  • Sources: Experience, defect data, common knowledge
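
A minimal sketch of a fault attack, assuming a hypothetical `sanitize_username()` function: a predefined list of suspect inputs, each targeting one anticipated defect, is run as a parametrized test:

```python
import pytest

# Hypothetical fault-attack list for a text input field, drawn from common
# defect knowledge: each entry targets one specific anticipated defect.
FAULT_ATTACKS = [
    "",                            # empty input
    " " * 10,                      # whitespace only
    "a" * 10_000,                  # very long input
    "<script>alert(1)</script>",   # markup/script injection
    "Ω≈ç√∫",                       # non-ASCII characters
]

def sanitize_username(raw: str) -> str:
    """Assumed system under test: trims, length-limits, and strips angle brackets."""
    cleaned = raw.strip().replace("<", "").replace(">", "")
    if not cleaned:
        raise ValueError("username must not be empty")
    return cleaned[:64]

@pytest.mark.parametrize("attack", FAULT_ATTACKS)
def test_fault_attack_inputs_never_crash(attack):
    # Each attack must yield either a sanitized value or a controlled ValueError.
    try:
        result = sanitize_username(attack)
        assert 0 < len(result) <= 64
        assert "<" not in result and ">" not in result
    except ValueError:
        pass
```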

Exploratory Testing

  • Best used when:
  • Specifications are inadequate/absent
  • Under significant time pressure
  • Complementing formal techniques
  • Session-based approach:
  • Time-boxed execution
  • Guided by test charter (objectives)
  • Focus: Learning, deep exploration, untested areas

Checklist-based Testing

  • Tests cover conditions from predefined checklists
  • Checklist sources:
  • Tester experience
  • Knowledge of what is important for the user
  • An understanding of why and how software fails
  • Avoid in checklists:
  • Automatically verifiable items
  • Entry/exit criteria items
  • Overly general items
  • Characteristics:
  • Supports functional/non-functional testing
  • Provides guidelines without detailed test cases
  • High-level checklists: Greater coverage but less repeatability

