ISO 13485 clause 7 and ISO 13485 clause 7.3: evidence, scope boundaries, and how auditors sample
If you’re implementing ISO 13485 clause 7, you’re building the “how we realize product” engine. If you’re implementing ISO 13485 clause 7.3, you’re building the “how we control design decisions and prove they’re right” engine. Auditors don’t treat these as separate silos—they sample across them to confirm your design work turns into controlled, repeatable production outputs (and that changes don’t bypass controls).
This article is a practical, evidence-first guide to how Clause 7 (Product Realization) relates to Clause 7.3 (Design & Development), with clear boundaries: what belongs inside 7.3, what belongs elsewhere, and how auditors typically test traceability. For clause navigation, keep these pages open: ISO 13485 Clause 7 – Product Realization and ISO 13485 Clause 7.3 – Design & Development. If you want the full clause map, start at the ISO 13485 Clauses 4–8 Clause Hub.
How auditors think about Clause 7 vs 7.3 (the sampling reality)
Auditors usually sample one product family and one meaningful change (a design change, supplier change, labeling change, software update, or manufacturing process change). Then they “trace the chain”:
- 7.3 chain (design evidence): plan → inputs → outputs → review → verification → validation → transfer → controlled changes → DHF integrity
- Clause 7 chain (realization evidence beyond 7.3): production planning, purchasing controls, production/service controls, monitoring/measurement resources, process validation where applicable, traceability/identification where required
The fastest way to fail is to blur the boundaries: using purchasing or production records as a substitute for design evidence (or vice versa). The fastest way to pass is to keep your DHF “spine” tight and show how design outputs become released production controls without losing traceability.
Scope boundary: what is inside 7.3 vs outside (Clause 7 but not 7.3)
This is the practical boundary auditors expect you to understand. It’s not about writing a full QMS—it’s about putting the right evidence in the right place, so sampling is clean.
Inside 7.3 (Design & Development Controls)
7.3 evidence is about design decisions and proof. These are the items auditors expect to find inside the DHF (or an equivalent design file):
- Design planning: phases, deliverables, responsibilities, interfaces, and review gates
- Design inputs: measurable requirements with sources and acceptance criteria
- Design outputs: specifications, drawings, software requirements, labeling drafts, and test methods that realize inputs
- Design reviews: documented decisions, attendees/roles, actions, and closure evidence
- Verification: objective evidence that outputs meet inputs (requirement-to-test traceability)
- Validation: objective evidence that the device meets intended use/user needs in representative conditions
- Design transfer: controlled hand-off to production/service with a release package
- Design changes: controlled change records, impact assessment, and retest/revalidation decisions
- DHF integrity: index, version control, retrieval, completeness checks
Outside 7.3 (Clause 7 areas auditors won’t accept as “design evidence”)
These are still Clause 7 responsibilities, but they don’t replace 7.3 controls. They typically live in production/purchasing/operations evidence sets:
- Purchasing controls (supplier evaluation, purchasing data, incoming acceptance)
- Production/service provision controls (work instructions, process controls, acceptance activities)
- Process validation for special processes where output can’t be fully verified later
- Identification/traceability controls (where required by your device/regulatory context)
- Monitoring and measurement resources (equipment control, calibration, measurement capability)
- Production release and distribution controls
Why the boundary matters: auditors won’t accept “we tested it in production” as a substitute for design verification. Likewise, they won’t accept “the drawing exists in DHF” as proof the shop floor is controlled—those controls must be released and governed where production evidence lives.
Minimum evidence set for 7.3 (DHF essentials)
If you only build one thing, build this: a DHF that supports auditor sampling. Here’s the minimum evidence set most teams need for a defensible 7.3 implementation.
- DHF Index / Design File Register (record IDs, versions, owners, links/paths)
- Design & Development Plan (phases, deliverables, interfaces, gates, update triggers)
- Design Inputs List (measurable requirements with acceptance criteria + sources)
- Design Outputs Set (specifications/drawings/software requirements/labeling + released test methods)
- DI→DO Traceability (every input implemented by at least one output)
- Design Review Records (decision + rationale + actions + closure evidence)
- Verification Plan & Matrix (DI→test/inspection/analysis method→report)
- Verification Reports (results, deviations, conclusions, approvals)
- Validation Plan & Evidence (intended use/user profile; representative conditions; outcomes)
- Design Transfer Record (release package checklist; readiness; sign-off)
- Design Change Records (impact assessment; retest/revalidation decisions; release)
Practical tip: If you cannot retrieve these within 10 minutes, your “system” is not auditor-ready even if the work was done. Fix retrieval by tightening your DHF index and standardizing naming + IDs.
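The DHF index and the 10-minute retrieval test above can be made mechanical. Below is a minimal sketch in Python, assuming a simple in-memory register; the record types, IDs, field names, and paths are illustrative assumptions, not taken from the standard.

```python
# Illustrative DHF index register: each record carries an ID, an evidence
# type, a version, an owner, and a retrieval path (all fields hypothetical).
REQUIRED_TYPES = {
    "plan", "inputs", "outputs", "traceability", "review",
    "verification", "validation", "transfer", "change",
}

dhf_index = [
    {"id": "DP-01", "type": "plan", "version": "B",
     "owner": "Eng", "path": "dhf/plan/DP-01_B.docx"},
    {"id": "DI-LIST", "type": "inputs", "version": "C",
     "owner": "Eng", "path": "dhf/inputs/DI-LIST_C.xlsx"},
    # ...remaining records would follow the same shape
]

def completeness_gaps(index):
    """Return the required evidence types with no record in the index."""
    present = {rec["type"] for rec in index}
    return sorted(REQUIRED_TYPES - present)
```

Running `completeness_gaps(dhf_index)` on the two-record register above flags every missing evidence type, which is exactly the pre-audit question an index should answer before an auditor asks it.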
Traceability walkthrough example (input → output → review → verification → validation → transfer → change)
Below is a realistic walkthrough that shows how Clause 7.3 evidence connects to Clause 7 evidence without blurring boundaries.
Fictional device scenario
Device: “PulseSense Home” — a handheld pulse oximeter intended for home monitoring.
Change request: CR-008 — replace the display module due to supplier end-of-life (slightly different brightness and power consumption).
1) Input (7.3 design input)
DI-014: Display must remain readable under typical indoor lighting and meet minimum brightness threshold at specified battery condition.
Source: user need + usability considerations + risk-driven visibility requirement.
2) Output (7.3 design output)
DO-022: Updated display specification and interface requirements; updated BOM; updated power budget; updated labeling note if necessary.
Where it lives: DHF as controlled outputs (spec + BOM revision evidence).
3) Review (7.3 design review)
Design Review DR-02 (Change Gate): Engineering + QA/RA + Production representation.
Decision: approve display replacement with defined re-verification scope and targeted validation impact check.
Recorded actions:
- Execute verification test case TV-014 against DI-014 brightness threshold and response time
- Assess labeling impact and update IFU draft if needed
- Confirm transfer package updates for production (released BOM + inspection criteria)
4) Verification (7.3 verification)
VER Report VR-014: bench testing confirms brightness threshold at defined battery condition and confirms display response time meets acceptance criteria. Deviations (if any) are documented with disposition.
5) Validation (7.3 validation)
VAL Summary VS-02: targeted simulated-use check with representative users confirms readability in home-use context; if a usability issue emerges (e.g., glare angle), the team updates design output and repeats the targeted check. Validation is purpose-based: it confirms intended-use performance, not just lab conformance.
6) Transfer (7.3 transfer + Clause 7 production control linkage)
Transfer Record TR-01: release package confirms that the updated BOM, work instruction references, and incoming inspection criteria are released to production controls.
Key boundary: The DHF proves the design decision and evidence. The shop-floor control of the updated BOM/work instruction lives in your production document control and production control system (Clause 7 outside 7.3).
7) Change (7.3 change control + Clause 7 purchasing/production impacts)
Design Change DC-008: impact assessment includes: requirements impacted (DI-014), risks impacted (readability-related hazard scenario), verification scope (TV-014), validation impact (targeted simulated-use), purchasing impact (supplier qualification and purchasing data updates), and production impact (incoming inspection updates).
Boundary clarity: supplier evaluation/incoming acceptance evidence is purchasing/production evidence (Clause 7), referenced by the change record but not treated as a substitute for design V&V.
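The sampled chain above can be modelled as linked records so that breaks are detectable by inspection rather than memory. A minimal sketch in Python follows, reusing the fictional PulseSense IDs; the field names and link structure are assumptions for illustration, not a prescribed DHF schema.

```python
# Each walkthrough record is keyed by its ID and points at the records
# it links to (implemented_by, verifies, releases, etc. are hypothetical
# field names, not ISO 13485 terms).
records = {
    "DI-014": {"kind": "input", "implemented_by": ["DO-022"]},
    "DO-022": {"kind": "output", "reviewed_in": ["DR-02"]},
    "DR-02":  {"kind": "review", "actions_closed": True},
    "VR-014": {"kind": "verification", "verifies": ["DI-014"]},
    "VS-02":  {"kind": "validation", "covers_intended_use": True},
    "TR-01":  {"kind": "transfer", "releases": ["DO-022"]},
    "DC-008": {"kind": "change", "impacts": ["DI-014"], "retest": ["VR-014"]},
}

def verification_coverage(recs):
    """Map each design input ID to the verification records that cite it.

    An empty list for any input is a broken chain: a requirement with
    no requirement-linked verification evidence.
    """
    inputs = [rid for rid, r in recs.items() if r["kind"] == "input"]
    return {
        di: [rid for rid, r in recs.items()
             if r["kind"] == "verification" and di in r.get("verifies", [])]
        for di in inputs
    }
```

Here `verification_coverage(records)` maps DI-014 to VR-014; if a change record dropped that link, the empty list would surface the gap before an auditor does.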
If your traceability chain looks like this on one sample change, auditors gain confidence fast. If it breaks—missing IDs, missing decisions, tests not linked to requirements, unclear validation rationale—auditors typically expand sampling.
Common confusion patterns (Clause 7 vs 7.3) and how to fix them
1) “We verified it in production” (production testing used as design verification)
Why it fails: production tests are typically acceptance activities, not requirement-driven design verification with controlled protocols and traceability to design inputs.
Fix: create a verification matrix keyed to design input IDs; ensure test methods and reports are controlled and referenced in the DHF.
2) “Validation is just the last verification test”
Why it fails: verification proves conformance to requirements; validation confirms intended use/user needs under representative conditions.
Fix: define intended use + user profile in the validation plan; choose validation methods that represent actual use (simulated-use/usability/performance validation as appropriate).
3) “Design outputs are the CAD files” (outputs not supporting production + verification)
Why it fails: outputs must enable consistent production and objective verification; CAD alone often doesn’t define acceptance methods, labeling, software requirements, or inspection criteria.
Fix: define outputs as a released set: specs/drawings, BOM, labeling, test methods, software requirements (as applicable).
4) “The DHF is a folder dump” (no index, no retrieval control)
Why it fails: auditors need fast retrieval and version clarity; scattered records suggest the process isn’t controlled.
Fix: implement a DHF index with record IDs, versions, owners, and links/paths; enforce a release readiness checklist.
5) “Change control is separate from design controls”
Why it fails: design changes must trigger impact assessment and retest/revalidation decisions tied to design evidence.
Fix: require design change records for any output revision affecting performance/safety/labeling/software/suppliers; include explicit retest triggers.
6) “Clause 7.3 includes everything in Clause 7”
Why it fails: 7.3 is design control; other realization controls (purchasing, production, calibration) are distinct evidence sets.
Fix: keep boundaries clear: reference production/purchasing evidence in change impact assessments, but keep the design proof inside the DHF.
If you want a ready-to-use DHF structure that enforces these boundaries (DOCX + XLSX):
- ISO 13485 Clause 7.3 Design & Development Execution Pack — templates + evidence index + traceability matrices built for auditor sampling.
- ISO 13485 Internal Audit Execution & Defence Pack — run pre-audits using sampling logic and scripted checks.
FAQ (ISO 13485 clause 7 and ISO 13485 clause 7.3)
What is ISO 13485 clause 7 in practical terms?
Clause 7 is the set of controls that govern how you plan, realize, and control product—from design inputs through production/service execution—supported by controlled purchasing, production, and measurement resources.
What is ISO 13485 clause 7.3 focused on?
Clause 7.3 is design & development control: planning, requirements, outputs, reviews, verification, validation, transfer, controlled changes, and DHF integrity.
What’s the minimum DHF evidence auditors expect for 7.3?
A DHF index plus design plan, approved inputs/outputs, review records, verification matrix + reports, validation plan + evidence, transfer record, and design change records with impact assessment.
How do auditors sample 7 and 7.3 together?
They pick a product or a significant change, trace the design evidence in the DHF, then confirm the released outputs are controlled in production/purchasing systems without uncontrolled drift.
Where do calibration and measurement resources fit?
Typically outside 7.3: measurement resources support verification/production acceptance and must be controlled and traceable. Auditors may cross-check that your verification evidence relies on controlled measurement resources.
What’s the most common confusion that triggers nonconformities?
Treating production testing as design verification, or treating late-stage verification as “validation” without intended-use justification. Fix it with purpose-based V&V and requirement-linked traceability.
Want “what good looks like” as filled, audit-ready examples?
Use the Filled Examples Library — Design & Development Audit (Clause 7.3) + CAPA Closure for a realistic DHF-style sample with findings and closure records you can model. If you need an end-to-end design controls operating system beyond 7.3, see Design Controls Under ISO 13485 — Full Execution System (DOCX + XLSX).