ISO 42001 Requirements: Clauses 4-10 Explained
ISO 42001 follows the ISO High-Level Structure (HLS), making it compatible with other management system standards like ISO 27001. This guide explains the mandatory requirements in Clauses 4-10.
Key Takeaways
| Point | Summary |
|---|---|
| Structure | 7 core clauses (4-10) following ISO High-Level Structure |
| Clause 4 | Context - Understand your organization and define AIMS scope |
| Clause 5 | Leadership - Management commitment and AI policy |
| Clause 6 | Planning - Risk assessment and AI objectives |
| Clause 7 | Support - Resources, competence, documentation |
| Clause 8 | Operation - AI risk assessment, impact assessment, life cycle |
| Clause 9 | Performance evaluation - Monitoring, audit, management review |
| Clause 10 | Improvement - Continual improvement, nonconformity and corrective action |
Quick Answer: ISO 42001 has seven mandatory clauses (4-10) that every certified organization must address. These cover context, leadership, planning, support, operations, performance evaluation, and improvement. The clauses integrate AI-specific requirements with the standard management system framework.
Clause Structure Overview
ISO 42001 Clause Structure
────────────────────────────────────────────────────
Clause 4: Context of the organization
├── 4.1 Understanding the organization and its context
├── 4.2 Understanding needs and expectations of interested parties
├── 4.3 Determining scope of the AIMS
└── 4.4 AI management system
Clause 5: Leadership
├── 5.1 Leadership and commitment
├── 5.2 AI policy
└── 5.3 Organizational roles, responsibilities, authorities
Clause 6: Planning
├── 6.1 Actions to address risks and opportunities
└── 6.2 AI objectives and planning to achieve them
Clause 7: Support
├── 7.1 Resources
├── 7.2 Competence
├── 7.3 Awareness
├── 7.4 Communication
└── 7.5 Documented information
Clause 8: Operation
├── 8.1 Operational planning and control
├── 8.2 AI risk assessment
├── 8.3 AI risk treatment
└── 8.4 AI system impact assessment
Clause 9: Performance evaluation
├── 9.1 Monitoring, measurement, analysis, evaluation
├── 9.2 Internal audit
└── 9.3 Management review
Clause 10: Improvement
├── 10.1 Continual improvement
└── 10.2 Nonconformity and corrective action
Clause 4: Context of the Organization
Understanding your organization's context is the foundation of your AIMS.
4.1 Understanding the Organization and Its Context
Requirement: Determine external and internal issues relevant to the organization's purpose and affecting AIMS outcomes.
| Issue Type | Examples |
|---|---|
| External issues | AI regulations (EU AI Act), market expectations, technology trends, competitor landscape |
| Internal issues | Organizational culture, AI capabilities, risk appetite, existing management systems |
AI-specific considerations:
- Regulatory environment for AI (current and emerging)
- Customer expectations for AI governance
- Industry standards and best practices
- Ethical and societal expectations for AI
- Technology evolution affecting AI systems
4.2 Interested Parties
Requirement: Determine interested parties relevant to the AIMS and their requirements.
| Interested Party | Typical Requirements |
|---|---|
| Customers | Trustworthy AI, transparency, compliance |
| Regulators | Legal compliance, documentation, reporting |
| Employees | Training, clear roles, ethical guidelines |
| AI subjects | Fairness, privacy, recourse |
| Shareholders/investors | Risk management, value protection |
| Society | Responsible AI, environmental consideration |
4.3 Determining the Scope
Requirement: Determine boundaries and applicability of the AIMS to establish its scope.
Scope considerations:
- Which AI systems are included
- Which organizational units are covered
- Which life cycle stages are addressed
- Physical and logical boundaries
- Interfaces and dependencies
Example scope statement:
"The AI Management System covers the development, deployment, and operation of AI-powered analytics services provided by [Company Name]. This includes all AI systems developed by the Engineering and Data Science teams, cloud-based infrastructure, and customer-facing AI features. The scope covers the entire AI system life cycle from design through operation."
4.4 AI Management System
Requirement: Establish, implement, maintain, and continually improve an AIMS in accordance with the requirements of this document.
This clause requires you to:
- Determine necessary processes
- Determine process interactions
- Implement and maintain the AIMS
- Continually improve the AIMS
Clause 5: Leadership
Management commitment is essential for AIMS success.
5.1 Leadership and Commitment
Requirement: Top management shall demonstrate leadership and commitment to the AIMS.
Top management responsibilities:
| Responsibility | Evidence |
|---|---|
| Ensure AI policy and objectives are established | Documented policy, measurable objectives |
| Ensure AIMS integration with business processes | AIMS embedded in operations |
| Ensure adequate resources | Budget, personnel, tools |
| Communicate importance of AI management | Communications, training |
| Ensure AIMS achieves intended outcomes | Performance monitoring |
| Direct and support personnel | Active involvement |
| Support other managers | Enable their leadership |
| Promote continual improvement | Improvement initiatives |
5.2 AI Policy
Requirement: Top management shall establish an AI policy that:
- Is appropriate to the organization's purpose
- Provides a framework for setting AI objectives
- Includes commitment to satisfy applicable requirements
- Includes commitment to continual improvement
AI policy content:
| Element | Description |
|---|---|
| Purpose statement | Why the organization manages AI responsibly |
| Scope | What the policy covers |
| Principles | Core values for AI management |
| Commitments | Specific commitments (compliance, ethics, improvement) |
| Objectives framework | How AI objectives are set |
| Responsibilities | Key roles and accountability |
| Communication | How policy is communicated |
| Review | When policy is reviewed |
5.3 Organizational Roles, Responsibilities and Authorities
Requirement: Top management shall ensure responsibilities and authorities are assigned and communicated.
Key roles to define:
| Role | Responsibilities |
|---|---|
| AIMS Owner | Overall AIMS effectiveness and reporting |
| AI System Owners | Specific AI system management |
| Risk Owners | AI risk management and treatment |
| Data Stewards | Data quality and governance |
| Control Owners | Specific control implementation |
Clause 6: Planning
Planning addresses risks, opportunities, and objectives.
6.1 Actions to Address Risks and Opportunities
Requirement: When planning, consider issues from 4.1 and requirements from 4.2 to determine risks and opportunities that need to be addressed.
6.1.1 General
Plan actions to:
- Ensure AIMS achieves intended outcomes
- Prevent or reduce undesired effects
- Achieve continual improvement
6.1.2 AI Risk Assessment
Define and apply an AI risk assessment process that:
- Establishes risk criteria
- Ensures repeatable and consistent results
- Identifies AI-related risks
- Analyzes and evaluates risks
Risk assessment scope:
| Risk Category | Examples |
|---|---|
| Technical risks | Model failure, security vulnerabilities, data quality |
| Ethical risks | Bias, fairness, privacy, transparency |
| Organizational risks | Compliance, reputation, liability |
| Societal risks | Discrimination, environmental impact |
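To keep results repeatable and consistent, risks are typically scored against fixed criteria. A minimal Python sketch of a likelihood × severity scheme; the scales, categories, and thresholds are illustrative assumptions, since ISO 42001 requires defined risk criteria but does not prescribe a particular scale.

```python
from dataclasses import dataclass

# Illustrative 1-5 scales; the standard requires defined criteria but prescribes no scale.
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost_certain": 5}
SEVERITY = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "severe": 5}

@dataclass
class AiRisk:
    description: str
    category: str    # e.g. "technical", "ethical", "organizational", "societal"
    likelihood: str
    severity: str

    def score(self) -> int:
        """Likelihood x severity, so repeated assessments of the same risk agree."""
        return LIKELIHOOD[self.likelihood] * SEVERITY[self.severity]

    def rating(self) -> str:
        """Map the score to a rating band; thresholds are illustrative."""
        s = self.score()
        return "high" if s >= 15 else "medium" if s >= 8 else "low"

risk = AiRisk("Training data under-represents a protected group", "ethical", "likely", "major")
print(risk.score(), risk.rating())  # 16 high
```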
6.1.3 AI Risk Treatment
Determine the appropriate treatment for each assessed risk (a treatment plan entry is sketched after the table):
| Option | When to Apply |
|---|---|
| Modify | Reduce risk through controls |
| Accept | Risk within tolerance |
| Avoid | Eliminate risk source |
| Share | Transfer risk (insurance, contracts) |
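A risk treatment plan usually records, per risk, the chosen option, the action, an owner, and a due date. A minimal Python sketch of one plan entry; the field names and example content are assumptions for illustration.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class TreatmentPlanEntry:
    """One line of an AI risk treatment plan; fields are illustrative."""
    risk_id: str
    option: str        # "modify", "accept", "avoid", or "share"
    action: str        # what will be done
    owner: str         # who is responsible
    due: date          # completion timeframe
    evaluation: str    # how effectiveness will be checked

entry = TreatmentPlanEntry(
    risk_id="RISK-042",
    option="modify",
    action="Add pre-release bias testing to the model release checklist",
    owner="AI System Owner, analytics platform",
    due=date(2026, 6, 30),
    evaluation="Bias testing pass rate reviewed at management review",
)
```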
6.1.4 AI System Impact Assessment
Define a process for assessing the potential impacts of AI systems on individuals, groups of individuals, and society. The assessments themselves are carried out under Clause 8.4.
6.2 AI Objectives and Planning
Requirement: Establish AI objectives at relevant functions, levels, and processes, and plan how to achieve them.
When planning how to achieve AI objectives, determine:
- What will be done
- Resources required
- Who is responsible
- Completion timeframe
- How results will be evaluated
Objective characteristics:
| Characteristic | Description |
|---|---|
| Consistent | Aligned with AI policy |
| Measurable | Quantifiable where practical |
| Consider requirements | Address applicable requirements |
| Monitored | Tracked and reviewed |
| Communicated | Known to relevant parties |
| Updated | Kept current |
Example AI objectives:
| Objective | Measure |
|---|---|
| Ensure AI systems meet fairness requirements | Bias testing pass rate |
| Maintain AI system availability | Uptime percentage |
| Respond to AI incidents effectively | Incident response time |
| Ensure AI competence | Training completion rate |
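Measures like the bias testing pass rate above are simple to compute from test records. A minimal Python sketch, assuming a hypothetical list of pass/fail outcomes for one reporting period.

```python
def pass_rate(results: list[bool]) -> float:
    """Fraction of bias tests that passed, used to track the fairness objective."""
    return sum(results) / len(results) if results else 0.0

# Hypothetical test outcomes for one reporting period.
bias_test_results = [True, True, False, True, True]
print(f"Bias testing pass rate: {pass_rate(bias_test_results):.0%}")  # 80%
```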
Clause 7: Support
Resources and enablers for the AIMS.
7.1 Resources
Requirement: Determine and provide resources needed to establish, implement, maintain, and continually improve the AIMS.
| Resource Type | Examples |
|---|---|
| Human resources | AI specialists, compliance staff, management |
| Infrastructure | Computing resources, development tools, testing environments |
| Financial | Budget for implementation, tools, training, audits |
| Information | Standards, guidelines, threat intelligence |
7.2 Competence
Requirement: Determine necessary competence for personnel affecting AIMS performance, ensure competence through appropriate education, training, or experience, and retain evidence.
Competence areas:
| Area | Required For |
|---|---|
| AI/ML technical skills | AI developers, data scientists |
| Risk management | Risk owners, AIMS owner |
| Data governance | Data stewards, AI system owners |
| Responsible AI | All personnel involved with AI |
| AIMS requirements | AIMS owner, auditors |
7.3 Awareness
Requirement: Persons doing work under the organization's control shall be aware of:
- The AI policy
- Their contribution to AIMS effectiveness
- Benefits of improved AI management
- Implications of not conforming to AIMS requirements
7.4 Communication
Requirement: Determine internal and external communications relevant to the AIMS.
| Aspect | Consideration |
|---|---|
| What | Subject matter of communications |
| When | Timing and frequency |
| With whom | Recipients/audiences |
| How | Methods and channels |
7.5 Documented Information
Requirement: The AIMS shall include documented information required by the standard and determined by the organization as necessary for AIMS effectiveness.
Required documentation:
| Document | Clause Reference |
|---|---|
| AIMS scope | 4.3 |
| AI policy | 5.2 |
| AI risk assessment process and results | 6.1.2 |
| AI risk treatment process and plan | 6.1.3 |
| AI objectives | 6.2 |
| Statement of Applicability | 6.1.3 (Annex A controls) |
| AI system impact assessment | 8.4 |
| Monitoring results | 9.1 |
| Internal audit program and results | 9.2 |
| Management review results | 9.3 |
| Nonconformities and corrective actions | 10.2 |
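Some teams track the mandatory records above in a lightweight checklist so gaps surface before an audit rather than during one. A minimal Python sketch; the clause-to-file mapping and paths are hypothetical.

```python
from pathlib import Path

# Mandatory documented information mapped to hypothetical evidence locations.
REQUIRED_DOCS = {
    "4.3 AIMS scope": "docs/aims-scope.md",
    "5.2 AI policy": "docs/ai-policy.md",
    "6.1.2 AI risk assessment": "docs/risk-register.xlsx",
    "6.2 AI objectives": "docs/ai-objectives.md",
    "8.4 AI system impact assessment": "docs/impact-assessments",
    "9.2 Internal audit results": "docs/audits",
    "9.3 Management review results": "docs/management-reviews",
    "10.2 Nonconformities and corrective actions": "docs/ncr-log.md",
}

def missing_documents(root: Path) -> list[str]:
    """Return the clauses whose expected evidence is not present under root."""
    return [clause for clause, rel in REQUIRED_DOCS.items() if not (root / rel).exists()]

for clause in missing_documents(Path(".")):
    print(f"Missing evidence for {clause}")
```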
Clause 8: Operation
Core operational requirements for AI management.
8.1 Operational Planning and Control
Requirement: Plan, implement, and control processes needed to meet requirements and implement risk treatment actions.
Control activities:
- Establishing criteria for processes
- Implementing control of processes according to criteria
- Keeping documented information to demonstrate processes have been carried out as planned
- Controlling planned changes and reviewing unintended changes
8.2 AI Risk Assessment
Requirement: Perform AI risk assessments at planned intervals or when significant changes are proposed or occur.
When to reassess (a simple trigger check is sketched after this list):
- New AI systems introduced
- Significant changes to existing systems
- New regulations or requirements
- After incidents
- At defined intervals (e.g., annually)
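These triggers reduce to a simple rule: reassess when any event-based trigger fires or when the planned interval has elapsed. A minimal Python sketch, with an assumed annual interval.

```python
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=365)  # assumed annual cycle; set per your AIMS

def reassessment_due(last_assessed: date,
                     new_system: bool = False,
                     significant_change: bool = False,
                     new_regulation: bool = False,
                     incident_occurred: bool = False,
                     today: date | None = None) -> bool:
    """True when any event-based trigger fires or the planned interval has elapsed."""
    today = today or date.today()
    event_trigger = new_system or significant_change or new_regulation or incident_occurred
    return event_trigger or (today - last_assessed) >= REVIEW_INTERVAL

print(reassessment_due(date(2025, 1, 15), incident_occurred=True))  # True
```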
8.3 AI Risk Treatment
Requirement: Implement the AI risk treatment plan and retain documented information on results.
8.4 AI System Impact Assessment
Requirement: Assess the potential impact of AI systems on individuals, groups of individuals, and society.
Impact assessment process:
| Step | Activities |
|---|---|
| 1. Identify scope | AI system, affected parties, use contexts |
| 2. Identify impacts | Positive and negative effects |
| 3. Assess impacts | Likelihood, severity, affected groups |
| 4. Determine actions | Mitigation measures |
| 5. Document | Assessment results and decisions |
| 6. Review | Periodic reassessment |
Impact categories:
| Category | Considerations |
|---|---|
| Individual impacts | Rights, privacy, safety, wellbeing |
| Group impacts | Discrimination, fairness, access |
| Societal impacts | Environment, democracy, economy |
| Beneficial impacts | Efficiency, accessibility, innovation |
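The six-step process and the impact categories above can be captured in a single assessment record per AI system. A minimal Python sketch; the structure and field names are illustrative, not a format the standard mandates.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Impact:
    description: str
    category: str          # "individual", "group", "societal", or "beneficial"
    affected_parties: str
    severity: str          # e.g. "low" / "medium" / "high"
    mitigation: str

@dataclass
class ImpactAssessment:
    ai_system: str
    use_context: str
    assessed_on: date
    impacts: list[Impact] = field(default_factory=list)
    next_review: date | None = None

assessment = ImpactAssessment(
    ai_system="Customer churn prediction model",
    use_context="Retention offers targeted at at-risk customers",
    assessed_on=date(2025, 3, 1),
    impacts=[
        Impact(
            description="Offers skewed away from one demographic group",
            category="group",
            affected_parties="Customers in under-represented segments",
            severity="high",
            mitigation="Fairness testing before each model release",
        ),
    ],
    next_review=date(2026, 3, 1),
)
```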
Clause 9: Performance Evaluation
Measuring and reviewing AIMS effectiveness.
9.1 Monitoring, Measurement, Analysis and Evaluation
Requirement: Determine what needs to be monitored and measured, methods, when to perform, and how to analyze results.
Monitoring areas:
| Area | Metrics |
|---|---|
| AIMS performance | Objective achievement, control effectiveness |
| AI system performance | Accuracy, fairness, availability |
| Risk management | Risk levels, treatment effectiveness |
| Incident management | Incident count, resolution time |
| Compliance | Regulatory conformance |
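Monitoring results are typically summarised per reporting period. A minimal Python sketch computing two metrics from the table above, incident resolution time and availability, using hypothetical records.

```python
from datetime import datetime, timedelta

# Hypothetical incident records: (opened, resolved)
incidents = [
    (datetime(2025, 1, 3, 9, 0), datetime(2025, 1, 3, 13, 30)),
    (datetime(2025, 2, 11, 14, 0), datetime(2025, 2, 12, 10, 0)),
]

def mean_resolution_hours(records: list[tuple[datetime, datetime]]) -> float:
    """Average time from incident open to resolution, in hours."""
    if not records:
        return 0.0
    total = sum((resolved - opened for opened, resolved in records), timedelta())
    return total.total_seconds() / 3600 / len(records)

def availability_pct(downtime_minutes: float, period_days: int = 30) -> float:
    """AI system availability over a reporting period."""
    return 100 * (1 - downtime_minutes / (period_days * 24 * 60))

print(f"Mean resolution time: {mean_resolution_hours(incidents):.1f} h")  # 12.2 h
print(f"Availability: {availability_pct(43.2):.2f}%")                     # 99.90%
```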
9.2 Internal Audit
Requirement: Conduct internal audits at planned intervals to verify AIMS conformity and effectiveness.
Internal audit requirements:
| Requirement | Details |
|---|---|
| Program | Planned audit schedule |
| Criteria | What to audit against |
| Scope | What areas to cover |
| Auditor selection | Objective and impartial auditors |
| Results | Report to management |
| Records | Retain audit evidence |
9.3 Management Review
Requirement: Top management shall review the AIMS at planned intervals.
Review inputs:
| Input | Content |
|---|---|
| Status of previous actions | Follow-up on prior decisions |
| Changes | Internal and external factors |
| Performance information | Nonconformities, monitoring, audits, objectives |
| Opportunities | Improvement suggestions |
Review outputs:
- Improvement opportunities
- Need for AIMS changes
- Resource needs
- Decisions on AI risk treatment
Clause 10: Improvement
Continual enhancement of the AIMS.
10.1 Continual Improvement
Requirement: Continually improve the suitability, adequacy, and effectiveness of the AIMS.
Improvement sources:
- Audit findings
- Management review decisions
- Monitoring results
- Incident analysis
- Stakeholder feedback
- Technology changes
- Best practice evolution
10.2 Nonconformity and Corrective Action
Requirement: When nonconformity occurs:
- React to the nonconformity
- Evaluate need for action to eliminate cause
- Implement corrective action
- Review effectiveness
- Make AIMS changes if necessary
Corrective action process:
Nonconformity and Corrective Action
────────────────────────────────────────────────────
1. Identify Nonconformity
└── What happened, where, impact
2. Immediate Response
└── Control and contain the issue
3. Root Cause Analysis
└── Why did it happen?
4. Corrective Action
└── Eliminate the cause
5. Effectiveness Review
└── Did it work?
6. AIMS Update
└── Prevent recurrence
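The six steps map naturally onto a nonconformity record with an explicit status and an audit trail. A minimal Python sketch; the statuses and fields are illustrative assumptions.

```python
from dataclasses import dataclass, field
from enum import Enum

class NcrStatus(Enum):
    IDENTIFIED = "identified"                          # 1. what happened, where, impact
    CONTAINED = "contained"                            # 2. immediate response
    ROOT_CAUSE_FOUND = "root_cause_found"              # 3. root cause analysis
    ACTION_IMPLEMENTED = "action_implemented"          # 4. corrective action
    EFFECTIVENESS_REVIEWED = "effectiveness_reviewed"  # 5. did it work?
    CLOSED = "closed"                                  # 6. AIMS updated, recurrence prevented

@dataclass
class Nonconformity:
    ncr_id: str
    description: str
    status: NcrStatus = NcrStatus.IDENTIFIED
    root_cause: str = ""
    corrective_action: str = ""
    history: list[NcrStatus] = field(default_factory=list)

    def advance(self, new_status: NcrStatus) -> None:
        """Move the record to the next step, keeping a trail as audit evidence."""
        self.history.append(self.status)
        self.status = new_status

ncr = Nonconformity("NCR-007", "Impact assessment missing for a newly deployed model")
ncr.advance(NcrStatus.CONTAINED)
ncr.root_cause = "Impact assessment step absent from the release checklist"
ncr.advance(NcrStatus.ROOT_CAUSE_FOUND)
```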
Compliance Checklist
Mandatory Requirements
| Clause | Requirement | Evidence |
|---|---|---|
| 4.3 | AIMS scope documented | Scope document |
| 5.2 | AI policy established | Policy document |
| 6.1.2 | Risk assessment conducted | Risk register |
| 6.2 | AI objectives established | Objectives document |
| 7.5 | Documentation maintained | AIMS documentation |
| 8.4 | Impact assessment conducted | Assessment records |
| 9.2 | Internal audit conducted | Audit report |
| 9.3 | Management review conducted | Review minutes |
| 10.2 | Corrective action process | NCR records |
Need help implementing ISO 42001 requirements? Talk to our team
