
EU AI Act and ISO 27001 / SOC 2 Alignment

Organizations with existing ISO 27001 or SOC 2 certifications have a significant advantage when approaching EU AI Act compliance. Both frameworks rest on risk management, documentation, and governance principles that provide a foundation for the AI Act's AI-specific requirements.

Key Takeaways

Point | Summary
Shared foundations | Risk management, documentation, and governance overlap significantly
Not sufficient alone | Existing certifications help but do not guarantee AI Act compliance
AI-specific gaps | Conformity assessment, AI transparency, and technical safety testing are new
Efficient integration | Extending existing frameworks is more efficient than building parallel systems
Complementary value | The combination provides stronger overall governance

Quick Answer: ISO 27001 and SOC 2 provide valuable foundations for EU AI Act compliance, particularly around risk management, documentation, and governance. However, they do not cover AI-specific requirements like conformity assessment, AI transparency obligations, or technical safety testing. Organizations should extend existing frameworks rather than create separate AI compliance programs.

How Existing Frameworks Help

ISO 27001 Contribution

ISO 27001 establishes an Information Security Management System (ISMS) with practices that support AI Act compliance:

ISO 27001 Element | AI Act Benefit
Risk assessment process | Framework for AI risk management system
Risk treatment methodology | Approach for AI risk mitigation
Documentation requirements | Culture and templates for AI documentation
Internal audit program | Process applicable to AI compliance verification
Management review | Governance structure for AI oversight
Continual improvement | Mechanism for AI compliance enhancement
Asset inventory | Foundation for AI system inventory
Supplier relationships | Controls for AI vendor management
Incident management | Process adaptable for AI incidents
Competence requirements | Framework for AI literacy requirements

SOC 2 Contribution

SOC 2 Trust Services Criteria provide controls relevant to AI Act compliance:

SOC 2 Criteria | AI Act Benefit
CC1: Control Environment | Foundation for AI governance
CC2: Communication and Information | Supports transparency requirements
CC3: Risk Assessment | Approach for AI risk identification
CC4: Monitoring Activities | Basis for AI post-market monitoring
CC5: Control Activities | Framework for AI controls
CC6: Logical and Physical Access Controls | Security for AI systems
CC7: System Operations | Operational controls for AI
CC8: Change Management | Process for AI system changes
CC9: Risk Mitigation | Approach for AI risk response
Processing Integrity | Supports accuracy requirements
Availability | Supports robustness requirements

Gap Analysis: What Existing Frameworks Do Not Cover

While ISO 27001 and SOC 2 provide foundations, AI Act compliance requires additional elements:

AI-Specific Technical Requirements

AI Act Requirement | Gap Explanation
AI risk classification | Frameworks do not classify systems by AI-specific risk categories
Conformity assessment | No equivalent pre-market assessment procedure
CE marking | Not applicable to information security frameworks
Technical documentation for AI | AI-specific documentation requirements go beyond general IT documentation
Data governance for training | Focus on training data quality and representativeness is AI-specific
Accuracy testing | AI model accuracy testing is not covered by security frameworks
Bias assessment | Frameworks do not require bias or discrimination testing

AI-Specific Transparency

AI Act Requirement | Gap Explanation
AI disclosure | No requirement to disclose AI system use to individuals
Synthetic content labeling | Frameworks do not address AI-generated content
Emotion recognition disclosure | Not covered by security frameworks
High-risk AI user information | Specific information requirements for users of high-risk AI systems

AI-Specific Governance

AI Act Requirement | Gap Explanation
Human oversight design | Technical requirements for human control over AI
EU database registration | No equivalent registration requirement
Post-market monitoring | AI-specific ongoing monitoring obligations
Serious incident reporting | Different from general security incident reporting

Integration Approach

Rather than building separate compliance programs, extend existing frameworks:

Extended Risk Management

Current Practice | AI Act Extension
Information security risk assessment | Add AI-specific risk factors and impact criteria
Risk register | Include AI systems with AI-specific risk attributes
Risk treatment plans | Add AI-specific mitigation measures
Risk acceptance | Define AI risk acceptance criteria and thresholds

Practical implementation:

  • Add AI risk classification as a risk attribute
  • Include AI-specific threat scenarios (bias, accuracy degradation, adversarial attacks)
  • Define AI impact categories (fundamental rights, safety, discrimination)
  • Establish AI risk thresholds aligned with AI Act classifications
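
As a minimal sketch, the snippet below shows one way an existing risk register entry could be extended with AI-specific attributes. The class names, fields, and enum values are illustrative assumptions, not terminology prescribed by the AI Act or ISO 27001.

```python
from dataclasses import dataclass, field
from enum import Enum


class AIRiskClass(Enum):
    """Hypothetical labels for AI Act risk categories."""
    PROHIBITED = "prohibited"
    HIGH = "high"
    LIMITED = "limited"   # transparency obligations only
    MINIMAL = "minimal"


@dataclass
class RiskRegisterEntry:
    """Simplified existing ISMS risk register entry with AI extensions."""
    risk_id: str
    description: str
    likelihood: int       # e.g. 1-5 scale from the current methodology
    impact: int           # e.g. 1-5 scale
    treatment: str
    # AI-specific extensions (illustrative attribute names)
    ai_system_id: str | None = None
    ai_risk_class: AIRiskClass | None = None
    ai_threat_scenarios: list[str] = field(default_factory=list)
    ai_impact_categories: list[str] = field(default_factory=list)

    def requires_ai_act_controls(self) -> bool:
        """High-risk or prohibited classification triggers AI Act workstreams."""
        return self.ai_risk_class in (AIRiskClass.HIGH, AIRiskClass.PROHIBITED)


entry = RiskRegisterEntry(
    risk_id="R-2024-017",
    description="CV screening model produces biased shortlists",
    likelihood=3,
    impact=4,
    treatment="Mitigate",
    ai_system_id="AI-003",
    ai_risk_class=AIRiskClass.HIGH,
    ai_threat_scenarios=["bias", "accuracy degradation", "adversarial input"],
    ai_impact_categories=["fundamental rights", "discrimination"],
)
print(entry.requires_ai_act_controls())  # True
```

Keeping AI risks in the same register means they flow through the existing treatment and acceptance workflow rather than a parallel process.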

Extended Documentation

Current Practice | AI Act Extension
Policy documentation | Add AI-specific policies (AI ethics, transparency)
Procedure documentation | Add AI development and deployment procedures
Technical documentation | Add AI-specific elements (training data, model architecture)
Records retention | Include AI-specific records (logs, assessments)

Practical implementation:

  • Create AI policy annex or standalone AI policy
  • Add AI system documentation template aligned with Annex IV requirements
  • Establish AI documentation lifecycle management
  • Define retention periods for AI-specific records
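
For illustration, an AI documentation record can be sketched as a structured template whose sections loosely mirror the Annex IV headings. The keys and values below are placeholders and reflect one assumed way of organizing the material, not an official template.

```python
# Illustrative skeleton only: section keys loosely follow the Annex IV
# technical documentation headings, values are placeholders.
ai_technical_documentation = {
    "system_identification": {
        "name": "CV screening assistant",
        "version": "2.1.0",
        "intended_purpose": "Rank job applications for human review",
    },
    "development_and_training": {
        "model_architecture": "Gradient-boosted decision trees",
        "training_data_sources": ["internal HR records 2019-2023"],
        "data_governance_measures": ["representativeness review", "PII minimisation"],
    },
    "performance_and_testing": {
        "accuracy_metrics": {"AUC": 0.87},
        "bias_assessment": "disparate impact ratio per protected attribute",
        "robustness_testing": "adversarial and out-of-distribution test suites",
    },
    "risk_management_and_oversight": {
        "risk_register_refs": ["R-2024-017"],
        "human_oversight_measures": ["reviewer approval required before rejection"],
    },
    "records": {
        "log_retention": "per records retention schedule",
        "last_review": "2025-01-15",
    },
}

# Simple completeness check against the expected sections.
missing = [s for s in (
    "system_identification", "development_and_training",
    "performance_and_testing", "risk_management_and_oversight", "records",
) if s not in ai_technical_documentation]
print("Missing sections:", missing or "none")
```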

Extended Governance

Current Practice | AI Act Extension
Management review | Include AI compliance status
Internal audit | Add AI compliance audit scope
Roles and responsibilities | Define AI-specific responsibilities
Competence management | Add AI literacy requirements

Practical implementation:

  • Add AI to management review agenda with specific AI metrics
  • Develop AI compliance audit checklist
  • Define AI governance roles (AI compliance lead, system owners)
  • Implement AI literacy training program
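
One compact way to make the audit and management review extensions concrete is a checklist whose results feed a summary metric into the review agenda. The items and helper below are hypothetical examples, not a prescribed audit scope.

```python
# Example-only checklist items; the real audit scope should come from the
# organisation's AI Act gap analysis.
ai_audit_checklist = [
    {"item": "AI system inventory is complete and risk-classified", "area": "classification"},
    {"item": "High-risk systems have current technical documentation", "area": "documentation"},
    {"item": "Human oversight measures are defined and tested", "area": "oversight"},
    {"item": "Post-market monitoring data is reviewed on schedule", "area": "monitoring"},
    {"item": "Staff in AI-facing roles completed AI literacy training", "area": "literacy"},
]


def management_review_summary(results: dict[str, bool]) -> str:
    """Condense audit results into a single metric for the review agenda."""
    passed = sum(results.values())
    return f"{passed}/{len(results)} AI audit items satisfied"


results = {c["item"]: True for c in ai_audit_checklist}
results["Post-market monitoring data is reviewed on schedule"] = False  # example finding
print(management_review_summary(results))  # 4/5 AI audit items satisfied
```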

Extended Vendor Management

Current Practice | AI Act Extension
Vendor assessment | Add AI-specific assessment criteria
Contractual requirements | Add AI Act contractual clauses
Ongoing monitoring | Monitor AI vendor compliance
Incident coordination | Establish AI incident coordination with vendors

Practical implementation:

  • Add AI provider assessment questionnaire
  • Include AI Act compliance warranties in contracts
  • Request conformity assessment evidence from high-risk AI providers
  • Define AI incident notification and response protocols
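
As a rough sketch, the AI-specific additions to a vendor assessment could be captured as a short question set with a simple gap count. The question wording below is an assumption for illustration, not standard contract language.

```python
# Hypothetical question set; actual contractual language and evidence
# requirements should come from legal review.
ai_vendor_questions = [
    "Is the vendor the provider of a high-risk AI system under the AI Act?",
    "Can the vendor supply conformity assessment evidence and the EU declaration of conformity?",
    "Are AI Act compliance warranties and incident notification duties in the contract?",
    "Will the vendor notify substantial modifications that may trigger re-assessment?",
]


def open_gaps(answers: list[bool]) -> list[str]:
    """Return the questions that still need follow-up with the vendor."""
    return [q for q, ok in zip(ai_vendor_questions, answers) if not ok]


for gap in open_gaps([True, False, True, False]):
    print("Follow up:", gap)
```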

Mapping Frameworks to AI Act

ISO 27001 to AI Act Mapping

ISO 27001 Control | AI Act Requirement | Gap
A.5.1 Policies | General policy framework | Add AI-specific policies
A.5.9 Asset inventory | AI system inventory | Add AI classification attributes
A.5.19 Supplier relationships | AI vendor management | Add AI-specific requirements
A.5.24 Incident management | AI incident handling | Add AI-specific criteria
A.6.3 Awareness and training | AI literacy | Add AI-specific training
A.8.1 User endpoint devices | Human oversight interfaces | Add AI-specific interface requirements
A.8.9 Configuration management | AI system configuration | Add AI-specific documentation
A.8.16 Monitoring activities | Post-market monitoring | Add AI-specific monitoring
A.8.32 Change management | AI system changes | Add re-classification assessment
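
The mapping above can also be kept in machine-readable form so the gap analysis is repeatable. The sketch below assumes a simple dictionary of control-to-extension pairs and a hypothetical set of extensions already implemented in the ISMS.

```python
# The mapping mirrors the table above; 'implemented_extensions' is a stand-in
# for whatever the organisation has already added to its ISMS.
iso_27001_to_ai_act_extension = {
    "A.5.1 Policies": "Add AI-specific policies",
    "A.5.9 Asset inventory": "Add AI classification attributes",
    "A.5.19 Supplier relationships": "Add AI-specific requirements",
    "A.5.24 Incident management": "Add AI-specific criteria",
    "A.6.3 Awareness and training": "Add AI-specific training",
    "A.8.1 User endpoint devices": "Add AI-specific interface requirements",
    "A.8.9 Configuration management": "Add AI-specific documentation",
    "A.8.16 Monitoring activities": "Add AI-specific monitoring",
    "A.8.32 Change management": "Add re-classification assessment",
}

implemented_extensions = {"A.5.1 Policies", "A.5.9 Asset inventory"}  # example state

for control, extension in iso_27001_to_ai_act_extension.items():
    status = "done" if control in implemented_extensions else "gap"
    print(f"{control}: {extension} [{status}]")
```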

SOC 2 to AI Act Mapping

SOC 2 Control Point | AI Act Requirement | Gap
CC1.1 Entity commitment | AI governance commitment | Add explicit AI governance
CC2.2 Internal communication | AI transparency | Add AI disclosure requirements
CC3.1 Risk identification | AI risk classification | Add AI-specific risk criteria
CC3.4 Risk assessment | AI risk management system | Add AI-specific methodology
CC4.2 Ongoing evaluations | Post-market monitoring | Add AI-specific monitoring
CC5.2 Control selection | AI controls | Add AI-specific controls
CC7.2 System monitoring | AI operation monitoring | Add AI behavior monitoring
CC8.1 Change authorization | AI change assessment | Add re-classification triggers
PI1.1 Data quality | Training data governance | Add AI-specific data requirements

Benefits of Integration

Extending existing frameworks rather than building separate programs offers several advantages:

Benefit | Explanation
Efficiency | Leverage existing processes, documentation, and governance
Consistency | Single integrated approach rather than competing systems
Resource optimization | Use existing teams and tools
Reduced complexity | One framework to maintain rather than multiple
Audit efficiency | Integrated audits covering multiple requirements
Stakeholder clarity | Clear ownership and accountability

Implementation Considerations

For Organizations with ISO 27001

Step | Action
1 | Add AI to the scope of the ISMS
2 | Extend the risk assessment methodology for AI
3 | Add AI controls to the Statement of Applicability
4 | Update documentation to include AI elements
5 | Add AI to the internal audit program
6 | Include AI in management review
7 | Consider ISO 42001 for comprehensive AI management

For Organizations with SOC 2

Step | Action
1 | Add AI systems to scope
2 | Extend control descriptions to cover AI
3 | Add AI-specific policies and procedures
4 | Include AI in testing procedures
5 | Update the system description for AI elements
6 | Coordinate with the auditor on AI scope

For Organizations with Both

Step | Action
1 | Identify overlapping AI requirements across frameworks
2 | Create a unified AI governance approach
3 | Develop integrated AI documentation
4 | Establish a single AI risk assessment process
5 | Coordinate audit timing and scope

ISO 42001: The AI-Specific Standard

ISO 42001 provides a comprehensive AI management system framework that complements ISO 27001 and aligns well with EU AI Act requirements:

ISO 42001 Element | AI Act Alignment
AI policy and objectives | Governance requirements
AI risk assessment | Risk management system
AI impact assessment | Fundamental rights assessment
AI system lifecycle management | Documentation requirements
Data management for AI | Data governance requirements
AI transparency | Transparency obligations
Third-party relationships | Vendor management
Monitoring and measurement | Post-market monitoring

Organizations serious about AI governance may consider ISO 42001 certification as a comprehensive approach that supports AI Act compliance.

How Bastion Helps

Bastion helps organizations leverage existing compliance programs for AI Act compliance:

  • Framework assessment. We evaluate your current ISO 27001 or SOC 2 implementation against AI Act requirements.
  • Gap identification. We identify specific AI-related gaps in your existing frameworks.
  • Integration planning. We design an efficient approach to extend existing frameworks for AI compliance.
  • Documentation support. We help update policies, procedures, and documentation for AI requirements.
  • Audit coordination. We help prepare for integrated audits covering multiple frameworks.
  • ISO 42001 readiness. We assess readiness for ISO 42001 certification if desired.

Our experience with ISO 27001 and SOC 2 compliance informs our approach to AI Act integration.


Ready to extend your existing compliance programs for the AI Act? Talk to our team

