Helping Great Companies Get Better at Compliance

Course Overview

If your organization uses AI tools, especially ones built by other companies, you have important responsibilities. This course explains what those responsibilities are and how to handle them in a clear, simple way.

You’ll learn how to check if an AI system is safe to use, what to look out for, and how to make sure people stay involved in important decisions. We use real-life examples to show how this works in practice. For example, if you’re using AI to screen job applications or help customers, we’ll show you how to keep things fair, clear, and under control. 

The course takes you through what to do before using an AI tool, how to make sure it stays safe and works well, and how to deal with any problems that come up. You'll also learn how to follow the rules without making things too complicated. 

To make things easier, the course includes ready-to-use templates and checklists that help you put what you learn into action. By the end, you’ll know how to use AI responsibly and follow the rules with confidence.

Who Is This For?

This course is designed for anyone responsible for selecting, integrating, using, or overseeing AI systems in an organization, especially systems sourced from third parties:

  1. Chief Compliance Officers (CCOs) – Ensuring organizational compliance with AI-related legal obligations.
  2. Chief Information Officers (CIOs) and CTOs – Overseeing the technical deployment of AI systems and managing associated risks.
  3. Data Protection Officers (DPOs) – Aligning AI deployment with GDPR and fundamental rights considerations.
  4. HR Managers – Using AI in recruitment, performance management, or workforce analytics.
  5. Marketing & Customer Experience Teams – Implementing AI-driven personalization, chatbots, or recommendation engines.
  6. Product Managers – Integrating third-party AI capabilities into digital products or services.
  7. Procurement Specialists – Assessing the risk and compliance posture of AI vendors and contractors.
  8. Operations Managers – Using AI to automate decisions or processes in manufacturing, logistics, or supply chain operations.
  9. Legal Counsel & Regulatory Affairs Teams – Reviewing contracts, disclosures, and risk assessments under the AI Act.
  10. AI Ethics & Governance Leads – Defining internal policy and oversight mechanisms for responsible AI use.

  11. Cybersecurity & IT Risk Professionals – Evaluating the security posture and resilience of AI systems under deployment.

Modules

  • Deployer Obligations under the AI Act: Learn your key responsibilities when using AI systems, including transparency, risk management, and oversight. This module covers how to work with third-party providers and ensure your deployments meet EU legal and ethical standards.

Lessons

  1. Chapter 1

    AI User Obligations Under the AI Act – Introduction

  2. Chapter 2

    AI User Obligations Under the AI Act

Why Register?

  • Understand your legal duties: Learn what the EU AI Act requires from deployers and how to meet those obligations.

  • Manage third-party AI risks: Gain tools to assess and document compliance when using external AI systems.

  • Apply real-world lessons: Explore practical case studies on responsible AI use and risk management.

  • Advance your career: Earn a certification that demonstrates your expertise in AI deployment and regulatory compliance.

Reach your full potential.