What to Know When Using AI
Includes downloadable templates
Helping Great Companies Get Better at Compliance
If you think only tech giants need to worry about the AI Act, think again. Under the new rules, any organisation using an AI system professionally is considered a “deployer”. Whether you use AI for CV screening, medical diagnostics, customer scoring or operational decision making, you face legal duties you might not even know exist.
This masterclass spells out those duties in practical terms. You’ll learn that you must ensure everyone using your AI understands how it works and its limits, that you can’t deviate from the provider’s operating instructions, and that you must set up real human oversight with trained personnel who can intervene when needed. We’ll show you how to tailor compliance to your organisation without watering down the standards.
You’ll see why you can’t simply “set it and forget it”: the law requires ongoing monitoring, immediate suspension following serious incidents, and log retention for at least six months. You must also tell your employees before rolling out high-risk AI, register systems used in the public sector, and be ready to cooperate with regulators who request documentation. If your system processes personal data, you’re required to complete a Data Protection Impact Assessment; we provide a template to get you started.
For businesses in sensitive sectors, such as healthcare, finance, or education, there’s an additional layer: a Fundamental Rights Impact Assessment, or FRIA, to map out potential harms and safeguards. You’ll leave with a FRIA template you can use immediately, along with examples showing how to adapt it to AI-driven tools.
The bottom line? Non-compliance can bring serious penalties, but understanding your obligations now will keep you ahead and protect your organisation. This training replaces confusion and guesswork with clear guidance, practical examples, and ready-to-use documents.
This course is designed for anyone responsible for selecting, integrating, using, or overseeing AI systems in an organisation, especially systems sourced from third parties:
Cybersecurity & IT Risk Professionals – Evaluating the security posture and resilience of deployed AI systems.
Understand your legal duties: Learn what the EU AI Act requires from deployers and how to meet those obligations.
Manage third-party AI risks: Gain tools to assess and document compliance when using external AI systems.
Apply real-world lessons: Explore practical case studies on responsible AI use and risk management.
Advance your career: Earn a certification that demonstrates your expertise in AI deployment and regulatory compliance.