Helping Great Companies Get Better at Compliance

Course Overview

If you build or provide AI systems, compliance matters, especially under the EU AI Act. This course gives you a clear, step-by-step guide to help you stay compliant and avoid legal problems.

You’ll learn how to create AI systems that are safe, fair, and trustworthy. We’ll show you how to keep proper records, test your AI before releasing it, and keep an eye on how it performs after it's in use. This is especially important if your AI is used by other companies or in sensitive areas, such as hiring, healthcare, or public safety. 

The course uses real-life examples and practical guidance to help you understand what’s expected and how to meet those expectations. You won’t just learn what the rules say. You’ll learn how to apply them in your day-to-day work. 

You’ll also get access to downloadable templates and checklists that make it easier to do things the right way. These tools will help you follow each step, from development through deployment. By the end of the course, you’ll be able to build AI systems that work well, protect people, and follow the law.


Who Is This For?

This course is ideal for professionals involved in the development, commercialization, and support of AI systems, including:

  1. AI Product & Engineering Teams – Designing systems that meet transparency, safety, and accuracy requirements.
  2. Compliance & Risk Officers – Ensuring legal alignment throughout the AI lifecycle.
  3. R&D and Innovation Leads – Embedding compliance into early-stage system design and testing.
  4. Technical Documentation Specialists – Preparing user instructions, technical files, and conformity declarations.
  5. Legal & Regulatory Affairs Teams – Managing provider obligations, liability risks, and regulatory engagement.
  6. Quality Assurance & Testing Teams – Validating system performance, robustness, and ongoing compliance.
  7. Machine Learning Engineers & Data Scientists – Addressing bias, explainability, and data quality in model development.
  8. UX Designers – Designing AI interfaces that support human oversight and user understanding.
  9. Customer Success & Implementation Teams – Supporting safe and lawful deployment by clients or users.
  10. Sales & Commercial Teams – Understanding the compliance requirements that affect go-to-market strategies.
  11. Policy & Ethics Officers – Aligning AI practices with internal values and external regulatory expectations.
  12. Post-Market Surveillance Teams – Monitoring real-world use, handling incidents, and updating systems as needed.


Modules

  • Provider Obligations under the AI Act - Get a clear overview of your core responsibilities as an AI provider, including compliance with design, documentation, testing, and post-market obligations required under the EU AI Act.
  • Technical Documentation: Overview - Gain a clear understanding of the documentation required from AI providers under the EU AI Act, and how it supports compliance across the AI lifecycle.
  • Describing Your AI System - Learn how to clearly explain your system’s purpose, architecture, components, and how it is intended to function.
  • Documenting Your Design and Development - Record design decisions, training data, testing methods, and development steps to ensure transparency and traceability.
  • Monitoring, Functioning, and Control - Document how the system is monitored during operation, how it behaves in real-world use, and how human oversight is applied.
  • Instructions for Use - Develop clear, compliant user instructions covering system limitations, required conditions for use, and oversight expectations.
  • Your Risk Management System - Set up a process to identify, assess, and reduce potential harms, and keep detailed records throughout development and deployment.
  • Your AI Quality Management System - Implement and document procedures that ensure the AI system is consistently developed, tested, and maintained in line with regulatory standards.
  • Post-Market Monitoring - Track system performance after release, log incidents, and report serious risks. Ensure your system remains safe and compliant over time.

Lessons

  1. Chapter 1

    AI Step-by-Step - What Every Developer Needs to Know

  2. Chapter 2

    AI Act for Developers

Why Register?

  • Understand your legal obligations: Learn what the EU AI Act requires from providers, from technical documentation to post-market monitoring.

  • Build compliant AI systems: Gain the tools to embed safety, transparency, and accountability into your AI products from the ground up.

  • Reduce regulatory risk: Avoid costly penalties by mastering provider responsibilities before systems go to market.

  • Strengthen market readiness: Equip your team to meet EU standards and compete confidently in regulated environments.

  • Advance your career: Earn certification that demonstrates your expertise in AI compliance and responsible innovation.

Reach your full potential.