The AI Act – Europe’s new basic law for artificial intelligence
With the AI Act (Artificial Intelligence Act), the European Union has created a globally unique legal framework to regulate the use of artificial intelligence. The aim is to ensure that AI systems are safe, transparent and used responsibly.
Unlike the GDPR, the AI Act not only concerns data protection, but the entire AI value chain – from development to provision and use.
The legal framework is based on a risk-based approach that divides AI systems into four risk classes:
The four risk categories of the AI Act
- Unacceptable risk: Prohibited practices such as social scoring, hidden manipulation or real-time remote biometric identification in publicly accessible spaces (with narrow exceptions).
- High risk: Systems in the healthcare, justice or education sectors that are particularly heavily regulated.
- Limited risk: Systems with transparency obligations, e.g. AI chatbots or generative AI tools.
- Minimal risk: AI systems that do not pose any significant risks (e.g. spam filters, simple suggestion functions).
Why the AI Act is also relevant for domain resellers and agencies
Even if many people think that the AI Act only affects large tech companies, this is a misconception. Domain resellers and digital agencies are part of the digital supply chain – and are therefore directly or indirectly affected if they use or provide AI systems.
Where AI is used in practice
- AI-based domain suggestion services (“This domain is available – and how about …?”)
- Automated customer advice via chatbot or voice assistant
- Text and image generators for web and SEO content
- AI-supported CRM systems or support tools
- Automated security scanners for domains and websites
As soon as you use AI – whether on your website, in the backend or in customer communication – you count as a deployer within the meaning of the AI Act and take on corresponding obligations.
What obligations apply to domain resellers and agencies
The AI Act entails specific obligations – including for you as a service provider or tool user. The most important requirements at a glance:
1. Get an overview of all AI systems
Carry out an inventory: Which tools, plugins or platforms do you use that have integrated AI functions?
Tip: Even providers who do not advertise with the term “AI” may have such functions integrated in the background.
2. Fulfill transparency obligations
If AI is used, this must be clearly and comprehensibly indicated. Example: A chatbot must not appear to be a “real” employee if it responds automatically.
Concrete examples:
- “This system uses artificial intelligence.”
- “Answers are generated automatically.”
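One low-effort way to implement such a notice in a chatbot integration is to prepend a fixed disclosure to every automated reply. A minimal sketch (the function and constant names are illustrative, not taken from any specific chatbot framework):

```python
AI_DISCLOSURE = (
    "This system uses artificial intelligence. "
    "Answers are generated automatically."
)

def wrap_bot_reply(reply: str) -> str:
    """Prefix every automated chatbot answer with the AI disclosure."""
    return f"{AI_DISCLOSURE}\n\n{reply}"

print(wrap_bot_reply("The domain example.com is still available."))
```

The point of centralizing the notice in one wrapper is that no automated answer can reach a customer without the disclosure attached.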
3. Check contracts and tools
Even if you use third-party tools, YOU are responsible for ensuring that their use is compliant with the AI Act. Pay particular attention to:
- Terms of use
- Data protection regulations
- Providers outside the EU
4. Document risks
You must document internally:
- Which AI systems are used
- For what purpose
- How transparent the use is
- What risks exist
- What risk mitigation measures have been taken
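The five points above can be kept in a simple internal register. A sketch of what one entry could look like (the field names and the example system are illustrative; the AI Act does not prescribe a specific format for this kind of deployer documentation):

```python
from dataclasses import dataclass, field

@dataclass
class AISystemRecord:
    """One entry in an internal register of AI systems in use."""
    name: str                    # which AI system is used
    purpose: str                 # for what purpose
    transparency_measure: str    # how the use is disclosed
    risks: list[str] = field(default_factory=list)        # what risks exist
    mitigations: list[str] = field(default_factory=list)  # measures taken

register = [
    AISystemRecord(
        name="Domain suggestion service",
        purpose="Suggest alternative domains during checkout",
        transparency_measure="Notice next to the suggestions",
        risks=["Misleading or unavailable suggestions"],
        mitigations=["Availability check before display"],
    ),
]
print(f"{len(register)} AI system(s) documented")
```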
5. Review processes regularly
AI tools are developing rapidly. It is therefore necessary to regularly check whether new functions are added that suddenly put you in a higher risk category – and thus trigger more obligations.
What happens if you do nothing?
The AI Act provides for severe fines of up to 35 million euros or 7% of global annual turnover – whichever is higher – for the most serious violations.
Even if you as an agency “only” use an AI tool and do not develop it yourself, you can still be held responsible in an emergency – for example, if customers are inadequately informed about a system you have provided.
Recommendations for resellers and agencies
You should take action now so that you are not taken by surprise but can navigate the AI Act with confidence:
Your 5-step checklist for AI compliance
- Inventory tools: Identify all AI-supported systems
- Classify risks: Classify according to the AI Act risk classes
- Create transparency: Clearly inform users about the use of AI
- Check third-party providers: Only use compliant tools
- Establish documentation: For internal traceability and external audits
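Steps 1 and 2 of the checklist can be combined into a first rough triage of your tool inventory. A sketch, assuming a hand-maintained mapping (the assignments shown are illustrative starting points, not legal advice; a real classification must follow the Act's own criteria):

```python
from enum import Enum

class RiskClass(Enum):
    UNACCEPTABLE = "prohibited"
    HIGH = "high risk"
    LIMITED = "limited risk (transparency obligations)"
    MINIMAL = "minimal risk"

# Example triage of tools named earlier in this article.
# These assignments are illustrative only.
tool_triage = {
    "AI chatbot for customer advice": RiskClass.LIMITED,
    "Text/image generator for SEO content": RiskClass.LIMITED,
    "Spam filter": RiskClass.MINIMAL,
    "Automated security scanner": RiskClass.MINIMAL,
}

for tool, risk in tool_triage.items():
    print(f"{tool}: {risk.value}")
```

Even a table this simple makes it visible at a glance which tools carry transparency obligations and which need closer review if their features change.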
Conclusion – Not just legal stuff, but a competitive factor
The AI Act is not just an act of bureaucracy. It brings order to a previously largely unregulated area – and creates trust among customers and partners. Those who act early can even position themselves strategically:
- as a trustworthy provider
- as an AI-competent agency
- as a legally compliant reseller
In a world where trust is becoming increasingly important, AI compliance is not an optional extra, but a must.
Bonus: Frequently asked questions from resellers about the AI Act
As a reseller, do I have to provide proof of AI knowledge?
No, but you need to know whether and how AI is used in your systems. A basic technical understanding is usually enough – you don’t need a data science degree.
Does the AI Act also apply to US tools?
Yes, if the tools are used within the EU or affect EU citizens, they are subject to the AI Act – regardless of where they were developed.
What about generative tools like ChatGPT?
If you generate and publish content (texts, images, videos), you must make it transparent that AI was involved – this applies in particular to deepfakes and AI-generated text published to inform the public, e.g. via a notice on your website, in the legal notice or in content notes.
Want to play it safe?
The AI Act is not a topic that should be taken lightly – but it is also no reason to panic. What’s important is that YOU now look at the tools and processes you use and understand where AI already plays a role – and where there may be a need for action.
Take the time to properly analyze your system, classify risks and create transparency. The following applies: anyone selling domains should also understand the rules of the game in the digital space – and this is exactly where the AI Act comes in.