AI Act "Certification": What It Actually Means (and What It Doesn't)
Not legal advice — consult qualified counsel. This article describes the certification, conformity-assessment, and standards landscape under the EU AI Act (Regulation (EU) 2024/1689) as understood from the published regulation, harmonized-standards work, and notified-body activity as of April 2026. The legal status of any particular vendor claim depends on facts not in evidence here.
There is no single "AI Act certified" stamp. There is no EU certificate that says "this AI system is approved." There is no central authority that hands out "AI Act certifications" to vendors who pass an exam.
Yet "AI Act certified" appears on vendor websites, on procurement decks, in LinkedIn bios, and in RFP responses across the EU market. Some uses are legitimate shorthand for things the regulation actually defines. Many are not. The distinction matters: a buyer who relies on the wrong meaning of "AI Act certified" can fail an audit despite having paid for a vendor who waved the term around.
This piece is a debunk. It separates the three things "AI Act certification" can legitimately refer to from the three things it gets misused to mean. It ends with the questions a buyer should ask any vendor who uses the phrase, and the language a vendor should use in its place.
Companion reading: /blog/eu-ai-act-business-guide, /blog/ai-act-compliance-software-guide, /blog/ai-compliance-checklist-2026.
What the AI Act Actually Says About Certification
The word "certification" appears in the EU AI Act, but it does not refer to a single thing. Three different mechanisms in the regulation produce evidence that a high-risk AI system meets the Act's requirements, and only one of them yields a third-party certificate. Understanding which is which is the foundation for everything else.
Article 43 — Conformity Assessment
Article 43 of Regulation (EU) 2024/1689 defines the conformity assessment procedures for high-risk AI systems listed in Annex III. Conformity assessment is the process by which a provider demonstrates that an AI system meets the essential requirements of Section 2 of Chapter III (Articles 8–15). Article 43(1) and (2) establish two principal routes:
- Internal control (Annex VI) — the provider demonstrates conformity itself, without the involvement of a notified body, for most Annex III high-risk systems.
- Notified-body involvement (Annex VII) — required for biometric systems under point 1 of Annex III when harmonized standards do not exist or are applied only partially, and in certain other cases specified in the regulation. Notified-body assessment produces an EU technical documentation assessment certificate.
For most Annex III high-risk systems — including those used in employment, credit scoring, education, and access to essential services — the default route is internal conformity assessment under Annex VI. There is no third-party certificate at the end. The provider issues an EU declaration of conformity (Article 47) and affixes the CE marking (Article 48). The "certification" is the declaration the provider itself issues, on its own legal responsibility.
This is the first place public discourse goes wrong: most "AI Act certifications" people imagine — an external auditor reviewing the system and issuing a certificate — do not exist for the majority of high-risk systems. The regulation explicitly chose internal control as the default, with notified-body involvement reserved for the highest-risk biometric subset.
Article 47 — EU Declaration of Conformity
Article 47 obliges the provider of a high-risk AI system to draw up a written, machine-readable, physical or electronically signed EU declaration of conformity for each AI system. The declaration must contain the elements set out in Annex V — including the provider's name, the system's identification, a statement of conformity with the Act, references to relevant harmonized standards or common specifications applied, and where applicable, the notified-body details.
The declaration of conformity is the document that a buyer can legitimately ask to see. It is the artifact closest in spirit to a "certificate" — but it is issued by the provider, not by a third party. The provider takes legal responsibility (Article 47(4)) for the system's conformity.
When a vendor says "we are AI Act compliant," the question to ask is: "Do you have an Article 47 EU declaration of conformity for the system you are selling me, and can I see it?" The answer separates vendors who have done the work from vendors who have done the marketing.
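As an illustration of what "can I see it?" means in practice, the Annex V content can be captured as a minimal machine-readable record. The field names below are illustrative assumptions, not the regulation's wording: Annex V prescribes what the declaration must contain, not a schema.

```python
# Illustrative (non-normative) field names mapping to Annex V content.
# Annex V prescribes the declaration's content, not this schema.
REQUIRED_FIELDS = {
    "provider_name",            # provider's name and address
    "system_name",              # AI system name, type, unique reference
    "statement_of_conformity",  # statement of conformity with the Act
    "harmonized_standards",     # standards / common specifications applied
    "notified_body",            # notified-body details, where applicable
    "place_and_date",           # place and date of issue
    "signatory",                # person empowered to sign for the provider
}

def missing_fields(declaration: dict) -> set[str]:
    """Return Annex V-style fields absent from a draft declaration record."""
    return REQUIRED_FIELDS - declaration.keys()

draft = {
    "provider_name": "Example Provider GmbH, Example Str. 1, Berlin",
    "system_name": "CreditScore Assist v2.1",
    "statement_of_conformity": "In conformity with Regulation (EU) 2024/1689.",
    "harmonized_standards": [],
    "place_and_date": "Berlin, 2026-04-01",
}

print(sorted(missing_fields(draft)))  # → ['notified_body', 'signatory']
```

A buyer-side check like this does not establish conformity; it only confirms that the document a vendor shares has the shape the annex demands before anyone reads its substance.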
Article 48 — CE Marking
Article 48 governs the CE marking for high-risk AI systems. The CE marking is the graphical mark a provider affixes to a high-risk AI system (or its accompanying documentation, where the system has no physical form) to declare conformity with the AI Act and any other Union harmonization legislation that applies. Article 49 governs registration in the EU database.
The CE marking is not an AI-Act-specific certification. It is the same CE marking used across EU product legislation (medical devices, machinery, toys, radio equipment) to indicate conformity with applicable Union law. For a high-risk AI system, the CE marking signifies that the provider has issued an Article 47 declaration, completed the Article 43 conformity assessment, and registered the system per Article 49. It does not, on its own, mean a third party reviewed the system.
Article 40 — Harmonized Standards
Article 40 introduces the presumption of conformity mechanism familiar from other EU product legislation. AI systems that comply with harmonized standards published in the Official Journal of the European Union are presumed to conform with the requirements of Section 2 of Chapter III. Standardization requests have been issued to CEN-CENELEC (notably under JTC 21), and harmonized standards are progressively being published.
Where harmonized standards do not yet exist, the Commission may adopt common specifications (Article 41) by implementing acts. Conformity with common specifications produces the same presumption of conformity.
A vendor that "complies with harmonized standard X" is making a precise, verifiable claim. A vendor that "is AI Act certified" is, in most cases, not making a precise claim — they are translating Article 40, 43, or 47 into an English shorthand that the regulation does not use. The translation is the source of the confusion.
Three Things "AI Act Certification" Legitimately Refers To
Stripped of marketing varnish, the phrase has three honest readings.
1. The notified-body certificate under Annex VII
For high-risk AI systems where Article 43 mandates notified-body involvement — primarily certain biometric identification and categorization systems where harmonized standards are not applied or only partially applied — the notified body issues a technical documentation assessment certificate at the end of the Annex VII procedure. This is a real, third-party certificate. It is bounded to the specific system and the specific intended purpose declared in the technical documentation.
A vendor that holds this certificate can legitimately say so. The right phrasing: "Our system X has been issued a technical documentation assessment certificate under Annex VII by Notified Body NB-XXXX." That is a fact, with verifiable content, scoped to a system.
2. The EU declaration of conformity per Article 47
For all high-risk AI systems, the provider issues an EU declaration of conformity. This is not a third-party certificate, but it is the legally operative document under the Act. A vendor that has issued a declaration can legitimately say: "We have issued an EU declaration of conformity under Article 47 for system X, dated DD-MM-YYYY, with CE marking applied per Article 48." That is also a fact, with a date and a system to anchor it.
3. Compliance with a published harmonized standard or ISO/IEC standard
A vendor that has implemented and operates an AI Management System aligned with ISO/IEC 42001:2024, or a system in conformity with a published harmonized standard listed in the OJEU under Article 40, can legitimately reference that. ISO/IEC 42001:2024 is not an AI Act certification — it is the international AI Management System standard. But certification under ISO 42001 by an accredited certification body is genuine third-party evidence that the management framework around the AI system meets a defined bar, and it is strong evidence toward the quality management system that Article 17 requires.
The right phrasing: "Our AI Management System is certified to ISO/IEC 42001:2024 by [accredited body], certificate number, expiry date." That is verifiable. It is not the same as "AI Act certified," and a careful vendor will not conflate them.
Three Things It Gets Misused to Mean
The same phrase shows up attached to three claims that have no basis in the regulation.
1. "We passed an AI Act audit, so we're certified."
There is no formal "AI Act audit" administered by a public authority that hands out a certificate as a result. National market-surveillance authorities can request information, conduct investigations, and order corrective action under Articles 74–79. A vendor can hire an independent assessor to review its conformity posture. Neither produces "AI Act certification." The AI Act's enforcement architecture is risk-based supervision, not periodic certification audits.
A vendor that says "we passed our AI Act audit" should be asked: who conducted it, what scope, what did the report find, and what is the document type the auditor issued. A clear answer points to a real engagement. A vague answer points to marketing.
2. "Our product is AI Act certified by the EU."
The EU does not certify AI products. The EU regulates AI providers and deployers; conformity assessment is a process under Article 43; certificates (where applicable) are issued by notified bodies designated by Member States, not by "the EU." For Annex VI internal-control conformity assessments — which cover most high-risk Annex III systems — there is no certificate at all, and certainly not one from the EU.
A vendor that says "EU-certified AI" is using marketing language the regulation does not support. The honest phrasing is "compliant with EU AI Act Article 47 declaration requirements" or, if applicable, "Annex VII certified by Notified Body NB-XXXX." Vendors comfortable with these phrasings tend to have done the work; vendors who default to "EU-certified" tend not to.
3. "We're AI Act certified because we use a certified model."
A foundation-model provider's compliance posture is not transferable to a downstream deployer's compliance posture. Under Article 25 of the AI Act, allocation of obligations along the AI value chain follows specific rules, and Article 53 obligations apply to GPAI model providers. A bank using OpenAI's GPT-4 in a credit-decisioning workflow does not become AI Act compliant because OpenAI complied with Article 53. The bank is the provider (if it puts the system on the market under its own name or makes a substantial modification per Article 25(1)) or the deployer (Article 26) of the resulting high-risk system, with its own Article 9 / 12 / 14 / 17 obligations to fulfill.
A vendor that says "we use [foundation model X] which is AI Act compliant, so we are too" is making a claim the regulation does not support. The honest claim is "we have completed our own conformity assessment for the AI system we sell, drawing on the GPAI provider's Article 53 documentation." That distinction is the difference between a defensible procurement story and a procurement red flag.
What Buyers Should Ask Vendors Instead
The phrase "are you AI Act certified?" is unhelpful in procurement because it admits a "yes" answer that can mean almost anything. Replace it with these five questions, in order.
1. Is the AI system you are selling me classified as high-risk under Annex III? If yes, under which sub-category?
A vendor that cannot state the Annex III sub-category cleanly has not completed the Article 6 classification work. If the answer is "no, it is not high-risk," ask for the Article 6 reasoning in writing — many systems sold as low-risk are actually borderline and the analysis matters at audit.
2. Have you issued an EU declaration of conformity under Article 47 for this system, and can I see it?
The declaration is the operative legal artifact. A vendor that has not issued one, for a high-risk system being placed on the EU market, is not in compliance.
3. Which harmonized standards or common specifications have you applied (Article 40 / 41), and where can I verify their listing in the OJEU?
Specific standards are verifiable. Generic "we follow industry best practices" is not.
4. If your system requires notified-body involvement under Article 43, what is the notified body, the certificate number, and the expiry date?
Most Annex III systems do not require notified-body involvement, but biometric ones often do. The vendor either has the certificate or does not.
5. What evidence do you produce on demand for Article 12 (logs), Article 14 (oversight), and Article 17 (QMS)? Can I see a sample?
Article 47 declarations are necessary but not sufficient. The ongoing operation of a high-risk system must produce continuous evidence, and that evidence is what an enforcement action will actually examine. A vendor whose Article 12 evidence is "we keep logs" without showing the format and granularity has not built the system the Act expects.
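To make "format and granularity" concrete, a minimal Article 12-style event record streamed as JSONL might look like the sketch below. The field set is an assumption for illustration, not a schema the regulation defines; what matters is that logging is automatic and per-event, not a daily summary.

```python
import json
from datetime import datetime, timezone

def audit_record(system_id: str, event: str, detail: dict) -> str:
    """Serialize one per-event audit entry as a single JSONL line.

    Field names are illustrative; Article 12 mandates automatic event
    recording, not this particular schema.
    """
    entry = {
        "ts": datetime.now(timezone.utc).isoformat(),  # when the event occurred
        "system_id": system_id,                        # which high-risk system
        "event": event,                                # e.g. "decision", "override"
        "detail": detail,                              # input ref, outcome, operator
    }
    return json.dumps(entry, separators=(",", ":"))

# One line per event: append-only, greppable, replayable at audit time.
line = audit_record(
    "credit-scoring-v2",
    "decision",
    {"input_ref": "app-4711", "outcome": "refer_to_human", "operator": "analyst-17"},
)
print(line)
```

The design point a buyer is probing for is the one-line-per-event discipline: an append-only stream of this shape can be filtered, sampled, and handed to an assessor without reconstruction.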
The Knowlee Compliance Hedge Rule
Internal compliance discipline shapes how Knowlee describes itself externally. The rule is simple: never claim "certified"; claim "compliant under Article X scope," with the article and scope stated in plain language.
In practice this looks like:
- We do not say "Knowlee is AI Act certified."
- We do say "Knowlee operates with AI Act Article 47 declaration discipline for systems classified as high-risk under Annex III, with audit trail per Article 12 implemented via JSONL streaming."
- We do say "Knowlee's AI Management System is aligned with ISO/IEC 42001:2024, with formal certification targeted for Q1 2027."
- We do not say "ISO 42001 certified" until the certification body has issued the certificate.
- We do say "compliant with Article X scope" when we have done the work for that article, naming the article.
The hedge is not just a marketing rule. It is the same rule the regulation expects of any provider issuing an Article 47 declaration: state precisely what you have done, against precisely which article, with evidence that supports the claim. Vendors that describe themselves with this discipline tend to be the vendors that survive enforcement. Vendors that lean on "AI Act certified" tend to discover the gap when a national authority actually asks.
This positions Knowlee alongside compliance-suite layers (Vanta, OneTrust, Drata) — those products certify management systems and policy controls; Knowlee provides the runtime evidence those policies declare. Both layers compose into a defensible stack. Neither layer alone produces a meaningful "AI Act certification" — because, as the regulation is currently structured, that single artifact does not exist for most high-risk systems. See /compare/knowlee-vs-vanta-onetrust for how the layers fit together.
When Will This Change?
The certification landscape will evolve. Three forces are already in motion in 2026.
Harmonized standards under Article 40 are progressively being published. As CEN-CENELEC JTC 21 outputs are listed in the OJEU, providers will increasingly have a defined technical bar to declare conformity against. This will not produce a single "AI Act certified" stamp, but it will tighten the meaning of "compliant with EN-XXXXX harmonized standard for risk management."
Voluntary codes of conduct under Article 95 may produce labels or marks that provide structured evidence of voluntary commitments going beyond the regulation's minimums. The Commission and stakeholders are exploring this. Voluntary codes are not certification, but they may produce reputational signals adjacent to it.
Notified-body capacity for the Annex VII pathway is still being built. As more notified bodies are designated by Member States and accredit their AI assessment practices, the Annex VII certificate will become a more common procurement artifact for biometric and other notified-body-required systems. The supply of these certificates is currently the bottleneck, not the demand.
In each of these directions, the underlying truth holds: the AI Act produces a portfolio of evidence types, not a single certification. Buyers who learn to ask for the specific evidence they need — declarations, harmonized-standard conformity, notified-body certificates where applicable, ISO 42001 certificates for the management system — will navigate the market with clarity. Buyers who hold out for a single "AI Act certified" stamp will keep being disappointed and keep being misled.
FAQ
Is "AI Act certified" a real thing?
Not as a single artifact. The EU AI Act produces a portfolio of evidence types: EU declaration of conformity (Article 47), CE marking (Article 48), notified-body technical documentation assessment certificates under Annex VII for specific cases, and presumption of conformity through harmonized standards (Article 40). None of these is a single "AI Act certified" stamp. Vendors that use the phrase are usually compressing one of these specific artifacts into informal language; some are using it without any of the underlying evidence.
Who issues AI Act certificates?
For Annex VII conformity-assessment routes, notified bodies designated by Member States issue technical documentation assessment certificates. There is no central EU agency that issues "AI Act certificates" directly. For Annex VI internal-control routes — which cover most high-risk Annex III systems — there is no third-party certificate at all; the provider issues an EU declaration of conformity on its own legal responsibility under Article 47.
Does CE marking on AI mean the same thing as CE marking on a kettle?
The framework is the same: CE marking signifies the manufacturer's declaration that the product complies with applicable Union harmonization legislation and that any required conformity-assessment procedures have been completed. The substantive obligations differ — a kettle complies with the Low Voltage Directive and EMC Directive; a high-risk AI system complies with the AI Act's Articles 8–15 plus the QMS requirements of Article 17. But the legal mechanism — provider-issued declaration plus optional notified-body involvement plus a graphical mark — is shared across EU product law.
Is ISO 42001 the same as AI Act certification?
No. ISO/IEC 42001:2024 is the international standard for AI Management Systems. Certification to ISO 42001 by an accredited body is genuine third-party evidence that an organization's management framework around AI meets the standard's requirements. The AI Act references quality management systems in Article 17 and recognizes management-system rigor as part of conformity. ISO 42001 certification supports AI Act conformity; it does not substitute for the conformity assessment, declaration of conformity, or CE marking the AI Act itself requires.
Can a vendor become "AI Act certified" through a third-party auditor?
A vendor can engage a third-party assessor — an audit firm, a notified body, an independent compliance consultancy — to review its conformity posture. The output of such an engagement is whatever the engagement letter says it is: a readiness report, a gap analysis, a pre-audit assessment. None of these constitute "AI Act certification" in the sense of a regulatory artifact. Where the engagement is with a designated notified body for an Annex VII assessment, the resulting Annex VII certificate is a regulatory artifact — but it applies only to systems and pathways where Article 43 requires notified-body involvement.
What should I ask a vendor instead of "are you AI Act certified?"
Ask: "Have you issued an Article 47 declaration of conformity for this system?" — and ask to see it. Then ask which Annex III sub-category the system falls under, which harmonized standards under Article 40 (or common specifications under Article 41) the vendor has applied, and what evidence the vendor produces on demand for Articles 12 (logs), 14 (oversight), and 17 (QMS). These questions surface either real evidence or marketing — there is rarely a middle ground.
What if the vendor sells a tool for AI compliance, not an AI system?
A compliance tool is not, in general, itself a high-risk AI system under Annex III, so it does not need an Article 47 declaration of conformity. What the buyer should ask of a compliance-tool vendor is different: which AI Act articles does the tool help the buyer satisfy, what evidence does it produce, and how does that evidence satisfy the relevant article's acceptance criteria? See /blog/ai-act-compliance-software-guide for the buyer framework.
Will the EU eventually create a single "AI Act certification" stamp?
The Commission's current direction is the harmonized-standards pathway under Article 40, voluntary codes under Article 95, and continued notified-body designation. None of these is moving toward a single stamp. A unified certification scheme would require new EU legislation; that is not on the agenda as of April 2026. Buyers should plan for the portfolio-of-evidence model to remain the operative one through the medium term.
Related Reading
- /blog/eu-ai-act-business-guide — the foundational reference on the AI Act.
- /blog/ai-act-compliance-software-guide — the buyer's framework for compliance tooling.
- /blog/ai-compliance-checklist-2026 — the implementation checklist.
- /blog/ai-act-fines-explained — Article 99 fine structure.
- /blog/ai-conformity-assessment-framework — the CE-marking-equivalent process for high-risk AI.
- /blog/ai-audit-trail-implementation-guide — what an Article 12 audit trail must capture.
- /blog/iso-42001-implementation-guide — the management-system standard the Act references.
- /glossary/ai-act, /glossary/high-risk-ai-systems, /glossary/ai-conformity-assessment, /glossary/ai-act-compliance-tool.
- /compare/knowlee-vs-vanta-onetrust — composing runtime evidence with policy/control suites.