SprinklingAct+

Analysis

Is my GPT/Claude wrapper a GPAI? And am I a provider or a deployer?

By Lamar B. Shucrani — April 10, 2026 · 11 min read

It's the question thousands of SaaS founders building on OpenAI, Anthropic or Google are asking themselves. The answer changes everything: €20K or €500K in compliance costs. Here's the rigorous, article-by-article analysis.

TL;DR

  • Your wrapper is NOT a GPAI. The GPAI is the underlying model (GPT-4, Claude, Gemini). Not your product.
  • You are probably a deployer of the model AND a provider of your own AI system.
  • Your obligations depend on WHAT YOUR PRODUCT DOES, not on the underlying model. If your use case falls under Annex III, you're subject to the full high-risk regime.
  • If you substantially modify the model (significant fine-tuning), you may become a provider of the modified GPAI — Art. 25.

1. What is a GPAI, precisely?

Article 3(63) of the AI Act defines a general-purpose AI (GPAI) model as an AI model trained on large amounts of data, designed to serve a variety of purposes, capable of performing a wide range of distinct tasks, and able to be integrated into downstream systems or applications.

Recital 97 clarifies: the relevant models are those with at least 10²³ FLOPs of training compute, capable of generating language (text or audio), text-to-image or text-to-video. This explicitly captures: GPT-4, GPT-3.5, Claude (all versions), Gemini (all versions), Llama, Mistral, and equivalent foundation models.

Key point: a "GPAI" is a MODEL, not a SYSTEM. Your product — which wraps that model in an interface, system prompts, guardrails, business logic — is not a GPAI. It is an "AI system" under Art. 3(1).

2. Provider or deployer? Both at once.

This is the most misunderstood point. When you build a SaaS product that calls GPT-4 via the OpenAI API, you simultaneously hold two distinct roles:

Role 1: Deployer of the GPAI model (OpenAI/Anthropic/Google)

You "use" the GPAI model within the meaning of Art. 3(4). The obligations of the model's provider (Art. 53: technical documentation, copyright policy, downstream transparency) are borne by OpenAI, not by you. You don't duplicate these obligations.

Role 2: Provider of your own AI system

You "place on the market or put into service" an AI system — yours, not OpenAI's. Art. 3(3) is clear: a "provider" is anyone who develops an AI system AND places it on the market under their own name or trademark. Your wrapper is your product. You are its provider.

Direct consequence: your obligations depend on what YOUR system does, not on what the underlying model is capable of. A wrapper that filters CVs is an Annex III high-risk system (point 4(a), employment). A wrapper that generates cat images is not.
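The dual-role logic above can be sketched as a tiny classifier. This is an illustrative sketch only, not legal advice; the Annex III area labels are this example's own shorthand, not statutory wording.

```python
# Illustrative sketch, not legal advice. Area labels below are this
# example's own shorthand for Annex III headings, not statutory terms.

# Annex III areas most relevant to SaaS wrappers (paraphrased shorthand)
ANNEX_III_AREAS = {
    "biometrics", "critical_infrastructure", "education", "employment",
    "credit_essential_services", "law_enforcement", "migration_asylum",
    "justice",
}

def classify_wrapper(use_case_area: str) -> dict:
    """Rough AI Act position of a GPT/Claude wrapper.

    You are always a deployer of the upstream GPAI model and a provider
    of your own AI system; the use case decides the risk regime.
    """
    high_risk = use_case_area in ANNEX_III_AREAS
    return {
        "role_vs_model": "deployer",    # Art. 3(4): you use the GPAI model
        "role_vs_product": "provider",  # Art. 3(3): your name, your market
        "regime": "high-risk (Annex III)" if high_risk else "limited or minimal risk",
    }

# A CV-filtering wrapper is high-risk; a cat-image wrapper is not.
print(classify_wrapper("employment")["regime"])   # high-risk (Annex III)
print(classify_wrapper("cat_images")["regime"])   # limited or minimal risk
```

Whatever the outcome, note that "role_vs_product" never changes: you are the provider of your own system in every branch.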

3. When do you become a provider of the GPAI model itself?

Article 25 of the AI Act defines responsibilities along the value chain. It specifies that if you substantially modify an AI system already on the market (including a GPAI), you become its provider for AI Act purposes. Concretely:

  • Prompt engineering only: not a substantial modification. You remain a deployer of the model.
  • RAG / retrieval augmentation: no modification to the model itself. You remain a deployer.
  • Light fine-tuning (a few hundred examples): grey area. Likely not substantial, but document it.
  • Significant fine-tuning (change of capabilities or behavior): you may become a provider of the modified model, with Art. 53 obligations on top of those of your AI system.
  • Training from scratch or massive continual pretraining: you are a GPAI provider. Full regime.
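The ladder above can be written down as a simple lookup — a useful dated artefact for a compliance file. A minimal sketch, assuming the five categories this article uses; the labels are ours, not the regulation's, and none of this is legal advice.

```python
# Sketch of the Art. 25 modification ladder from this section.
# Not legal advice; category keys are this article's labels, not statutory terms.

LADDER = {
    "prompt_engineering":      ("deployer of the model", False),
    "rag":                     ("deployer of the model", False),
    "light_fine_tuning":       ("deployer (grey area: document it)", False),
    "significant_fine_tuning": ("provider of the modified model", True),
    "training_from_scratch":   ("GPAI provider, full regime", True),
}

def role_after_modification(modification: str) -> str:
    """Map a modification type to the resulting role, flagging Art. 53."""
    role, art53 = LADDER[modification]
    return role + (" (+ Art. 53 obligations)" if art53 else "")

print(role_after_modification("rag"))
# deployer of the model
print(role_after_modification("significant_fine_tuning"))
# provider of the modified model (+ Art. 53 obligations)
```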

4. What this actually costs you

The range depends entirely on the regime you fall under. Here are the sourced orders of magnitude from the CEPS 2021 report and market estimates for deployers:

Profile                                        | Initial    | Annual
Simple wrapper, non-Annex III use case         | €0–3K      | <€5K
Annex III wrapper, self-assessment (Annex VI)  | €20–50K    | €10–30K
Annex III provider with full QMS               | €144–330K  | ~€71K
GPAI provider (massive fine-tuning)            | €200–500K+ | €100K+

Important: if you already have GDPR maturity (DPO, records, DPIAs), your marginal AI Act cost drops by 20–30%. The AI Act extends existing governance; it doesn't reinvent it.
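As a back-of-envelope check, the 20–30% GDPR-maturity reduction can be applied to the table's initial figures. The percentages and amounts are the article's own; the helper below is plain arithmetic, not an estimate of your actual costs.

```python
def marginal_cost_range(initial_eur: float, gdpr_mature: bool) -> tuple[float, float]:
    """(low, high) initial cost after the 20-30% GDPR-maturity reduction."""
    if not gdpr_mature:
        return (initial_eur, initial_eur)
    return (initial_eur * 0.70, initial_eur * 0.80)  # 30% and 20% reductions

# Lower bound of the "full QMS" profile (€144K) with GDPR maturity:
low, high = marginal_cost_range(144_000, gdpr_mature=True)
print(f"€{low:,.0f} to €{high:,.0f}")  # €100,800 to €115,200
```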

5. What to do now

  1. Clarify your role. Deployer of the model AND provider of your system, most likely. Document this position in writing — for yourself, your team, your investors.
  2. Identify your use case in Annex III. If your product touches employment, credit, education, public services, justice, asylum, biometrics, or health — you may be high-risk.
  3. Check Art. 5 (prohibited practices). Emotion recognition in workplaces, social scoring, criminal profiling are prohibited — regardless of the underlying model.
  4. If you're limited risk: apply Art. 50. Transparency obligation — tell the user they're interacting with AI. Often that's all you need.
  5. Document EVERYTHING. The regulator doesn't check what you did, they check what you can prove. A dated artefact of your position is worth a thousand conversations.

Building a product on GPT, Claude or Gemini? The free Sprinkling Act diagnostic tells you exactly which regime you're in — in 9 questions, 60 seconds, no account.


Sources

  1. EUR-Lex (July 12, 2024) — Regulation (EU) 2024/1689 — Artificial Intelligence Act (full text). eur-lex.europa.eu/eli
  2. EU AI Act — Article 3 — Definitions (AI system, GPAI model, provider, deployer). artificialintelligenceact.eu/article
  3. EU AI Act — Article 16 — Obligations of providers of high-risk AI systems. artificialintelligenceact.eu/article
  4. EU AI Act — Article 25 — Responsibilities along the AI value chain. artificialintelligenceact.eu/article
  5. EU AI Act — Article 26 — Obligations of deployers of high-risk AI systems. artificialintelligenceact.eu/article
  6. EU AI Act — Article 50 — Transparency obligations for certain AI systems. artificialintelligenceact.eu/article
  7. EU AI Act — Article 53 — Obligations for providers of GPAI models. artificialintelligenceact.eu/article
  8. EU AI Act — Recital 97 — GPAI model definition and threshold. artificialintelligenceact.eu/recital
  9. CEPS (Laurer, Renda & Yeung) (September 2021) — Clarifying the Costs for the EU's AI Act. www.ceps.eu/clarifying-the-costs-for-the-eus-ai-act
